Sample records for marginal probability distribution

  1. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimate, using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. The algorithm can be extended to higher-order marginal probability constraints, as used in multiple-point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
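
    As a concrete illustration of the fitting loop described above, the following minimal numpy sketch rescales a trivariate probability table against three bivariate marginal constraints until they are reproduced. The three-variable, three-category setup and all names are illustrative assumptions, not the authors' implementation (which additionally exploits sparse-matrix marginalization).

    ```python
    import numpy as np

    def ipf_trivariate(p, m12, m13, m23, n_iter=500, tol=1e-12):
        """Fit a strictly positive trivariate table p[i,j,k] so its three
        bivariate marginals match m12, m13, m23 (each a 2-D probability
        table). One cycle rescales p against each constraint in turn."""
        p = p / p.sum()
        for _ in range(n_iter):
            p *= (m12 / p.sum(axis=2))[:, :, None]   # match the (1,2) marginal
            p *= (m13 / p.sum(axis=1))[:, None, :]   # match the (1,3) marginal
            p *= (m23 / p.sum(axis=0))[None, :, :]   # match the (2,3) marginal
            # the last two rescalings perturb the (1,2) marginal, so iterate
            if np.abs(p.sum(axis=2) - m12).max() < tol:
                break
        return p

    # Toy check: impose marginals taken from a known joint q, starting uniform.
    rng = np.random.default_rng(0)
    q = rng.random((3, 3, 3)); q /= q.sum()
    fitted = ipf_trivariate(np.full((3, 3, 3), 1 / 27),
                            q.sum(axis=2), q.sum(axis=1), q.sum(axis=0))
    print(np.abs(fitted.sum(axis=2) - q.sum(axis=2)).max())  # ~0 at convergence
    ```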

  2. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of hindcast wave height, wind speed, and current velocity data from the Bohai Sea are sampled for a case study. Four candidate models for the marginal distributions of wave height, wind speed, and current velocity are considered, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those given by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
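
    The construction in this abstract (parametric marginals tied together by an Archimedean copula, with design values read off joint exceedance probabilities) can be sketched in a few lines. The example below uses a Clayton copula and Pearson Type III marginals; every numeric parameter is a made-up stand-in, not a value from the paper.

    ```python
    import numpy as np
    from scipy import stats

    def clayton_cdf(u, v, theta):
        """Clayton copula C(u,v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
        return (u**-theta + v**-theta - 1.0)**(-1.0 / theta)

    # Pearson III marginals as selected in the abstract; skews, locations,
    # scales, theta and the design levels are all illustrative assumptions.
    F_h = stats.pearson3(skew=0.8, loc=4.0, scale=1.2)    # wave height (m)
    F_w = stats.pearson3(skew=0.5, loc=18.0, scale=4.0)   # wind speed (m/s)

    h, w, theta = 8.0, 30.0, 1.5
    u, v = F_h.cdf(h), F_w.cdf(w)
    # P(H > h, W > w) by inclusion-exclusion on the joint CDF C(F_h, F_w):
    p_and = 1.0 - u - v + clayton_cdf(u, v, theta)
    # If the samples are annual maxima, the "AND" return period is 1/p_and.
    print(f"joint exceedance = {p_and:.3e}; return period = {1 / p_and:.0f} yr")
    ```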

  3. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, and joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
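
    The quantities this program computes are straightforward to reproduce with modern libraries. The sketch below evaluates a rectangular probability, and a conditional normal probability, for a bivariate normal; the means, covariance, and region bounds are arbitrary illustrative choices.

    ```python
    import numpy as np
    from scipy import stats

    mu = np.array([0.0, 0.0])
    cov = np.array([[1.0, 0.6], [0.6, 1.0]])   # unit variances, correlation 0.6
    bvn = stats.multivariate_normal(mean=mu, cov=cov)

    # Rectangular probability P(a1<X<b1, a2<Y<b2) by inclusion-exclusion on the CDF.
    a, b = np.array([-1.0, -0.5]), np.array([1.0, 1.5])
    p_rect = (bvn.cdf(b) - bvn.cdf([a[0], b[1]])
              - bvn.cdf([b[0], a[1]]) + bvn.cdf(a))

    # The marginal of X is N(mu1, s11); the conditional of Y given X=x is normal
    # with the textbook mean shift and variance reduction.
    x = 0.8
    mu_c = mu[1] + cov[0, 1] / cov[0, 0] * (x - mu[0])
    var_c = cov[1, 1] - cov[0, 1]**2 / cov[0, 0]
    p_cond = stats.norm(mu_c, np.sqrt(var_c)).cdf(1.0)   # P(Y < 1 | X = 0.8)
    print(p_rect, p_cond)
    ```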

  4. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence between rainfall and runoff needs to be studied. Conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or impose a fixed form on the marginal distributions. This paper therefore presents an approach that derives the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. Modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; and (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The univariate analysis shows that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the log-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) K-S goodness-of-fit tests confirm that the re-derived marginal distributions recover the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution captures dependence structures that conventional bivariate joint distributions cannot.
    [Figures: joint rainfall-runoff entropy-based PDF with corresponding marginal PDFs and histograms for the W12 watershed; table of K-S test results and RMSE for univariate distributions derived from the maximum-entropy joint probability distribution.]
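
    Step (iii) of the procedure above can be prototyped by solving the convex dual of the maximum-entropy problem on a discretized grid, using the constraint set the abstract lists (first moments, first log-moments, and the cross moment). The grid and target moments below are invented purely to exercise the code.

    ```python
    import numpy as np
    from scipy import optimize

    # Discrete maximum-entropy joint density p(x, y) on a grid, constrained by
    # E[X], E[Y], E[ln X], E[ln Y] and E[XY]. Targets are made-up numbers.
    x = np.linspace(0.1, 10.0, 120)
    X, Y = np.meshgrid(x, x, indexing='ij')
    feats = np.stack([X, Y, np.log(X), np.log(Y), X * Y])
    targets = np.array([3.0, 2.5, 0.9, 0.7, 8.5])

    def dual(lam):
        """Convex dual of the maxent problem: log Z(lam) - lam . targets."""
        logits = np.tensordot(lam, feats, axes=1)
        m = logits.max()                       # stabilize the exponentials
        logZ = m + np.log(np.exp(logits - m).sum())
        p = np.exp(logits - logZ)
        return logZ - lam @ targets, np.tensordot(feats, p, axes=2) - targets

    res = optimize.minimize(dual, np.zeros(5), jac=True, method='BFGS')
    logits = np.tensordot(res.x, feats, axes=1)
    p = np.exp(logits - logits.max())
    p /= p.sum()
    print(np.tensordot(feats, p, axes=2))      # ≈ targets when the fit converges
    ```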

  5. Quantum Jeffreys prior for displaced squeezed thermal states

    NASA Astrophysics Data System (ADS)

    Kwek, L. C.; Oh, C. H.; Wang, Xiang-Bin

    1999-09-01

    It is known that, by extending the equivalence of the Fisher information matrix to its quantum version, the Bures metric, the quantum Jeffreys prior can be determined from the volume element of the Bures metric. We compute the Bures metric for the displaced squeezed thermal state and analyse the quantum Jeffreys prior and its marginal probability distributions. To normalize the marginal probability density function, it is necessary to provide a range of values of the squeezing parameter or the inverse temperature. We find that if the range of the squeezing parameter is kept narrow, there are significant differences in the marginal probability density functions in terms of the squeezing parameters for the displaced and undisplaced situations. However, these differences disappear as the range increases. Furthermore, marginal probability density functions against temperature are very different in the two cases.

  6. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  7. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.

  8. TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  9. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
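
    Two of the compared weight constructions, the normal-density weights and the quantile-binning weights, are easy to sketch. The toy data-generating process below (one confounder, a linear exposure model, a crude median split for the conditional bin probabilities) is an illustrative assumption, not the authors' simulation design.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 5000
    L = rng.normal(size=n)                     # a single confounder
    A = 0.5 * L + rng.normal(size=n)           # continuous exposure

    # Normal-density stabilized weights: sw = f(A) / f(A | L).
    beta = np.polyfit(L, A, 1)                 # linear model for E[A | L]
    resid = A - np.polyval(beta, L)
    f_cond = stats.norm(np.polyval(beta, L), resid.std()).pdf(A)
    f_marg = stats.norm(A.mean(), A.std()).pdf(A)
    sw_normal = f_marg / f_cond

    # Quantile-binning weights: replace densities by bin probabilities.
    K = 10
    edges = np.quantile(A, np.linspace(0, 1, K + 1))
    cat = np.digitize(A, edges[1:-1])          # exposure category 0..K-1
    p_marg = np.bincount(cat, minlength=K) / n
    Lbin = (L > np.median(L)).astype(int)      # coarse confounder stratum
    p_cond = np.array([np.bincount(cat[Lbin == g], minlength=K)
                       / max((Lbin == g).sum(), 1) for g in (0, 1)])
    sw_bin = p_marg[cat] / p_cond[Lbin, cat]
    print(sw_normal.mean(), sw_bin.mean())     # stabilized weights average near 1
    ```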

  10. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  11. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.

  12. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities, in both the dependence of flood variables and the marginal distributions, on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE because they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is used to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probabilities of exceedance calculated from copula functions and from marginal distributions. This study provides, for the first time, a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could benefit cost-benefit based non-stationary bivariate design flood estimation across the world.

  13. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach

    PubMed Central

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2018-01-01

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach, and has several attractive features compared to existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, since the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. PMID:26303591
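
    The "closed-form likelihood, no link function" property makes the composite likelihood easy to prototype: each margin contributes an ordinary beta-binomial log-likelihood, and the two margins are summed as if independent. The sketch below uses invented counts and omits the between-margin correlation parameter of the full model.

    ```python
    import numpy as np
    from scipy import stats, optimize

    # Each study i reports events/trials for two outcomes (invented numbers).
    k1 = np.array([45, 30, 51, 12]); n1 = np.array([50, 35, 60, 15])  # outcome 1
    k2 = np.array([40, 28, 44, 10]); n2 = np.array([48, 33, 55, 14])  # outcome 2

    def neg_composite_loglik(params):
        a1, b1, a2, b2 = np.exp(params)    # log-parametrization keeps a, b > 0
        return -(stats.betabinom.logpmf(k1, n1, a1, b1).sum()
                 + stats.betabinom.logpmf(k2, n2, a2, b2).sum())

    res = optimize.minimize(neg_composite_loglik, np.zeros(4),
                            method='Nelder-Mead')
    a1, b1, a2, b2 = np.exp(res.x)
    # Marginal mean event probabilities on the original scale:
    print(a1 / (a1 + b1), a2 / (a2 + b2))
    ```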

  14. TU-AB-BRB-03: Coverage-Based Treatment Planning to Accommodate Organ Deformable Motions and Contouring Uncertainties for Prostate Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  15. TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unkelbach, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  16. TU-AB-BRB-00: New Methods to Ensure Target Coverage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  17. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    PubMed

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined by integration over the other parameters. The approach is an alternative way of obtaining the posterior probability distribution, one that does not require an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper, the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distributions for the sound speed ratio at the surface of the seabed and for the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America
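
    The canonical form and the marginalization step can be illustrated on a discrete parameter grid: exponentiate -beta times the error surface, normalize, and sum out the nuisance parameter. The two-parameter grid and the toy quadratic error function below are illustrative assumptions, not the paper's geoacoustic model.

    ```python
    import numpy as np

    # Canonical (Gibbs-form) conditional distribution p(m) ∝ exp(-beta * E(m))
    # on a 2-D hypothesis grid; E here is a made-up quadratic error surface.
    c_ratio = np.linspace(0.95, 1.15, 201)     # seabed sound-speed ratio grid
    src_lvl = np.linspace(120.0, 160.0, 161)   # towed-source level grid (dB)
    C, S = np.meshgrid(c_ratio, src_lvl, indexing='ij')
    E = (C - 1.05)**2 / 0.002 + (S - 140.0)**2 / 50.0
    beta = 1.0                                  # sensitivity factor to the error
    p = np.exp(-beta * E)
    p /= p.sum()

    marg_c = p.sum(axis=1)   # marginal: integrate out the source level
    marg_s = p.sum(axis=0)   # marginal: integrate out the sound-speed ratio
    print(c_ratio[marg_c.argmax()], src_lvl[marg_s.argmax()])
    ```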

  18. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    PubMed

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Combined risk assessment of nonstationary monthly water quality based on Markov chain and time-varying copula.

    PubMed

    Shi, Wei; Xia, Jun

    2017-02-01

    Water quality risk management is a topic of global research interest closely linked to sustainable water resource development. Ammonium nitrogen (NH3-N) and the permanganate index (CODMn), the focal indicators in the Huai River Basin, are selected to reveal their joint transition laws based on Markov theory. A time-varying moments model, with either time or a land cover index as the explanatory variable, is applied to build the time-varying marginal distributions of the water quality time series. A time-varying copula model, which takes into consideration the non-stationarity in the marginal distributions and/or the time variation in the dependence structure between the water quality series, is constructed to describe a bivariate frequency analysis for the NH3-N and CODMn series at the same monitoring gauge. The larger first-order Markov joint transition probabilities indicate that water quality states Class Vw, Class IV, and Class III occur readily in the water body at Bengbu Sluice. Both the marginal distribution and copula models are nonstationary, and the explanatory variable time yields better performance than the land cover index in describing the non-stationarities in the marginal distributions. In modelling changes in the dependence structure, the time-varying copula has a better fitting performance than a copula with a constant or time-trend dependence parameter. The largest synchronous encounter risk probability of NH3-N and CODMn simultaneously reaching Class V is 50.61%, while the asynchronous encounter risk probability is largest when NH3-N and CODMn are inferior to the Class V and Class IV water quality standards, respectively.
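
    The first-order Markov component of this analysis amounts to estimating a transition-probability matrix from the sequence of monthly water-quality classes. A minimal sketch, with an invented class coding and random toy data:

    ```python
    import numpy as np

    def transition_matrix(states, k):
        """First-order Markov transition probabilities estimated from a
        categorical sequence whose states are coded 0..k-1."""
        counts = np.zeros((k, k))
        for s, t in zip(states[:-1], states[1:]):
            counts[s, t] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

    # Toy monthly classes (e.g. III=0, IV=1, V=2, worse-than-V=3); the coding
    # and the random sequence are illustrative, not the paper's data.
    rng = np.random.default_rng(2)
    seq = rng.integers(0, 4, size=240)
    P = transition_matrix(seq, 4)
    print(P.round(2))   # row i: P(next class = j | current class = i)
    ```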

  20. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC) methodology, developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and a capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals for the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative way to gauge the increase in statistical accuracy gained by performing additional simulations.
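
    A minimal version of the RISMC comparison, with a classical confidence interval of the kind proposed, is sketched below: draw load and capacity samples, count exceedances, and attach an exact Clopper-Pearson interval to the failure probability. Both distributions are hypothetical stand-ins.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 10_000
    load = rng.lognormal(mean=0.0, sigma=0.3, size=n)      # hypothetical load
    capacity = rng.lognormal(mean=0.5, sigma=0.2, size=n)  # hypothetical capacity

    fails = int((load >= capacity).sum())
    p_hat = fails / n

    # Exact (Clopper-Pearson) 95% interval on the failure probability;
    # widening with small n quantifies the value of extra simulations.
    lo = stats.beta.ppf(0.025, fails, n - fails + 1) if fails > 0 else 0.0
    hi = stats.beta.ppf(0.975, fails + 1, n - fails) if fails < n else 1.0
    print(f"p_fail ≈ {p_hat:.4f}, 95% CI ({lo:.4f}, {hi:.4f})")
    ```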

  1. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Fréchet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
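
    The stated representation, a bivariate extreme-value CDF built from marginal extremal distributions and a dependence function, can be illustrated with the logistic (Gumbel) dependence model and Fréchet marginals with a lower bound of zero. Parameter values are arbitrary illustrative choices.

    ```python
    import numpy as np

    def frechet_cdf(x, alpha, scale):
        """Fréchet marginal with lower bound zero."""
        x = np.asarray(x, dtype=float)
        return np.where(x > 0, np.exp(-(scale / x)**alpha), 0.0)

    def bivariate_ev_cdf(x, y, Fx, Fy, a):
        """Logistic dependence model: F(x,y) = exp(-V) with
        V = ((-ln Fx)^(1/a) + (-ln Fy)^(1/a))^a and 0 < a <= 1
        (a = 1 gives independence, a -> 0 full dependence)."""
        u, v = -np.log(Fx(x)), -np.log(Fy(y))
        return np.exp(-(u**(1.0 / a) + v**(1.0 / a))**a)

    Fx = lambda x: frechet_cdf(x, alpha=2.0, scale=1.0)
    Fy = lambda y: frechet_cdf(y, alpha=3.0, scale=2.0)
    print(bivariate_ev_cdf(1.5, 2.5, Fx, Fy, a=0.6))
    ```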

  2. Diffusion of active chiral particles

    NASA Astrophysics Data System (ADS)

    Sevilla, Francisco J.

    2016-12-01

    The diffusion of chiral active Brownian particles in three-dimensional space is studied analytically, by consideration of the corresponding Fokker-Planck equation for the probability density of finding a particle at position x and moving along the direction v̂ at time t, and numerically, by the use of Langevin dynamics simulations. The analysis is focused on the marginal probability density of finding a particle at a given location and at a given time (independently of its direction of motion), which is found from an infinite hierarchy of differential-recurrence relations for the coefficients that appear in the multipole expansion of the probability distribution, which contains the whole kinematic information. This approach allows the explicit calculation of the time dependence of the mean-squared displacement and of the kurtosis of the marginal probability distribution, quantities from which the effective diffusion coefficient and the 'shape' of the position distribution are examined. Oscillations between two characteristic values were found in the time evolution of the kurtosis, namely, between the value that corresponds to a Gaussian and the one that corresponds to a distribution of spherical-shell shape. In the case of an ensemble of particles, each one rotating around a uniformly distributed random axis, evidence is found of the so-called 'anomalous, yet Brownian, diffusion' effect, in which particles follow a non-Gaussian position distribution even though the mean-squared displacement is a linear function of time.

  3. Technology-Enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that…

  4. Size distribution of submarine landslides along the U.S. Atlantic margin

    USGS Publications Warehouse

    Chaytor, J.D.; ten Brink, Uri S.; Solow, A.R.; Andrews, B.D.

    2009-01-01

    Assessment of the probability for destructive landslide-generated tsunamis depends on the knowledge of the number, size, and frequency of large submarine landslides. This paper investigates the size distribution of submarine landslides along the U.S. Atlantic continental slope and rise using the size of the landslide source regions (landslide failure scars). Landslide scars along the margin identified in a detailed bathymetric Digital Elevation Model (DEM) have areas that range between 0.89 km² and 2410 km² and volumes between 0.002 km³ and 179 km³. The area to volume relationship of these failure scars is almost linear (inverse power-law exponent close to 1), suggesting a fairly uniform failure thickness of a few tens of meters in each event, with only rare, deep excavating landslides. The cumulative volume distribution of the failure scars is very well described by a log-normal distribution rather than by an inverse power-law, the most commonly used distribution for both subaerial and submarine landslides. A log-normal distribution centered on a volume of 0.86 km³ may indicate that landslides preferentially mobilize a moderate amount of material (on the order of 1 km³), rather than large landslides or very small ones. Alternatively, the log-normal distribution may reflect an inverse power law distribution modified by a size-dependent probability of observing landslide scars in the bathymetry data. If the latter is the case, an inverse power-law distribution with an exponent of 1.3 ± 0.3, modified by a size-dependent conditional probability of identifying more failure scars with increasing landslide size, fits the observed size distribution. This exponent value is similar to the predicted exponent of 1.2 ± 0.3 for subaerial landslides in unconsolidated material. Both the log-normal and modified inverse power-law distributions of the observed failure scar volumes suggest that large landslides, which have the greatest potential to generate damaging tsunamis, occur infrequently along the margin. © 2008 Elsevier B.V.
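
    The competing size-distribution hypotheses can be compared directly with standard fitting and goodness-of-fit tools. The sketch below fits a log-normal and an inverse power-law (Pareto) to synthetic scar volumes spanning the reported range; the data and parameters are fabricated for illustration only, and the K-S p-values ignore the bias from fitting and testing on the same sample.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # Synthetic scar volumes (km^3) clipped to the paper's reported range.
    vols = np.clip(rng.lognormal(mean=np.log(0.86), sigma=1.8, size=300),
                   0.002, 179.0)

    # Log-normal fit (loc fixed at 0 so the fit acts on log-volume directly).
    shape, loc, scale = stats.lognorm.fit(vols, floc=0)
    ks_ln = stats.kstest(vols, 'lognorm', args=(shape, loc, scale))

    # Pareto (inverse power-law) fit above the smallest observed volume.
    b, loc_p, scale_p = stats.pareto.fit(vols, floc=0, fscale=vols.min())
    ks_pl = stats.kstest(vols, 'pareto', args=(b, loc_p, scale_p))
    print(f"lognormal KS p = {ks_ln.pvalue:.3f}, pareto KS p = {ks_pl.pvalue:.3f}")
    ```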

  5. An assessment of PTV margin based on actual accumulated dose for prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Wen, Ning; Kumarasiri, Akila; Nurushev, Teamour; Burmeister, Jay; Xing, Lei; Liu, Dezhi; Glide-Hurst, Carri; Kim, Jinkoo; Zhong, Hualiang; Movsas, Benjamin; Chetty, Indrin J.

    2013-11-01

    The purpose of this work is to present the results of a margin reduction study involving dosimetric and radiobiologic assessment of cumulative dose distributions, computed using an image guided adaptive radiotherapy based framework. Eight prostate cancer patients, treated with 7-9, 6 MV, intensity modulated radiation therapy (IMRT) fields, were included in this study. The workflow consists of cone beam CT (CBCT) based localization, deformable image registration of the CBCT to simulation CT image datasets (SIM-CT), dose reconstruction and dose accumulation on the SIM-CT, and plan evaluation using radiobiological models. For each patient, three IMRT plans were generated with different margins applied to the CTV. The PTV margin for the original plan was 10 mm and 6 mm at the prostate/anterior rectal wall interface (10/6 mm) and was reduced to: (a) 5/3 mm, and (b) 3 mm uniformly. The average percent reductions in predicted tumor control probability (TCP) in the accumulated (actual) plans in comparison to the original plans over the eight patients were 0.4%, 0.7%, and 11.0% with the 10/6 mm, 5/3 mm, and uniform 3 mm margins, respectively. The mean increase in predicted normal tissue complication probability (NTCP) for grades 2/3 rectal bleeding for the actual plans in comparison to the static plans with margins of 10/6, 5/3, and 3 mm was 3.5%, 2.8%, and 2.4%, respectively. For the actual dose distributions, predicted NTCP for late rectal bleeding was reduced by 3.6% on average when the margin was reduced from 10/6 mm to 5/3 mm, and further reduced by 1.0% on average when the margin was reduced to 3 mm. The average reduction in complication-free tumor control probability (P+) in the actual plans in comparison to the original plans with margins of 10/6, 5/3, and 3 mm was 3.7%, 2.4%, and 13.6%, respectively. The significant reduction of TCP and P+ in the actual plan with the 3 mm margin came from one outlier, for which individualizing the patient's treatment plan through margin adaptation based on biological models might have yielded a higher quality treatment.

  6. Computation of marginal distributions of peak-heights in electropherograms for analysing single source and mixture STR DNA samples.

    PubMed

    Cowell, Robert G

    2018-05-04

    Current models for single source and mixture samples, and probabilistic genotyping software based on them used for analysing STR electropherogram data, assume simple probability distributions, such as the gamma distribution, to model the allelic peak height variability given the initial amount of DNA prior to PCR amplification. Here we illustrate how amplicon number distributions, for a model of the process of sample DNA collection and PCR amplification, may be efficiently computed by evaluating probability generating functions using discrete Fourier transforms. Copyright © 2018 Elsevier B.V. All rights reserved.
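
    The computational trick named in this abstract, recovering a probability mass function by evaluating its generating function at roots of unity and inverting with a DFT, is compact enough to sketch. The per-cycle doubling model and all numbers below are toy assumptions, not the paper's model of collection and amplification.

    ```python
    import numpy as np

    def pmf_from_pgf(G, N):
        """Recover P(X = 0..N-1) from the PGF G by evaluating it at the N-th
        roots of unity and applying a forward DFT with a 1/N factor
        (exact whenever P(X >= N) is negligible)."""
        z = np.exp(2j * np.pi * np.arange(N) / N)
        return np.real(np.fft.fft(G(z))) / N

    # Toy model: n0 template molecules are each collected with probability pi
    # (binomial thinning), then every molecule duplicates with efficiency p in
    # each of c PCR cycles. All numbers are invented.
    n0, pi, p, c = 30, 0.6, 0.8, 8

    def pgf_total(z):
        g = z
        for _ in range(c):              # compose the per-cycle PGF c times:
            g = g * ((1 - p) + p * g)   # each molecule stays 1 or becomes 2
        return (1 - pi + pi * g)**n0    # PGF of the thinned initial count

    pmf = pmf_from_pgf(pgf_total, N=1 << 16)
    print(pmf.sum(), pmf.argmax())      # ≈ 1.0, and the modal amplicon count
    ```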

  7. On the inequivalence of the CH and CHSH inequalities due to finite statistics

    NASA Astrophysics Data System (ADS)

    Renou, M. O.; Rosset, D.; Martin, A.; Gisin, N.

    2017-06-01

    Different variants of a Bell inequality, such as CHSH and CH, are known to be equivalent when evaluated on nonsignaling outcome probability distributions. However, in experimental setups, the outcome probability distributions are estimated using a finite number of samples. Therefore the nonsignaling conditions are only approximately satisfied and the robustness of the violation depends on the chosen inequality variant. We explain that phenomenon using the decomposition of the space of outcome probability distributions under the action of the symmetry group of the scenario, and propose a method to optimize the statistical robustness of a Bell inequality. In the process, we describe the finite group composed of relabeling of parties, measurement settings and outcomes, and identify correspondences between the irreducible representations of this group and properties of outcome probability distributions such as normalization, signaling or having uniform marginals.

  8. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  9. Two tandem queues with general renewal input. 2: Asymptotic expansions for the diffusion model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knessl, C.; Tier, C.

    1999-10-01

    In Part 1 the authors formulated and solved a diffusion model for two tandem queues with exponential servers and general renewal arrivals. They thus obtained the heavy traffic diffusion approximation to the steady state joint queue length distribution for this network. Here they study asymptotic and numerical properties of the diffusion approximation. In particular, analytical expressions are obtained for the tail probabilities. Both the joint distribution of the two queues and the marginal distribution of the second queue are considered. They also give numerical illustrations of how this marginal is affected by changes in the arrival and service processes.

  10. Tomographic measurement of joint photon statistics of the twin-beam quantum state

    PubMed

    Vasilyev; Choi; Kumar; D'Ariano

    2000-03-13

    We report the first measurement of the joint photon-number probability distribution for a two-mode quantum state created by a nondegenerate optical parametric amplifier. The measured distributions exhibit up to 1.9 dB of quantum correlation between the signal and idler photon numbers, whereas the marginal distributions are thermal as expected for parametric fluorescence.

  11. Ecological and evolutionary processes at expanding range margins.

    PubMed

    Thomas, C D; Bodsworth, E J; Wilson, R J; Simmons, A D; Davies, Z G; Musche, M; Conradt, L

    2001-05-31

    Many animals are regarded as relatively sedentary and specialized in marginal parts of their geographical distributions. They are expected to be slow at colonizing new habitats. Despite this, the cool margins of many species' distributions have expanded rapidly in association with recent climate warming. We examined four insect species that have expanded their geographical ranges in Britain over the past 20 years. Here we report that two butterfly species have increased the variety of habitat types that they can colonize, and that two bush cricket species show increased fractions of longer-winged (dispersive) individuals in recently founded populations. Both ecological and evolutionary processes are probably responsible for these changes. Increased habitat breadth and dispersal tendencies have resulted in about 3- to 15-fold increases in expansion rates, allowing these insects to cross habitat disjunctions that would have represented major or complete barriers to dispersal before the expansions started. The emergence of dispersive phenotypes will increase the speed at which species invade new environments, and probably underlies the responses of many species to both past and future climate change.

  12. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  13. Dosimetric and radiobiological consequences of computed tomography-guided adaptive strategies for intensity modulated radiation therapy of the prostate.

    PubMed

    Battista, Jerry J; Johnson, Carol; Turnbull, David; Kempe, Jeff; Bzdusek, Karl; Van Dyk, Jacob; Bauman, Glenn

    2013-12-01

    To examine a range of scenarios for image-guided adaptive radiation therapy of prostate cancer, including different schedules for megavoltage CT imaging, patient repositioning, and dose replanning. We simulated multifraction dose distributions with deformable registration using 35 sets of megavoltage CT scans of 13 patients. We computed cumulative dose-volume histograms, from which tumor control probabilities and normal tissue complication probabilities (NTCPs) for rectum were calculated. Five-field intensity modulated radiation therapy (IMRT) with 18-MV x-rays was planned to achieve an isocentric dose of 76 Gy to the clinical target volume (CTV). The differences between D95, tumor control probability, V70Gy, and NTCP for rectum, for accumulated versus planned dose distributions, were compared for different target volume sizes, margins, and adaptive strategies. The CTV D95 for IMRT treatment plans, averaged over 13 patients, was 75.2 Gy. Using the largest CTV margins (10/7 mm), the D95 values accumulated over 35 fractions were within 2% of the planned value, regardless of the adaptive strategy used. For tighter margins (5 mm), the average D95 values dropped to approximately 73.0 Gy even with frequent repositioning, and daily replanning was necessary to correct this deficit. When personalized margins were applied to an adaptive CTV derived from the first 6 treatment fractions using the STAPLE (Simultaneous Truth and Performance Level Estimation) algorithm, target coverage could be maintained using a single replan 1 week into therapy. For all approaches, normal tissue parameters (rectum V70Gy and NTCP) remained within acceptable limits. The frequency of adaptive interventions depends on the size of the CTV combined with target margins used during IMRT optimization. The application of adaptive target margins (<5 mm) to an adaptive CTV determined 1 week into therapy minimizes the need for subsequent dose replanning. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis, based upon a perceived ruling driver, to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.

  15. Local linear estimation of concordance probability with application to covariate effects models on association for bivariate failure-time data.

    PubMed

    Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing

    2015-01-01

    Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.
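
    The one-to-one correspondence mentioned here is explicit for the Clayton copula: Kendall's tau equals theta/(theta+2), so the concordance probability (tau+1)/2 equals (theta+1)/(theta+2). The sketch below estimates the concordance probability empirically and inverts it; there are no covariates and no censoring, so it is a far simpler setting than the paper's local linear estimator.

    ```python
    import numpy as np

    def concordance_probability(x, y):
        """Empirical P{(X1 - X2)(Y1 - Y2) > 0} over all distinct pairs."""
        dx = np.sign(x[:, None] - x[None, :])
        dy = np.sign(y[:, None] - y[None, :])
        iu = np.triu_indices(len(x), k=1)
        return np.mean((dx * dy)[iu] > 0)

    def clayton_theta(pc):
        """Invert pc = (theta + 1)/(theta + 2), i.e. tau = theta/(theta + 2)."""
        return (2.0 * pc - 1.0) / (1.0 - pc)

    # Check on data simulated from a Clayton(theta = 2) copula via the
    # conditional-distribution method.
    rng = np.random.default_rng(5)
    theta, n = 2.0, 1000
    u, w = rng.random(n), rng.random(n)
    v = ((w**(-theta / (1.0 + theta)) - 1.0) * u**(-theta) + 1.0)**(-1.0 / theta)
    print(clayton_theta(concordance_probability(u, v)))  # ≈ 2 up to noise
    ```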

  16. Living with marginal coral communities: Diversity and host-specificity in coral-associated barnacles in the northern coral distribution limit of the East China Sea.

    PubMed

    Chan, Benny K K; Xu, Guang; Kim, Hyun Kyong; Park, Jin-Ho; Kim, Won

    2018-01-01

    Corals and their associated fauna are extremely diverse in tropical waters and form major reefs. In the high-latitude temperate zone, corals living near their distribution limit are considered marginal communities because they are particularly sensitive to environmental and climatic changes. In this study, we examined the diversity and host usage of coral-associated barnacles on Jeju Island, Korea, the northern coral distribution limit in the East China Sea. Only three coral-associated barnacle species, from two genera in two subfamilies, were collected. The pyrgomatinid barnacles Cantellius arcuatus and Cantellius cf. euspinulosum were found only on the corals Montipora millepora and Alveopora japonica, respectively. The megatrematinid barnacle Pyrgomina oulastreae, a relative generalist, was found on Psammocora spp. (both profundacella and albopicta) and Oulastrea crispata corals. The host usage of these three barnacles does not overlap. DNA barcode sequences of the C. arcuatus specimens collected in the present study matched those of specimens collected in Kochi (Japan), Taiwan, Malaysia, and Papua New Guinea, suggesting that this species has a wide geographical distribution. C. arcuatus occupies a wider host range in Taiwanese waters, inhabiting Montipora spp. and Porites spp., which suggests that the host specificity of coral-associated barnacles varies with host availability. C. cf. euspinulosum probably has a very narrow distribution and host usage; its sequences from Jeju Island do not match any known sequences of Cantellius barnacles in the Indo-Pacific region. P. oulastreae probably prefers cold water, as it has so far been reported only from temperate regions. Coral-associated barnacles in marginal communities have considerably lower diversity than their subtropical and tropical counterparts, and when host availability is limited, they exhibit higher host specificity than those in subtropical and tropical reef systems.

  17. Appearance of De Geer moraines in southern and western Finland - Implications for reconstructing glacier retreat dynamics

    NASA Astrophysics Data System (ADS)

    Ojala, Antti E. K.

    2016-02-01

    LiDAR digital elevation models (DEMs) from southern and western Finland were investigated to map and discriminate De Geer moraines, sparser and more scattered end moraines, and larger end moraine features (i.e., ice-marginal complexes). The results indicate that the occurrence and distribution of De Geer moraines and scattered end moraine ridges in Finland are more widespread than previously suggested. This probably reflects the ease of detecting and mapping these features with high-resolution DEMs, and indicates the efficiency of LiDAR applications in geological and geomorphological studies. The variable appearance and distribution of moraine ridges in Finland support previous interpretations that no single model is likely to be appropriate for the genesis of De Geer moraines at all localities and for all types of end moraines. De Geer moraine appearance and interdistances probably result from a combination of the general rapidity of ice-margin recession during deglaciation, the proglacial water depth in which the moraines were formed, and local glacier dynamics related to climate and terrain topography. The correlation between the varved-clay-based rate of deglaciation and the interdistances of distinct, regularly spaced De Geer moraine ridges indicates that the rate of deglaciation is probably involved in the ridge-forming process, but more thorough comparisons are needed to understand the extent to which De Geer interdistances represent an annual rate of ice-margin decay and the rapidity of regional deglaciation.

  18. Is Einsteinian no-signalling violated in Bell tests?

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2017-11-01

    Relativistic invariance is a physical law verified in several domains of physics. The impossibility of faster-than-light influences is not questioned by quantum theory; in quantum electrodynamics, in quantum field theory, and in the standard model, relativistic invariance is incorporated by construction. Quantum mechanics predicts strong long-range correlations between outcomes of spin projection measurements performed in distant laboratories. In spite of these strong correlations, the marginal probability distributions should not depend on what was measured in the other laboratory, a property known in short as no-signalling. In several experiments performed to test various Bell-type inequalities, some unexplained dependence of the empirical marginal probability distributions on the distant settings was observed. In this paper we demonstrate how a particular identification and selection procedure for paired distant outcomes is the most probable cause of this apparent violation of the no-signalling principle. This unexpected setting dependence therefore does not prove the existence of superluminal influences, and the Einsteinian no-signalling principle has to be tested differently in dedicated experiments. We propose a detailed protocol for how such experiments should be designed in order to be conclusive. We also explain how seemingly magical quantum correlations may be explained in a locally causal way.
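
    The mechanism the author describes can be reproduced with a toy locally causal model: if detection at each station depends only on the local setting and a shared hidden variable, post-selecting coincident detections makes Alice's empirical marginal depend on Bob's distant setting even though nothing superluminal occurs. The sketch below is an illustrative, assumption-laden toy, not the author's analysis of the actual experiments.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 400000
      lam = rng.uniform(0.0, 2.0 * np.pi, n)        # shared hidden variable

      def alice_marginal(x, y):
          """Everything local: outcome and detection depend only on (lam, own setting)."""
          a = np.where(np.cos(lam - x) >= 0, 1, -1)             # Alice's outcome
          det_a = rng.random(n) < 0.5 + 0.4 * np.cos(lam - x)   # Alice's detection
          det_b = rng.random(n) < 0.5 + 0.4 * np.cos(lam - y)   # Bob's detection
          coinc = det_a & det_b                     # keep only paired (coincident) events
          return np.mean(a[coinc] == 1)

      # Alice's setting is fixed; only Bob's distant setting changes.
      p1 = alice_marginal(x=0.0, y=np.pi / 8)
      p2 = alice_marginal(x=0.0, y=3 * np.pi / 8)
      print(f"P(a=+1 | x, y1) = {p1:.3f}   P(a=+1 | x, y2) = {p2:.3f}")
      # The post-selected marginals differ although no mechanism links Alice to y:
      # conditioning on joint detection biases the retained lam values toward y.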

  19. Bayesian source tracking via focalization and marginalization in an uncertain Mediterranean Sea environment.

    PubMed

    Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L

    2010-07-01

    This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out.
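
    The difference between the two approaches is easy to see on a toy posterior with one source parameter (range r) and one environmental nuisance parameter (h): focalization keeps the r of the joint maximizer, while marginalization integrates h out before locating the peak, and also yields track uncertainties directly. All densities and values in the grid-based sketch below are illustrative assumptions, not the paper's ocean-acoustic model.

      import numpy as np

      # Toy unnormalized PPD over source range r and an environmental nuisance h:
      # a sharp but narrow peak plus a broad lobe, so the two estimates disagree.
      r = np.linspace(0.0, 10.0, 501)
      h = np.linspace(0.0, 1.0, 201)
      R, H = np.meshgrid(r, h, indexing="ij")
      ppd = (np.exp(-0.5 * ((R - 4.0) ** 2 / 0.1 + (H - 0.9) ** 2 / 0.001))
             + 0.5 * np.exp(-0.5 * ((R - 6.0) ** 2 / 1.0 + (H - 0.5) ** 2 / 0.1)))

      # Focalization: maximize jointly over (r, h); keep the r of the maximizer.
      i, j = np.unravel_index(np.argmax(ppd), ppd.shape)
      r_focal = r[i]

      # Marginalization: integrate over h first, then locate the most probable r.
      dr, dh = r[1] - r[0], h[1] - h[0]
      marg = ppd.sum(axis=1) * dh                   # marginal posterior over r
      r_marg = r[np.argmax(marg)]

      # Track uncertainty comes for free from the marginal distribution:
      p = marg / (marg.sum() * dr)
      mean = (r * p).sum() * dr
      sd = np.sqrt(((r - mean) ** 2 * p).sum() * dr)
      print(f"focalization: r = {r_focal:.2f}   marginalization: r = {r_marg:.2f} +/- {sd:.2f}")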

  20. Kinetic energy as functional of the correlation hole

    NASA Astrophysics Data System (ADS)

    Nalewajski, Roman F.

    2003-01-01

    Using the marginal decomposition of the many-body probability distribution, the electronic kinetic energy is expressed as a functional of the electron density and the correlation hole. The analysis covers both the molecule as a whole and its constituent subsystems. The importance of the Fisher information for locality is emphasized.

  1. Bayesian multiple-source localization in an uncertain ocean environment.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2011-06-01

    This paper considers simultaneous localization of multiple acoustic sources when properties of the ocean environment (water column and seabed) are poorly known. A Bayesian formulation is developed in which the environmental parameters, noise statistics, and locations and complex strengths (amplitudes and phases) of multiple sources are considered to be unknown random variables constrained by acoustic data and prior information. Two approaches are considered for estimating source parameters. Focalization maximizes the posterior probability density (PPD) over all parameters using adaptive hybrid optimization. Marginalization integrates the PPD using efficient Markov-chain Monte Carlo methods to produce joint marginal probability distributions for source ranges and depths, from which source locations are obtained. This approach also provides quantitative uncertainty analysis for all parameters, which can aid in understanding the inverse problem and may be of practical interest (e.g., source-strength probability distributions). In both approaches, closed-form maximum-likelihood expressions for source strengths and noise variance at each frequency allow these parameters to be sampled implicitly, substantially reducing the dimensionality and difficulty of the inversion. Examples are presented of both approaches applied to single- and multi-frequency localization of multiple sources in an uncertain shallow-water environment, and a Monte Carlo performance evaluation study is carried out. © 2011 Acoustical Society of America.

  2. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
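
    The backbone of the strategy, simulating a "parent" Gaussian process and back-transforming its marginal to the target distribution, can be sketched directly. The snippet below uses an AR(1) parent and a gamma target marginal with illustrative parameters; the paper's actual contribution, parametric correlation transformation functions that pre-adjust the parent's autocorrelation so the target correlation is preserved, is only pointed at in the closing comment.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      # 1. Simulate the "parent" Gaussian process: a stationary AR(1) with N(0,1) marginal.
      n, rho = 100000, 0.8
      z = np.empty(n)
      z[0] = rng.standard_normal()
      for t in range(1, n):
          z[t] = rho * z[t - 1] + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()

      # 2. Back-transform the marginal: Phi(z) is uniform, then apply the target ppf.
      #    Target marginal here: gamma(shape = 0.6), a skewed "hydroclimatic" choice.
      u = stats.norm.cdf(z)
      x = stats.gamma.ppf(u, a=0.6)

      # The target marginal is matched exactly; the autocorrelation is attenuated by
      # the nonlinear transform, which the paper's parametric correlation
      # transformation functions are designed to pre-compensate for.
      print("mean, var vs gamma(0.6):", x.mean().round(3), x.var().round(3), 0.6, 0.6)
      print("parent lag-1 correlation:", np.corrcoef(z[:-1], z[1:])[0, 1].round(3))
      print("target lag-1 correlation:", np.corrcoef(x[:-1], x[1:])[0, 1].round(3))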

  3. Generalized Tumor Dose for Treatment Planning Decision Support

    NASA Astrophysics Data System (ADS)

    Zuniga, Areli A.

    Modern radiation therapy techniques allow for improved target conformity and normal tissue sparing. These highly conformal treatment plans have enabled dose escalation techniques that increase the probability of tumor control. At the same time, this conformity has introduced inhomogeneous dose distributions, making characterization of the delivered dose more difficult. The concept of equivalent uniform dose (EUD) summarizes a heterogeneous dose distribution within an irradiated structure as a single value and has been used in biologically based treatment planning (BBTP); however, there are no substantial validation studies on clinical outcome data supporting EUD's use, and it has therefore not been widely adopted for decision-making support. Highly conformal treatment plans have also introduced the need for safety margins around the target volume. These margins are designed to minimize geometrical misses and to compensate for dosimetric and treatment delivery uncertainties; their purpose is to reduce the chance of tumor recurrence. This dissertation introduces a new EUD formulation designed specifically for tumor volumes, called generalized Tumor Dose (gTD). As a second objective, it investigates margin extensions for potential improvements in local control (LC) while maintaining or minimizing toxicity. The suitability of gTD for ranking LC was assessed by means of retrospective studies in head and neck (HN) squamous cell carcinoma (SCC) and non-small cell lung cancer (NSCLC) cohorts. The formulation was optimized on two datasets (one of each type), and model validation was then assessed on independent cohorts. The second objective was investigated by ranking the probability of LC of the primary disease when adding different margin sizes; for this purpose, an already published EUD formula was applied retrospectively to HN and NSCLC datasets. Finally, recommendations on the viability of implementing this new formulation in a routine treatment planning process, and on revising safety margins to improve local tumor control while maximizing normal tissue sparing in SCC of the HN and NSCLC, are discussed.

  4. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of pests in orchards can provide important information for designing monitoring schemes and establishing better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated, by using probability kriging. Adults of B. minax were captured during two successive occurrence periods in a small-scale citrus orchard by using food bait traps placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on both sides of the boundary between the orchard and the woods. The sequential probability kriged maps showed higher estimated probabilities of adult presence in the marginal zone, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots, and their distance to the forest edge, should be considered to enhance control of B. minax in small-scale orchards.

  5. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu Huijun; Gordon, J. James; Siebers, Jeffrey V.

    2011-02-15

    Purpose: A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D_v exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Methods: Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder, and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals ω (e.g., ω = 1°, 2°, 5°, 10°, 20°). Isotropic samples were uniformly distributed on the unit sphere, resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs and accordingly characterized by the effective angular increment ω_eff. In each direction, the DM was calculated by moving the structure in radial steps of size δ (= 0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy ΔQ was quantified as a function of the sampling parameters ω or ω_eff and δ. Results: The accuracy of coverage estimates depends on the angular and radial DMD sampling parameters ω or ω_eff and δ, as well as on the employed sampling technique. Target |ΔQ| < 1% and OAR |ΔQ| < 3% can be achieved with sampling parameters ω or ω_eff = 20°, δ = 1 mm. Better accuracy (target |ΔQ| < 0.5% and OAR |ΔQ| < ~1%) can be achieved with ω or ω_eff = 10°, δ = 0.5 mm. As the number of sampling points decreases, the isotropic sampling method maintains better accuracy than fixed angular sampling. Conclusions: Coverage estimates for postplanning evaluation are essential since coverage values of targets and OARs often differ from the values implied by the static margin-based plans. Finer sampling of the DMD enables more accurate assessment of the effect of geometric uncertainties on coverage estimates prior to treatment. DMD sampling with ω or ω_eff = 10° and δ = 0.5 mm should be adequate for planning purposes.
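
    The DMD sampling loop itself is simple enough to sketch for an idealized case. Below, the prescription isodose is an analytic ellipsoid, directions are sampled isotropically from normalized Gaussian vectors, the DM is found by radial stepping with step size δ, and coverage Q is estimated for Gaussian setup errors; every geometric and statistical value is an illustrative assumption, not the paper's planning-system computation.

      import numpy as np

      rng = np.random.default_rng(3)

      # Idealized dose model (assumption): the prescription isodose is an ellipsoid.
      # The structure reference point is "covered" if it lies inside; the dosimetric
      # margin (DM) in a direction is the distance from that point to the surface.
      semi_axes = np.array([12.0, 10.0, 8.0])       # mm, illustrative

      def dm(direction, delta=0.5, max_mm=30.0):
          """Radial stepping with step size delta until the isodose is crossed."""
          r = 0.0
          while r < max_mm:
              p = (r + delta) * direction
              if np.sum((p / semi_axes) ** 2) > 1.0:   # crossed the isodose surface
                  return r
              r += delta
          return max_mm

      # Isotropic sampling: uniform directions from normalized Gaussian vectors;
      # ~400 directions corresponds to an effective angular increment near 10 deg.
      n_dir = 400
      dirs = rng.standard_normal((n_dir, 3))
      dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
      dmd = np.array([dm(d) for d in dirs])         # the dosimetric margin distribution

      # Coverage Q under Gaussian setup errors: a shift is covered if its length
      # stays below the DM of the nearest sampled direction.
      shifts = rng.standard_normal((20000, 3)) * 3.0       # sigma = 3 mm, illustrative
      sdir = shifts / np.linalg.norm(shifts, axis=1, keepdims=True)
      nearest = np.argmax(sdir @ dirs.T, axis=1)           # max dot product = nearest
      covered = np.linalg.norm(shifts, axis=1) <= dmd[nearest]
      print(f"coverage Q ~ {covered.mean():.3f}")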

  6. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters.

    PubMed

    Xu, Huijun; Gordon, J James; Siebers, Jeffrey V

    2011-02-01

    A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D_v exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder, and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals ω (e.g., ω = 1°, 2°, 5°, 10°, 20°). Isotropic samples were uniformly distributed on the unit sphere, resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs and accordingly characterized by the effective angular increment ω_eff. In each direction, the DM was calculated by moving the structure in radial steps of size δ (= 0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy ΔQ was quantified as a function of the sampling parameters ω or ω_eff and δ. The accuracy of coverage estimates depends on the angular and radial DMD sampling parameters ω or ω_eff and δ, as well as on the employed sampling technique. Target |ΔQ| < 1% and OAR |ΔQ| < 3% can be achieved with sampling parameters ω or ω_eff = 20°, δ = 1 mm. Better accuracy (target |ΔQ| < 0.5% and OAR |ΔQ| < ~1%) can be achieved with ω or ω_eff = 10°, δ = 0.5 mm. As the number of sampling points decreases, the isotropic sampling method maintains better accuracy than fixed angular sampling. Coverage estimates for post-planning evaluation are essential since coverage values of targets and OARs often differ from the values implied by the static margin-based plans. Finer sampling of the DMD enables more accurate assessment of the effect of geometric uncertainties on coverage estimates prior to treatment. DMD sampling with ω or ω_eff = 10° and δ = 0.5 mm should be adequate for planning purposes.

  7. Product plots.

    PubMed

    Wickham, Hadley; Hofmann, Heike

    2011-12-01

    We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
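
    The underlying computation is just the product decomposition P(x, y) = P(x) P(y | x) turned into geometry: widths come from the marginal, heights from the conditional, so each rectangle's area equals the joint probability of its cell. The sketch below lays out the rectangles of a two-variable mosaic-style product plot from a made-up contingency table.

      import numpy as np

      # Contingency table of counts: rows = categories of X, columns = categories of Y.
      counts = np.array([[20, 30],
                         [10, 40]], dtype=float)
      joint = counts / counts.sum()

      # Product decomposition: P(x, y) = P(x) * P(y | x).
      p_x = joint.sum(axis=1)                   # marginal of X -> column widths
      p_y_given_x = joint / p_x[:, None]        # conditional of Y | X -> heights

      # Rectangle for cell (i, j): width p_x[i], height p_y_given_x[i, j],
      # so its area is exactly the joint probability P(x=i, y=j).
      x0 = np.concatenate(([0.0], np.cumsum(p_x)[:-1]))
      for i in range(joint.shape[0]):
          y0 = 0.0
          for j in range(joint.shape[1]):
              w, h = p_x[i], p_y_given_x[i, j]
              print(f"cell ({i},{j}): x={x0[i]:.2f} y={y0:.2f} w={w:.2f} h={h:.2f} "
                    f"area={w * h:.2f} joint={joint[i, j]:.2f}")
              y0 += h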

  8. Optimal moment determination in POME-copula based hydrometeorological dependence modelling

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi

    2017-07-01

    Copulas have been commonly applied in multivariate modelling in various fields where marginal distribution inference is a key element. To develop a flexible, unbiased mathematical inference framework for hydrometeorological multivariate applications, the principle of maximum entropy (POME) is being increasingly coupled with copulas. However, previous POME-based studies have generally not considered the determination of optimal moment constraints. The main contribution of this study is the determination of optimal moments for POME, leading to a coupled optimal moment-POME-copula framework for modelling hydrometeorological multivariate events. In this framework, margins (marginal distributions) are derived with the use of POME, subject to optimal moment constraints. Various candidate copulas are then constructed according to the derived margins, and finally the most probable one is determined based on goodness-of-fit statistics. This optimal moment-POME-copula framework is applied to model the dependence patterns of three types of hydrometeorological events: (i) single-site streamflow-water level; (ii) multi-site streamflow; and (iii) multi-site precipitation, with data collected from Yichang and Hankou in the Yangtze River basin, China. Results indicate that the optimal-moment POME is more accurate in margin fitting and that the corresponding copulas show good statistical performance in correlation simulation. The derived copulas also capture patterns that traditional correlation coefficients cannot reflect, providing an efficient tool for other applied scenarios in hydrometeorological multivariate modelling.
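
    The POME step amounts to fitting an exponential-family density exp(-λ1 x - λ2 x² - ...) whose moments match the sample's, with the number of moment constraints being exactly what this paper proposes to optimize. The grid-based sketch below solves the fixed two-moment case with scipy for an illustrative sample; it is a minimal sketch of POME margin fitting, not the authors' optimal-moment selection procedure.

      import numpy as np
      from scipy.optimize import fsolve

      rng = np.random.default_rng(5)
      data = rng.gamma(3.0, 2.0, 2000)              # illustrative "streamflow" sample
      m1, m2 = data.mean(), (data ** 2).mean()      # moment constraints E[X], E[X^2]

      xs = np.linspace(0.0, data.max() * 1.5, 4000) # finite support grid
      dx = xs[1] - xs[0]

      def moments(lams):
          """First two moments of p(x) ~ exp(-l1*x - l2*x^2) on the grid."""
          expo = -lams[0] * xs - lams[1] * xs ** 2
          w = np.exp(expo - expo.max())             # shift exponent for stability
          z = w.sum() * dx
          return (xs * w).sum() * dx / z, (xs ** 2 * w).sum() * dx / z

      # Solve for the Lagrange multipliers that reproduce the sample moments.
      lam = fsolve(lambda l: np.subtract(moments(l), (m1, m2)), x0=[0.1, 0.01])

      expo = -lam[0] * xs - lam[1] * xs ** 2
      w = np.exp(expo - expo.max())
      pome_pdf = w / (w.sum() * dx)                 # maximum-entropy margin given m1, m2
      print("Lagrange multipliers:", lam, " fitted moments:", moments(lam))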

  9. Objectified quantification of uncertainties in Bayesian atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Berchet, A.; Pison, I.; Chevallier, F.; Bousquet, P.; Bonne, J.-L.; Paris, J.-D.

    2015-05-01

    Classical Bayesian atmospheric inversions process atmospheric observations and prior emissions, the two being connected by an observation operator representing mainly the atmospheric transport. These inversions rely on prescribed errors in the observations, the prior emissions, and the observation operator. When data are sparse, inversion results are very sensitive to the prescribed error distributions, which are not accurately known, and the classical Bayesian framework has difficulty quantifying the impact of mis-specified error distributions on the optimized fluxes. In order to cope with this issue, we build on recent research results to enhance the classical Bayesian inversion framework through a marginalization over a large set of plausible errors that can be prescribed in the system. The marginalization consists in computing inversions for all possible error distributions, weighted by the probability of occurrence of each error distribution. The posterior distribution of the fluxes calculated by the marginalization is not explicitly describable; as a consequence, we carry out Monte Carlo sampling based on an approximation of the probability of occurrence of the error distributions, deduced from the well-tested method of maximum likelihood estimation. Thus, the marginalized inversion relies on an automatic, objectified diagnosis of the error statistics, without prior knowledge about the error matrices. It robustly accounts for the uncertainties in the error distributions, contrary to what is classically done with frozen expert-knowledge error statistics. Some expert knowledge is still used in the method for the choice of an emission aggregation pattern and of a sampling protocol in order to reduce the computational cost. The relevance and the robustness of the method are tested on a case study: the inversion of methane surface fluxes at the mesoscale with virtual observations on a realistic network in Eurasia. Observing system simulation experiments are carried out with different transport patterns, flux distributions, and total prior amounts of emitted methane. The method proves to consistently reproduce the known "truth" in most cases, with satisfactory tolerance intervals. Additionally, the method explicitly provides influence scores and posterior correlation matrices, so that an in-depth interpretation of the inversion results is possible. The more objective quantification of the influence of the observations on the fluxes proposed here allows us to evaluate the impact of the observation network on the characterization of the surface fluxes. The explicit correlations between emission aggregates reveal the mis-separated regions, and hence the typical temporal and spatial scales that the inversion can analyse; these scales are consistent with the chosen aggregation patterns.
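
    On a toy linear Gaussian inversion the proposed marginalization reduces to a weighted mixture of conditional posteriors: draw candidate error statistics, weight each by how well it explains the observations, and mix. In the sketch below the weights are exact marginal likelihoods, whereas the paper approximates them via maximum likelihood estimation; the scalar flux, observation operator, and hyperparameter grid are all illustrative assumptions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)

      # Toy linear inversion: scalar flux x, m observations y = h*x + noise.
      x_true, h, m = 3.0, 1.0, 10
      r_true = 0.5                               # true obs-error variance (unknown to us)
      y = h * x_true + rng.normal(0.0, np.sqrt(r_true), m)
      xb, b = 0.0, 10.0                          # Gaussian prior N(xb, b)

      def posterior(r):
          """Conjugate posterior mean/variance of x for a given obs-error variance r."""
          prec = 1.0 / b + m * h ** 2 / r
          mean = (xb / b + h * y.sum() / r) / prec
          return mean, 1.0 / prec

      # Marginalize over plausible error variances, weighting each candidate by its
      # marginal likelihood p(y | r) (computed exactly here; the paper approximates
      # the weights via maximum likelihood estimation).
      rs = np.linspace(0.05, 5.0, 200)
      logw = np.array([
          stats.multivariate_normal.logpdf(y, mean=np.full(m, h * xb),
                                           cov=h ** 2 * b * np.ones((m, m)) + r * np.eye(m))
          for r in rs
      ])
      w = np.exp(logw - logw.max())
      w /= w.sum()

      means, vars_ = np.array([posterior(r) for r in rs]).T
      mix_mean = np.sum(w * means)
      mix_var = np.sum(w * (vars_ + means ** 2)) - mix_mean ** 2
      print(f"marginalized flux estimate: {mix_mean:.2f} +/- {np.sqrt(mix_var):.2f}")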

  10. Learning multivariate distributions by competitive assembly of marginals.

    PubMed

    Sánchez-Vega, Francisco; Younes, Laurent; Geman, Donald

    2013-02-01

    We present a new framework for learning high-dimensional multivariate probability distributions from estimated marginals. The approach is motivated by compositional models and Bayesian networks, and designed to adapt to small sample sizes. We start with a large, overlapping set of elementary statistical building blocks, or "primitives," which are low-dimensional marginal distributions learned from data. Each variable may appear in many primitives. Subsets of primitives are combined in a Lego-like fashion to construct a probabilistic graphical model; only a small fraction of the primitives will participate in any valid construction. Since primitives can be precomputed, parameter estimation and structure search are separated. Model complexity is controlled by strong biases; we adapt the primitives to the amount of training data and impose rules which restrict the merging of them into allowable compositions. The likelihood of the data decomposes into a sum of local gains, one for each primitive in the final structure. We focus on a specific subclass of networks which are binary forests. Structure optimization corresponds to an integer linear program and the maximizing composition can be computed for reasonably large numbers of variables. Performance is evaluated using both synthetic data and real datasets from natural language processing and computational biology.

  11. Holocene re-colonisation, central-marginal distribution and habitat specialisation shape population genetic patterns within an Atlantic European grass species.

    PubMed

    Harter, D E V; Jentsch, A; Durka, W

    2015-05-01

    Corynephorus canescens (L.) P.Beauv. is an outbreeding, short-lived, wind-dispersed grass species, highly specialised to scattered, disturbance-dependent habitats of open sandy sites. Its distribution ranges from the Iberian Peninsula over Atlantic regions of Western and Central Europe, but excludes the two other classical European glacial refuge regions on the Apennine and Balkan Peninsulas. To investigate genetic patterns of this uncommon combination of ecological and biogeographic species characteristics, we analysed AFLP variation among 49 populations throughout the European distribution range, expecting (i) patterns of SW European glacial refugia and post-glacial expansion to the NE; (ii) decreasing genetic diversity from central to marginal populations; and (iii) interacting effects of high gene flow and disturbance-driven genetic drift. Decreasing genetic diversity from SW to NE and distinct gene pool clustering imply refugia on the Iberian Peninsula and in western France, from where range expansion originated towards the NE. High genetic diversity within and moderate genetic differentiation among populations, and a significant pattern of isolation-by-distance, indicate a gene flow-drift equilibrium within C. canescens, probably due to its restriction to scattered and dynamic habitats and limited dispersal distances. These features, as well as the re-colonisation history, were found to affect genetic diversity gradients from central to marginal populations. Our study emphasises the need to include the specific ecology of a species in analyses of (re-)colonisation histories and range centre-margin analyses. To account for discontinuous distributions, new indices of marginality were tested for their suitability in studies of centre-periphery gradients. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.

  12. Effect of Patient Set-up and Respiration motion on Defining Biological Targets for Image-Guided Targeted Radiotherapy

    NASA Astrophysics Data System (ADS)

    McCall, Keisha C.

    Identification and monitoring of sub-tumor targets will be a critical step for the optimal design and evaluation of cancer therapies in general and of biologically targeted radiotherapy (dose-painting) in particular. Quantitative PET imaging may be an important tool for these applications. Currently, radiotherapy planning accounts for tumor motion by applying geometric margins. These margins create a motion envelope that encompasses the most probable positions of the tumor while maintaining the appropriate tumor control and normal tissue complication probabilities. This motion envelope is effective for uniform dose prescriptions, where the therapeutic dose is conformed to the external margins of the tumor. However, much research is needed to establish the equivalent margins for non-uniform fields, where multiple biological targets are present and each target is prescribed its own dose level. Additionally, the size of the biological targets and their close proximity make it impractical to apply planning margins at the sub-tumor level, and the extent of high-dose regions must be limited to avoid excessive dose to the surrounding tissue. This research project is therefore an investigation of the uncertainty within quantitative PET images of moving and displaced dose-painting targets, and of the residual errors that remain after motion management. This included characterization of the changes in PET voxel values as objects are moved relative to the discrete sampling interval of PET imaging systems (SPECIFIC AIM 1). Additionally, the repeatability of PET distributions and of delineated dose-painting targets was measured (SPECIFIC AIM 2). The effect of imaging uncertainty on the dose distributions designed using these images was also investigated (SPECIFIC AIM 3). Finally, the project included analysis of methods to minimize motion during PET imaging and to reduce the dosimetric impact of motion- and position-induced imaging uncertainty (SPECIFIC AIM 4).

  13. Tumor control probability reduction in gated radiotherapy of non-small cell lung cancers: a feasibility study.

    PubMed

    Siochi, R Alfredo; Kim, Yusung; Bhatia, Sudershan

    2014-10-16

    We studied the feasibility of evaluating tumor control probability (TCP) reductions for tumor motion beyond planned gated radiotherapy margins. Tumor motion was determined from cone-beam CT projections acquired for patient setup, intrafraction respiratory traces, and 4D CTs for five non-small cell lung cancer (NSCLC) patients treated with gated radiotherapy. Tumors were subdivided into 1 mm sections whose positions and doses were determined for each beam-on time point. (The dose calculation model was verified with motion phantom measurements.) The calculated dose distributions were used to generate the treatment TCPs for each patient, and the plan TCPs were calculated from the treatment planning dose distributions. The treatment TCPs were compared to the plan TCPs for various models and parameters. Calculated doses matched phantom measurements within 0.3% for up to 3 cm of motion. TCP reductions for excess motion greater than 5 mm ranged from 1.7% to 11.9%, depending on model parameters, and were as high as 48.6% for model parameters that simulated an individual patient. Repeating the worst-case motion for all fractions increased TCP reductions by a factor of 2 to 3, while hypofractionation decreased these reductions by as much as a factor of 3. Treatment motion exceeding gating margins by more than 5 mm can lead to considerable TCP reductions. Appropriate margins for excess motion are recommended, unless daily tumor motion verification is applied and the gating window adjusted.
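
    TCP evaluations of this kind typically rest on a voxel-based Poisson model with linear-quadratic cell survival, and the sensitivity to underdosed voxels is easy to demonstrate with one. The sketch below compares a planned uniform dose with a distribution in which a few voxels drift outside the gating margin; the radiobiological parameters and geometry are generic placeholder assumptions, not the patient-fitted values used in the study.

      import numpy as np

      def tcp_poisson(dose_per_fx, n_fx, vol_cc, alpha=0.3, beta=0.03, rho=1e7):
          """Voxel-based Poisson TCP with linear-quadratic survival.

          dose_per_fx : per-fraction dose in each tumor voxel (Gy)
          vol_cc      : volume of each voxel (cc); rho = clonogen density per cc
          """
          sf = np.exp(-n_fx * (alpha * dose_per_fx + beta * dose_per_fx ** 2))
          surviving = rho * vol_cc * sf           # expected surviving clonogens per voxel
          return np.exp(-surviving.sum())         # Poisson probability of zero survivors

      n_voxels, n_fx = 1000, 35
      vol = np.full(n_voxels, 0.05)               # 0.05 cc voxels, illustrative

      planned = np.full(n_voxels, 2.0)            # uniform 2 Gy per fraction
      print("planned TCP :", tcp_poisson(planned, n_fx, vol))

      # Excess motion: 5% of the voxels slide outside the gating margin and receive
      # 1.6 Gy per fraction instead of 2.0 (illustrative underdosage).
      degraded = planned.copy()
      degraded[:50] = 1.6
      print("degraded TCP:", tcp_poisson(degraded, n_fx, vol))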

  14. Survival curve estimation with dependent left truncated data using Cox's model.

    PubMed

    Mackenzie, Todd

    2012-10-19

    The Kaplan-Meier and closely related Lynden-Bell estimators are used to provide nonparametric estimation of the distribution of a left-truncated random variable. These estimators assume that the left-truncation variable is independent of the time-to-event. This paper proposes a semiparametric method for estimating the marginal distribution of the time-to-event that does not require independence. It models the conditional distribution of the time-to-event given the truncation variable using Cox's model for left-truncated data, and uses inverse probability weighting. We report the results of simulations and illustrate the method using a survival study.

  15. Pollinator communities in strawberry crops - variation at multiple spatial scales.

    PubMed

    Ahrenfeldt, E J; Klatt, B K; Arildsen, J; Trandem, N; Andersson, G K S; Tscharntke, T; Smith, H G; Sigsgaard, L

    2015-08-01

    Predicting the potential pollination services of wild bees in crops requires knowledge of their spatial distribution within fields. Field margins can serve as nesting and foraging habitats for wild bees and can be a source of pollinators. Regional differences in pollinator community composition may affect this spill-over of bees. We studied how regional and local differences affect the spatial distribution of wild bee species richness, activity-density, and body size in crop fields. We sampled bees both in the field centre and at two different types of semi-natural field margins, grass strips and hedges, in 12 strawberry fields. The fields were distributed over four regions in Northern Europe, representing an almost 1100 km long north-south gradient. Even over this gradient, daytime temperatures during sampling did not differ significantly between regions and therefore probably did not affect bee activity. Bee species richness was higher in field margins than in field centres, independent of field size; however, there was no difference between centre and margin in body size or activity-density. In contrast, bee activity-density increased towards the southern regions, whereas mean body size increased towards the north. In conclusion, our study revealed a general pattern across European regions of bee diversity, but not activity-density, declining towards the field interior, which suggests that the benefits of functional diversity of pollinators may be difficult to achieve through spill-over effects from margins to the crop. We also identified dissimilar regional patterns in bee diversity and activity-density, which should be taken into account in conservation management.

  16. Probabilities of good, marginal, and poor flying conditions for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Whiting, D. M.; Guttman, N. B.

    1977-01-01

    Empirical probabilities are provided for good, marginal, and poor flying weather for ferrying the Space Shuttle Orbiter from Edwards AFB, California, to Kennedy Space Center, Florida, and from Edwards AFB to Marshall Space Flight Center, Alabama. Results are given by month for each overall route plus segments of each route. The criteria for defining a day as good, marginal, or poor and the method of computing the relative frequencies and conditional probabilities for monthly reference periods are described.

  17. Analytical approach to an integrate-and-fire model with spike-triggered adaptation

    NASA Astrophysics Data System (ADS)

    Schwalger, Tilo; Lindner, Benjamin

    2015-12-01

    The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
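
    The stationary marginal density that the paper derives analytically can be checked against a direct simulation of the model: an Euler-Maruyama integration of a noisy leaky integrate-and-fire neuron whose adaptation variable is incremented at each spike. All parameter values in the sketch below are illustrative.

      import numpy as np

      rng = np.random.default_rng(8)

      # Illustrative parameters: LIF with spike-triggered adaptation current a.
      dt, T = 0.01, 2000.0
      mu, D = 1.5, 0.05          # suprathreshold drive (tonic firing) and noise intensity
      tau_a, jump = 10.0, 0.3    # adaptation time scale and per-spike increment
      v_th, v_reset = 1.0, 0.0

      n = int(T / dt)
      v, a = 0.0, 0.0
      vs = np.empty(n)
      for i in range(n):
          # dv = (mu - v - a) dt + sqrt(2 D) dW ;  da = -(a / tau_a) dt
          v += (mu - v - a) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
          a += -(a / tau_a) * dt
          if v >= v_th:              # fire-and-reset rule with spike-triggered adaptation
              v = v_reset
              a += jump
          vs[i] = v

      # Estimate the marginal probability density of the membrane potential.
      hist, edges = np.histogram(vs, bins=80, range=(-0.5, 1.0), density=True)
      peak = edges[np.argmax(hist)]
      print(f"marginal P(V) peaks near V = {peak:.2f}")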

  18. Exploring the Subtleties of Inverse Probability Weighting and Marginal Structural Models.

    PubMed

    Breskin, Alexander; Cole, Stephen R; Westreich, Daniel

    2018-05-01

    Since being introduced to epidemiology in 2000, marginal structural models have become a commonly used method for causal inference in a wide range of epidemiologic settings. In this brief report, we aim to explore three subtleties of marginal structural models. First, we distinguish marginal structural models from the inverse probability weighting estimator, and we emphasize that marginal structural models are not only for longitudinal exposures. Second, we explore the meaning of the word "marginal" in "marginal structural model." Finally, we show that the specification of a marginal structural model can have important implications for the interpretation of its parameters. Each of these concepts has important implications for the use and understanding of marginal structural models, and providing detailed explanations of them may thus lead to better practices for the field of epidemiology.
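
    The first subtlety, that the marginal structural model is a causal model while inverse probability weighting is merely one estimator of its parameters, can be made concrete with a point-treatment simulation: stabilized weights from a propensity model remove confounding, and a weighted regression then recovers the MSM parameter. The sketch below is a generic illustration (assuming statsmodels is available), not the authors' worked example.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(13)
      n = 50000

      # Confounder L affects both treatment A and outcome Y; true causal effect = 2.
      L = rng.normal(size=n)
      pA = 1.0 / (1.0 + np.exp(-0.8 * L))          # confounded treatment assignment
      A = rng.binomial(1, pA)
      Y = 2.0 * A + 1.5 * L + rng.normal(size=n)

      # The naive contrast is biased by confounding:
      print("naive E[Y|A=1] - E[Y|A=0]:", Y[A == 1].mean() - Y[A == 0].mean())

      # Stabilized inverse probability weights: sw = P(A) / P(A | L).
      ps = sm.Logit(A, sm.add_constant(L)).fit(disp=0).predict()
      p_marg = A.mean()
      sw = np.where(A == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

      # Fitting the MSM E[Y^a] = b0 + b1*a by weighted least squares recovers b1 ~ 2.
      msm = sm.WLS(Y, sm.add_constant(A.astype(float)), weights=sw).fit()
      print("IPW-weighted MSM effect estimate:", msm.params[1])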

  19. Invariance in the recurrence of large returns and the validation of models of price dynamics

    NASA Astrophysics Data System (ADS)

    Chang, Lo-Bin; Geman, Stuart; Hsieh, Fushing; Hwang, Chii-Ruey

    2013-08-01

    Starting from a robust, nonparametric definition of large returns (“excursions”), we study the statistics of their occurrences, focusing on the recurrence process. The empirical waiting-time distribution between excursions is remarkably invariant to year, stock, and scale (return interval). This invariance is related to self-similarity of the marginal distributions of returns, but the excursion waiting-time distribution is a function of the entire return process and not just its univariate probabilities. Generalized autoregressive conditional heteroskedasticity (GARCH) models, market-time transformations based on volume or trades, and generalized (Lévy) random-walk models all fail to fit the statistical structure of excursions.
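
    The excursion construction is straightforward to reproduce: flag returns whose magnitude exceeds a high empirical quantile and collect the waiting times between flagged events. The sketch below applies it to a synthetic series with GARCH-like volatility clustering; on real data one would repeat this across years, stocks, and return intervals to probe the invariance the authors report.

      import numpy as np

      rng = np.random.default_rng(17)

      # Synthetic returns with volatility clustering (GARCH(1,1)-like, illustrative).
      n = 100000
      sig2 = 1.0
      r = np.empty(n)
      for t in range(n):
          r[t] = np.sqrt(sig2) * rng.standard_normal()
          sig2 = 0.05 + 0.1 * r[t] ** 2 + 0.85 * sig2

      # Nonparametric excursion definition: |return| above its 95th percentile.
      thresh = np.quantile(np.abs(r), 0.95)
      times = np.flatnonzero(np.abs(r) > thresh)
      waits = np.diff(times)

      # Empirical waiting-time distribution between excursions.
      q = np.quantile(waits, [0.25, 0.5, 0.75])
      print(f"{len(times)} excursions; waiting-time quartiles: {q}")
      print("mean wait:", waits.mean(), "(would be 1/0.05 = 20 for independent returns)")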

  20. The effect of on-line position correction on the dose distribution in focal radiotherapy for bladder cancer

    PubMed Central

    van Rooijen, Dominique C; van de Kamer, Jeroen B; Pool, René; Hulshof, Maarten CCM; Koning, Caro CE; Bel, Arjan

    2009-01-01

    Background: The purpose of this study was to determine the dosimetric effect of on-line position correction for bladder tumor irradiation and to find methods to predict and handle this effect. Methods: For 25 patients with unifocal bladder cancer, intensity modulated radiotherapy (IMRT) with 5 beams was planned. The requirement for each plan was that 99% of the target volume received 95% of the prescribed dose. Tumor displacements from -2.0 cm to 2.0 cm in each dimension were simulated, using 0.5 cm increments, resulting in 729 simulations per patient. We assumed that on-line correction for the tumor was applied perfectly. We determined the correlation between the change in D99% and the change in path length, which is defined here as the distance from the skin to the isocenter for each beam. In addition, the margin needed to avoid underdosage was determined, and the probability that an underdosage occurs in a real treatment was calculated. Results: Adjustments for tumor displacement with perfect on-line position correction resulted in an altered dose distribution. The altered fraction dose to the target varied from 91.9% to 100.4% of the prescribed dose. The mean D99% (± SD) was 95.8% ± 1.0%. There was a modest linear correlation between the difference in D99% and the change in path length of the beams after correction (R² = 0.590). The median probability that a systematic underdosage occurs in a real treatment was 0.23% (range: 0 - 24.5%). A margin of 2 mm reduced that probability to < 0.001% in all patients. Conclusion: On-line position correction does result in altered target coverage, due to changes in average path length after position correction. An extra margin can be added to prevent underdosage. PMID:19775479

  1. Mercury profiles in sediment from the marginal high of Arabian Sea: an indicator of increasing anthropogenic Hg input.

    PubMed

    Chakraborty, Parthasarathi; Vudamala, Krushna; Chennuri, Kartheek; Armoury, Kazip; Linsy, P; Ramteke, Darwin; Sebastian, Tyson; Jayachandran, Saranya; Naik, Chandan; Naik, Richita; Nath, B Nagender

    2016-05-01

    Total Hg distributions and speciation were determined in two sediment cores collected from the western continental marginal high of India. The total Hg content in the sediment was found to increase gradually (approximately twofold) towards the surface in both cores. Hg was found to be preferentially bound to sulfide under anoxic conditions; however, redox-mediated reactions in the upper part of the cores influenced the total Hg content. This study suggests that a probable increase in authigenic and allogenic Hg deposition contributed to the increasing Hg concentration in the surface sediment of the study area.

  2. An efficient algorithm to compute marginal posterior genotype probabilities for every member of a pedigree with loops

    PubMed Central

    2009-01-01

    Background: Marginal posterior genotype probabilities need to be computed for genetic analyses such as genetic counseling in humans and selective breeding in animal and plant species. Methods: In this paper, we describe a peeling-based, deterministic, exact algorithm to compute genotype probabilities efficiently for every member of a pedigree with loops, without recourse to junction-tree methods from graph theory. The efficiency in computing the likelihood by peeling comes from storing intermediate results in multidimensional tables called cutsets. Computing marginal genotype probabilities for individual i requires recomputing the likelihood for each of the possible genotypes of individual i. This can be done efficiently by storing intermediate results in two types of cutsets, called anterior and posterior cutsets, and reusing these intermediate results to compute the likelihood. Examples: A small example is used to illustrate the theoretical concepts discussed in this paper, and marginal genotype probabilities are computed at a monogenic disease locus for every member of a real cattle pedigree. PMID:19958551
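
    For a single biallelic locus and a tiny pedigree, the marginal posterior genotype probabilities can be obtained by brute-force enumeration of the joint distribution, which is exactly what peeling computes efficiently by caching anterior and posterior cutsets. The father-mother-child sketch below uses an illustrative allele frequency and penetrance vector.

      import numpy as np
      from itertools import product

      # Genotypes coded 0, 1, 2 = copies of the disease allele; frequency q (assumed).
      q = 0.1
      founder = np.array([(1 - q) ** 2, 2 * q * (1 - q), q ** 2])

      def transmission(gf, gm, gc):
          """P(child genotype gc | father gf, mother gm) for a biallelic locus."""
          def gamete(g):                      # P(transmitting the disease allele)
              return {0: 0.0, 1: 0.5, 2: 1.0}[g]
          pf, pm = gamete(gf), gamete(gm)
          return [(1 - pf) * (1 - pm),
                  pf * (1 - pm) + (1 - pf) * pm,
                  pf * pm][gc]

      # Recessive disease with partial penetrance (illustrative); child is affected.
      penetrance = np.array([0.01, 0.01, 0.8])     # P(affected | genotype)

      # Joint probability over all genotype configurations, then marginalize.
      post = np.zeros((3, 3, 3))
      for gf, gm, gc in product(range(3), repeat=3):
          post[gf, gm, gc] = (founder[gf] * founder[gm]
                              * transmission(gf, gm, gc) * penetrance[gc])
      post /= post.sum()

      print("father marginal:", post.sum(axis=(1, 2)))
      print("mother marginal:", post.sum(axis=(0, 2)))
      print("child  marginal:", post.sum(axis=(0, 1)))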

  3. Modeling the Dependency Structure of Integrated Intensity Processes

    PubMed Central

    Ma, Yong-Ki

    2015-01-01

    This paper studies the important issue of dependence structure. To model this structure, the intensities within the Cox processes are driven by dependent shot noise processes, in which jumps occur simultaneously and their sizes are correlated. The joint survival probability of the integrated intensities is obtained explicitly from a copula with exponential marginal distributions. This result can provide a very useful guide for credit risk management. PMID:26270638
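
    The paper's central object, a joint survival probability built from a copula with exponential marginals, can be written down directly via the survival-copula identity P(X > x, Y > y) = 1 - F_X(x) - F_Y(y) + C(F_X(x), F_Y(y)). The sketch below uses a Clayton copula and illustrative parameters as the assumed dependence model, not the shot-noise construction of the paper.

      import numpy as np

      def clayton(u, v, theta):
          """Clayton copula C(u, v)."""
          return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

      def joint_survival(x, y, lam1, lam2, theta):
          """P(X > x, Y > y) for exponential marginals coupled by a Clayton copula."""
          u = 1.0 - np.exp(-lam1 * x)          # F_X(x)
          v = 1.0 - np.exp(-lam2 * y)          # F_Y(y)
          return 1.0 - u - v + clayton(u, v, theta)

      lam1, lam2, theta = 0.5, 1.0, 2.0        # illustrative intensities and dependence
      print("joint   :", joint_survival(1.0, 1.0, lam1, lam2, theta))
      print("product :", np.exp(-lam1 * 1.0) * np.exp(-lam2 * 1.0))  # independence case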

  4. Seabed fluid expulsion along the upper slope and outer shelf of the U.S. Atlantic continental margin

    USGS Publications Warehouse

    Brothers, D.S.; Ruppel, C.; Kluesner, J.W.; ten Brink, Uri S.; Chaytor, J.D.; Hill, J.C.; Andrews, B.D.; Flores, C.

    2014-01-01

    Identifying the spatial distribution of seabed fluid expulsion features is crucial for understanding the substrate plumbing system of any continental margin. A 1100 km stretch of the U.S. Atlantic margin contains more than 5000 pockmarks at water depths of 120 m (shelf edge) to 700 m (upper slope), mostly updip of the contemporary gas hydrate stability zone (GHSZ). Advanced attribute analyses of high-resolution multichannel seismic reflection data reveal gas-charged sediment and probable fluid chimneys beneath pockmark fields. A series of enhanced reflectors, inferred to represent hydrate-bearing sediments, occur within the GHSZ. Differential sediment loading at the shelf edge and warming-induced gas hydrate dissociation along the upper slope are the proposed mechanisms that led to transient changes in substrate pore fluid overpressure, vertical fluid/gas migration, and pockmark formation.

  5. Effect of posterior crown margin placement on gingival health.

    PubMed

    Reitemeier, Bernd; Hänsel, Kristina; Walter, Michael H; Kastner, Christian; Toutenburg, Helge

    2002-02-01

    The clinical impact of posterior crown margin placement on gingival health has not been thoroughly quantified. This study evaluated the effect of posterior crown margin placement with multivariate analysis. Ten general dentists reviewed 240 patients with 480 metal-ceramic crowns in a prospective clinical trial. The alloy was randomly selected from 2 high gold, 1 low gold, and 1 palladium alloy. Variables were the alloy used, oral hygiene index score before treatment, location of crown margins at baseline, and plaque index and sulcus bleeding index scores recorded for restored and control teeth after 1 year. The effect of crown margin placement on sulcular bleeding and plaque accumulation was analyzed with regression models (P<.05). The probability of plaque at 1 year increased with increasing oral hygiene index score before treatment. The lingual surfaces demonstrated the highest probability of plaque. The risk of bleeding at intrasulcular posterior crown margins was approximately twice that at supragingival margins. Poor oral hygiene before treatment and plaque also were associated with sulcular bleeding. Facial sites exhibited a lower probability of sulcular bleeding than lingual surfaces. Type of alloy did not influence sulcular bleeding. In this study, placement of crown margins was one of several parameters that affected gingival health.

  6. Post-glacial redistribution and shifts in productivity of giant kelp forests

    PubMed Central

    Graham, Michael H.; Kinlan, Brian P.; Grosberg, Richard K.

    2010-01-01

    Quaternary glacial–interglacial cycles create lasting biogeographic, demographic and genetic effects on ecosystems, yet the ecological effects of ice ages on benthic marine communities are unknown. We analysed long-term datasets to develop a niche-based model of southern Californian giant kelp (Macrocystis pyrifera) forest distribution as a function of oceanography and geomorphology, and synthesized palaeo-oceanographic records to show that late Quaternary climate change probably drove high millennial variability in the distribution and productivity of this foundation species. Our predictions suggest that kelp forest biomass increased up to threefold from the glacial maximum to the mid-Holocene, then rapidly declined by 40–70 per cent to present levels. The peak in kelp forest productivity would have coincided with the earliest coastal archaeological sites in the New World. Similar late Quaternary changes in kelp forest distribution and productivity probably occurred in coastal upwelling systems along active continental margins worldwide, which would have resulted in complex shifts in the relative productivity of terrestrial and marine components of coastal ecosystems. PMID:19846450

  7. Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district

    NASA Astrophysics Data System (ADS)

    Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang

    2017-09-01

    Rainfall and reference crop evapotranspiration are random but mutually dependent variables in an irrigation district, and their encounter situation determines water shortage risks in the context of natural water supply and demand. In reality, however, rainfall and reference crop evapotranspiration may have different marginal distributions, and their relationship is nonlinear. In this study, based on annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, so a water shortage risk associated with meteorological drought (i.e., rainfall variability) is more likely to occur. Compared with other states, conditional joint probabilities are higher, and conditional return periods lower, under either low rainfall or high reference crop evapotranspiration. For a specified high reference crop evapotranspiration of a given frequency, the encounter risk of low rainfall and high reference crop evapotranspiration increases as the frequency decreases; for a specified low rainfall of a given frequency, this encounter risk decreases as the frequency decreases. When either high reference crop evapotranspiration exceeds a certain frequency or low rainfall does not exceed a certain frequency, the higher conditional joint probabilities and lower conditional return periods of the various combinations are likely to cause a water shortage, although not a severe one.
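
    A minimal version of the encounter-risk calculation: couple the two margins with a Frank copula and evaluate rectangle probabilities for events defined by marginal quantiles, such as low rainfall combined with high reference crop evapotranspiration. The dependence parameter and thresholds below are illustrative assumptions, not values fitted to the Luhun data.

      import numpy as np

      def frank(u, v, theta):
          """Frank copula C(u, v)."""
          num = np.expm1(-theta * u) * np.expm1(-theta * v)
          return -np.log1p(num / np.expm1(-theta)) / theta

      theta = -3.0      # negative dependence: dry years tend to have high ET (assumed)
      a = 0.375         # "low rainfall": nonexceedance probability of the dry threshold
      b = 0.625         # "high ET": exceedance threshold at the 62.5% quantile

      # Rectangle probabilities from the copula:
      p_dry_highET = a - frank(a, b, theta)           # P(R <= r_a, ET > e_b)
      p_highET_given_dry = p_dry_highET / a           # conditional probability
      print(f"P(low R and high ET) = {p_dry_highET:.3f}")
      print(f"P(high ET | low R)   = {p_highET_given_dry:.3f}")
      print(f"joint return period  = {1.0 / p_dry_highET:.1f} years (annual data)")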

  8. Statistics of Stokes variables for correlated Gaussian fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eliyahu, D.

    1994-09-01

    The joint and marginal probability distribution functions of the Stokes variables are derived for correlated Gaussian fields [an extension of D. Eliyahu, Phys. Rev. E 47, 2881 (1993)]. The statistics depend only on the first-moment (averaged) Stokes variables and have a universal form for S1, S2, and S3. The statistics of the variables describing the Cartesian coordinates of the Poincaré sphere are also given.

  9. Prediction future asset price which is non-concordant with the historical distribution

    NASA Astrophysics Data System (ADS)

    Seong, Ng Yew; Hin, Pooi Ah

    2015-12-01

    This paper attempts to predict the major characteristics of a future asset price that is non-concordant with the distribution estimated from today's price and the prices on a large number of previous days. The three major characteristics of the i-th non-concordant asset price are: the length of the interval between the occurrence times of the previous and the present non-concordant prices; an indicator taking the values -1 and 1 according to whether the non-concordant price is extremely small or extremely large; and the degree of non-concordance, given by the negative logarithm of the probability of the left or right tail whose end point is the observed future price. The vector of the three major characteristics of the next non-concordant price is modelled as dependent on the vectors of the present and l - 1 previous non-concordant prices via a 3-dimensional conditional distribution derived from a 3(l + 1)-dimensional power-normal mixture distribution. The marginal distribution of each of the three characteristics can then be derived from this conditional distribution. The mean of the j-th marginal distribution is an estimate of the j-th characteristic of the next non-concordant price, while the 100(α/2)% and 100(1 - α/2)% points of the j-th marginal distribution form a prediction interval for that characteristic. The performance measures of these estimates and prediction intervals indicate that the fitted conditional distribution is satisfactory. Thus, incorporating the distribution of the characteristics of the next non-concordant price into the asset price model has good potential to yield a more realistic model.

  10. Bayes classification of terrain cover using normalized polarimetric data

    NASA Technical Reports Server (NTRS)

    Yueh, H. A.; Swartz, A. A.; Kong, J. A.; Shin, R. T.; Novak, L. M.

    1988-01-01

    The normalized polarimetric classifier (NPC), which uses only the relative magnitudes and phases of the polarimetric data, is proposed for the discrimination of terrain elements. The probability density functions (PDFs) of the polarimetric data are assumed to follow a complex Gaussian distribution, and the marginal PDF of the normalized polarimetric data is derived by adopting the Euclidean norm as the normalization function. The general form of the distance measure for the NPC is also obtained. It is demonstrated that for polarimetric data with an arbitrary PDF, the distance measure of the NPC is independent of the normalization function selected, even when the classifier is mistrained. A complex Gaussian distribution is assumed for polarimetric data consisting of grass and tree regions, and the probability of error for the NPC is compared with those of several other single-feature classifiers. The classification error of the NPC is shown to be independent of the normalization function.

  11. Interaction of the Siberian craton and Central Asian Orogenic Belt (CAOB) recorded by detrital zircons from Transbaikalia

    NASA Astrophysics Data System (ADS)

    Powerman, V.; Shatsillo, A.; Chumakov, N.; Kapitonov, I.; Hourigan, J. K.

    2015-12-01

    The goal of this study is to pinpoint the beginning of interaction of two gigantic crustal structures: the Siberian Craton and the Central Asian Orogenic Belt (CAOB). We hypothesize that the beginning of convergence should be recorded in the Neoproterozoic passive margin strata of the Siberian Craton by the first appearance of extra-regional Neoproterozoic zircons. To test this hypothesis, we acquired U-Pb zircon age distributions from twelve Neoproterozoic clastic rocks from the Baikal-Patom margin of Siberia and one sample from the volcaniclastic Padrinsky Group that was deposited atop accreted CAOB crust. Stratigraphically lower strata from the Siberian margin yield Archean - Paleoproterozoic detrital zircon ages, which are similar to, and probably derived from, the Siberian Precambrian craton. A few extra-regional Mesoproterozoic grains are also present. The provenance shift occurs in the upper portion of the section and is marked by a strong influx of extra-regional Neoproterozoic sediments. The youngest grains of 610 Ma constrain the sedimentation age and confine the timing of interaction between the CAOB and Siberia in this region. Neoproterozoic zircons also dominate the overlying sedimentary unit, suggesting continued convergence. The coeval volcaniclastic unit on the CAOB side has a similar U-Pb detrital age distribution, strengthening the provenance link. Analysis of the local tectonics suggests that accretion might have started even before the first appearance of Neoproterozoic zircon: during the development of a regional unconformity, capped by 635 Ma (?) "Snowball Earth" tillites of the Dzhemkukan Fm. The absence of Neoproterozoic zircons in the Dzhemkukan Fm. is probably explained by thin-skinned tectonics that did not result in massive orogenesis. Our data correlate well with data from other Neoproterozoic sedimentary basins of the southern Siberian Craton, including Cisbaikalia and the Bodaibo Synclinorium.

  12. New approach in bivariate drought duration and severity analysis

    NASA Astrophysics Data System (ADS)

    Montaseri, Majid; Amirataee, Babak; Rezaie, Hossein

    2018-04-01

    Copula functions have been widely applied as an advanced technique for creating the joint probability distribution of drought duration and severity. The approach to data collection, as well as the amount and dispersion of the data series, can have a lasting impact on the joint probability distribution created using copulas. Traditional analyses have usually adopted an Unconnected Drought Runs (UDR) approach, in which droughts with different durations are treated as independent of each other. This data collection method omits the actual potential of short-term extreme droughts located within a long-term UDR, and the traditional method often faces significant gaps in the drought data series. However, a long-term UDR can be viewed as a combination of short-term Connected Drought Runs (CDR). This study therefore systematically evaluates the UDR and CDR procedures for investigating the joint probability of drought duration and severity. For this purpose, rainfall data (1971-2013) from 24 rain gauges in the Lake Urmia basin, Iran, were used. Seven common univariate marginal distributions and seven types of bivariate copulas were examined. Compared to the traditional approach, the results demonstrated a significant comparative advantage of the new approach: correct identification of the copula function, more accurate estimation of the copula parameter, more realistic estimation of joint/conditional probabilities of drought duration and severity, and a significant reduction in modeling uncertainty.
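
    As a minimal illustration of the copula step described above, the sketch below fits a Clayton copula to drought duration and severity by maximum likelihood on rank-based pseudo-observations; the use of pseudo-observations and the parameter bounds are assumptions of this sketch, not the study's exact procedure.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import rankdata

    def clayton_neg_loglik(theta, u, v):
        # Clayton density: c(u,v) = (1+theta)(uv)^(-1-theta) (u^-theta + v^-theta - 1)^(-2-1/theta)
        s = u ** (-theta) + v ** (-theta) - 1.0
        logc = (np.log1p(theta) - (1.0 + theta) * (np.log(u) + np.log(v))
                - (2.0 + 1.0 / theta) * np.log(s))
        return -np.sum(logc)

    def fit_clayton(duration, severity):
        # Rank-based pseudo-observations stand in for the fitted marginal CDFs.
        u = rankdata(duration) / (len(duration) + 1.0)
        v = rankdata(severity) / (len(severity) + 1.0)
        res = minimize_scalar(clayton_neg_loglik, bounds=(0.01, 20.0),
                              args=(u, v), method="bounded")
        return res.x  # estimated copula parameter theta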

  13. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. One approach to reducing the total uncertainty in hydrological applications is to reduce the uncertainty in the meteorological forcing using statistical methods based on conditional probability density functions (pdfs). However, current methods typically require assuming a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of precipitation forecasts needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach to capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginals that can model the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast at 0.25 x 0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing observed climatology during a ten-year verification period (2000-2010).

  14. Seismic potential for large and great interplate earthquakes along the Chilean and Southern Peruvian Margins of South America: A quantitative reappraisal

    NASA Astrophysics Data System (ADS)

    Nishenko, Stuart P.

    1985-04-01

    The seismic potential of the Chilean and southern Peruvian margins of South America is reevaluated to delineate those areas or segments of the margin that may be expected to experience large or great interplate earthquakes within the next 20 years (1984-2004). Long-term estimates of seismic potential (or the conditional probability of recurrence within a specified period of time) are based on (1) statistical analysis of historic repeat time data using Weibull distributions and (2) deterministic estimates of recurrence times based on the time-predictable model of earthquake recurrence. Both methods emphasize the periodic nature of large and great earthquake recurrence, and are compared with estimates of probability based on the assumption of Poisson-type behavior. The estimates of seismic potential presented in this study are long-term forecasts only, as the temporal resolution (or standard deviation) of both methods is taken to range from ±15% to ±25% of the average or estimated repeat time. At present, the Valparaiso region of central Chile (32°-35°S) has a high potential or probability of recurrence in the next 20 years. Coseismic uplift data associated with previous shocks in 1822 and 1906 suggest that this area may have already started to rerupture in 1971-1973. Average repeat times also suggest this area is due for a great shock within the next 20 years. Flanking segments of the Chilean margin, Coquimbo-Illapel (30°-32°S) and Talca-Concepcion (35°-38°S), presently have poorly constrained but possibly quite high potentials for a series of large or great shocks within the next 20 years. In contrast, the rupture zone of the great 1960 earthquake (37°-46°S) has the lowest potential along the margin and is not expected to rerupture in a great earthquake within the next 100 years. In the north, the seismic potentials of the Mollendo-Arica (17°-18°S) and Arica-Antofagasta (18°-24°S) segments (which last ruptured during great earthquakes in 1868 and 1877) are also high, but poorly constrained.
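
    The conditional probability of recurrence described above can be sketched in a few lines: for Weibull-distributed repeat times, the probability of recurrence within the next interval, given the time already elapsed since the last event, follows from the survival function. The parameter values below are hypothetical, not Nishenko's fitted values.

    import numpy as np

    def weibull_conditional_prob(t_elapsed, dt, shape, scale):
        # P(recurrence within dt years | quiescent for t_elapsed years),
        # for Weibull repeat times; shape > 1 implies quasi-periodic behavior.
        survival = lambda t: np.exp(-((t / scale) ** shape))
        return 1.0 - survival(t_elapsed + dt) / survival(t_elapsed)

    # Example: 78 years since the last shock, a 20-year forecast window:
    p = weibull_conditional_prob(t_elapsed=78.0, dt=20.0, shape=4.0, scale=86.0)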

  15. Comparison of Bootstrapping and Markov Chain Monte Carlo for Copula Analysis of Hydrological Droughts

    NASA Astrophysics Data System (ADS)

    Yang, P.; Ng, T. L.; Yang, W.

    2015-12-01

    Effective water resources management depends on reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, comprehensive assessment of the parameter uncertainties of drought copulas has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods for producing the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett, are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are fitted to 200 sets of simulated realizations of drought events with known distributions and parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events used to fit the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50) bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100) bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC, regardless of whether an informative prior exists.
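
    For the bootstrapping side of the comparison, a minimal nonparametric resampling sketch is given below; the estimator passed in could be, for example, a copula-parameter fit such as the fit_clayton sketch shown earlier in this document. The resampling scheme, confidence level, and array inputs are illustrative assumptions.

    import numpy as np

    def bootstrap_ci(duration, severity, fit_fn, n_boot=1000, level=0.95, seed=0):
        # Resample drought events with replacement and refit the parameter;
        # duration and severity are assumed to be numpy arrays of equal length.
        rng = np.random.default_rng(seed)
        n = len(duration)
        estimates = np.empty(n_boot)
        for b in range(n_boot):
            idx = rng.integers(0, n, size=n)
            estimates[b] = fit_fn(duration[idx], severity[idx])
        lo, hi = np.quantile(estimates, [(1.0 - level) / 2.0, (1.0 + level) / 2.0])
        return lo, hi  # percentile bootstrap CI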

  16. Meaner king uses biased bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimpell, Michael; Werner, Reinhard F.

    2007-06-15

    The mean king problem is a quantum mechanical retrodiction problem, in which Alice has to name the outcome of an ideal measurement made in one of several different orthonormal bases. Alice is allowed to prepare the state of the system and to do a final measurement, possibly including an entangled copy. However, Alice gains knowledge about which basis was measured only after she no longer has access to the quantum system or its copy. We give a necessary and sufficient condition on the bases, for Alice to have a strategy to solve this problem, without assuming that the bases are mutually unbiased. The condition requires the existence of an overall joint probability distribution for random variables, whose marginal pair distributions are fixed as the transition probability matrices of the given bases. In particular, in the qubit case the problem is decided by Bell's original three variable inequality. In the standard setting of mutually unbiased bases, when they do exist, Alice can always succeed. However, for randomly chosen bases her success probability rapidly goes to zero with increasing dimension.

  17. Meaner king uses biased bases

    NASA Astrophysics Data System (ADS)

    Reimpell, Michael; Werner, Reinhard F.

    2007-06-01

    The mean king problem is a quantum mechanical retrodiction problem, in which Alice has to name the outcome of an ideal measurement made in one of several different orthonormal bases. Alice is allowed to prepare the state of the system and to do a final measurement, possibly including an entangled copy. However, Alice gains knowledge about which basis was measured only after she no longer has access to the quantum system or its copy. We give a necessary and sufficient condition on the bases, for Alice to have a strategy to solve this problem, without assuming that the bases are mutually unbiased. The condition requires the existence of an overall joint probability distribution for random variables, whose marginal pair distributions are fixed as the transition probability matrices of the given bases. In particular, in the qubit case the problem is decided by Bell’s original three variable inequality. In the standard setting of mutually unbiased bases, when they do exist, Alice can always succeed. However, for randomly chosen bases her success probability rapidly goes to zero with increasing dimension.

  18. Isotropic probability measures in infinite-dimensional spaces

    NASA Technical Reports Server (NTRS)

    Backus, George

    1987-01-01

    Let R be the real numbers, R^n the linear space of all real n-tuples, and R^∞ the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n : R^∞ → R^n be the projection operator with P_n(x) = (x_1, ..., x_n). Let p_∞ be a probability measure on the smallest sigma-ring of subsets of R^∞ which includes all of the cylinder sets P_n^{-1}(B_n), where B_n is an arbitrary Borel subset of R^n. Let p_n be the marginal distribution of p_∞ on R^n, so that p_n(B_n) = p_∞(P_n^{-1}(B_n)) for each B_n. A measure on R^n is isotropic if it is invariant under all orthogonal transformations of R^n. All members of the set of all isotropic probability distributions on R^n are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.

  19. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, J; Fan, J; Hu, W

    Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), for use in radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is framing the DVH in a probabilistic setting. Training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features; the joint distribution provides an estimate of the conditional probability of the dose given the values of the predictive features. For a new patient, prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this distribution. Integrating the resulting probability distribution for the dose yields an estimate of the DVH. The 2D KDE is used to estimate the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are used as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast, and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: The two DVHs were consistent for each cancer site, and the average relative point-wise difference was about 5%, within the clinically acceptable extent. Conclusion: Our method can be used to predict clinically acceptable DVHs and to evaluate the quality and consistency of treatment planning.
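
    A minimal one-feature sketch of the KDE marginalization step is given below, using scipy's gaussian_kde; the paper uses two predictive features, and the function name, single-feature simplification, and Monte Carlo marginalization are assumptions of this sketch.

    import numpy as np
    from scipy.stats import gaussian_kde

    def predict_dose_density(dose_train, feat_train, feat_new, dose_grid):
        # Joint p(dose, feature) from training plans, and p(feature) for scaling.
        joint = gaussian_kde(np.vstack([dose_train, feat_train]))
        feat_marginal = gaussian_kde(feat_train)
        new_feat_dist = gaussian_kde(feat_new)   # feature distribution of the new patient
        # p(dose) = integral of p(dose | f) q(f) df, approximated by sampling f.
        f_samples = new_feat_dist.resample(500)[0]
        dens = np.zeros_like(dose_grid, dtype=float)
        for f in f_samples:
            pts = np.vstack([dose_grid, np.full_like(dose_grid, f)])
            dens += joint(pts) / max(feat_marginal(f)[0], 1e-12)  # p(dose | f)
        dens /= len(f_samples)
        return dens / np.trapz(dens, dose_grid)  # normalized predicted dose pdf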

  20. The Impact of Spatial and Temporal Resolutions in Tropical Summer Rainfall Distribution: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Chiu, L. S.; Hao, X.

    2017-10-01

    The abundance or lack of rainfall affects people's lives and activities. As a major component of the global hydrological cycle (Chokngamwong & Chiu, 2007), accurate representations of rainfall at various spatial and temporal scales are crucial for many decision-making processes. Climate models show a warmer and wetter climate due to increases in greenhouse gases (GHG). However, the models' resolutions are often too coarse to be directly applicable to the local scales that are useful for mitigation purposes. Hence disaggregation (downscaling) procedures are needed to transfer the coarse-scale products to higher spatial and temporal resolutions. The aim of this paper is to examine the changes in the statistical parameters of rainfall at various spatial and temporal resolutions. TRMM Multi-satellite Precipitation Analysis (TMPA) 0.25 degree, 3-hourly gridded rainfall data for one summer are aggregated to 0.5, 1.0, 2.0 and 2.5 degree and to 6-, 12- and 24-hourly, pentad (five-day) and monthly resolutions. The probability density functions (PDFs) and cumulative distribution functions (CDFs) of rain amount at these resolutions are computed and modeled as a mixed distribution. Parameters of the PDFs are compared using the Kolmogorov-Smirnov (KS) test, both for the mixed and the marginal distribution, and these distributions are shown to be distinct. The marginal distributions are fitted with lognormal and Gamma distributions, and the Gamma distributions are found to fit much better than the lognormal.
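
    The marginal fitting and KS comparison described above can be sketched with scipy; fitting to wet samples only and fixing the location parameter at zero are assumptions of this sketch, not necessarily the paper's exact procedure.

    import numpy as np
    from scipy import stats

    def compare_marginal_fits(rain):
        wet = rain[rain > 0]                         # marginal: positive rain amounts
        gamma_params = stats.gamma.fit(wet, floc=0)
        lognorm_params = stats.lognorm.fit(wet, floc=0)
        ks_gamma = stats.kstest(wet, "gamma", args=gamma_params)
        ks_lognorm = stats.kstest(wet, "lognorm", args=lognorm_params)
        return ks_gamma.statistic, ks_lognorm.statistic  # smaller statistic = better fit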

  1. Second look at the spread of epidemics on networks

    NASA Astrophysics Data System (ADS)

    Kenah, Eben; Robins, James M.

    2007-09-01

    In an important paper, Newman [Phys. Rev. E66, 016128 (2002)] claimed that a general network-based stochastic Susceptible-Infectious-Removed (SIR) epidemic model is isomorphic to a bond percolation model, where the bonds are the edges of the contact network and the bond occupation probability is equal to the marginal probability of transmission from an infected node to a susceptible neighbor. In this paper, we show that this isomorphism is incorrect and define a semidirected random network we call the epidemic percolation network that is exactly isomorphic to the SIR epidemic model in any finite population. In the limit of a large population, (i) the distribution of (self-limited) outbreak sizes is identical to the size distribution of (small) out-components, (ii) the epidemic threshold corresponds to the phase transition where a giant strongly connected component appears, (iii) the probability of a large epidemic is equal to the probability that an initial infection occurs in the giant in-component, and (iv) the relative final size of an epidemic is equal to the proportion of the network contained in the giant out-component. For the SIR model considered by Newman, we show that the epidemic percolation network predicts the same mean outbreak size below the epidemic threshold, the same epidemic threshold, and the same final size of an epidemic as the bond percolation model. However, the bond percolation model fails to predict the correct outbreak size distribution and probability of an epidemic when there is a nondegenerate infectious period distribution. We confirm our findings by comparing predictions from percolation networks and bond percolation models to the results of simulations. In the Appendix, we show that an isomorphism to an epidemic percolation network can be defined for any time-homogeneous stochastic SIR model.
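
    The marginal transmission probability that plays the role of the bond occupation probability can be illustrated with a short Monte Carlo sketch; the constant transmission rate and the specific infectious-period distributions below are illustrative assumptions.

    import numpy as np

    def marginal_transmissibility(beta, tau_samples):
        # T = 1 - E[exp(-beta * tau)]: probability of transmission across an
        # edge for contact rate beta and random infectious period tau.
        return 1.0 - np.mean(np.exp(-beta * tau_samples))

    rng = np.random.default_rng(1)
    t_fixed = marginal_transmissibility(0.5, np.full(100000, 2.0))           # degenerate period
    t_random = marginal_transmissibility(0.5, rng.exponential(2.0, 100000))  # nondegenerate
    # Both values feed a bond percolation model as a single T, but per the
    # abstract only the epidemic percolation network recovers the correct
    # outbreak-size distribution when the infectious period is nondegenerate.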

  2. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    PubMed

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data, such as left- and right-eye retinopathy status data, are analyzed either by using a joint bivariate probability model or by exploiting certain odds-ratio-based association models. However, the joint bivariate probability model yields marginal probabilities that are complicated functions of the marginal and association parameters for both variables, while the odds-ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Moreover, this latter odds-ratio-based model does not provide any easy interpretation of the correlations between the two categorical variables. On the basis of pre-specified marginal probabilities, in this paper we develop a bivariate normal-type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  3. The Two-Dimensional Gabor Function Adapted to Natural Image Statistics: A Model of Simple-Cell Receptive Fields and Sparse Structure in Images.

    PubMed

    Loxley, P N

    2017-10-01

    The two-dimensional Gabor function is adapted to natural image statistics, leading to a tractable probabilistic generative model that can be used to model simple cell receptive field profiles, or generate basis functions for sparse coding applications. Learning is found to be most pronounced in three Gabor function parameters representing the size and spatial frequency of the two-dimensional Gabor function and characterized by a nonuniform probability distribution with heavy tails. All three parameters are found to be strongly correlated, resulting in a basis of multiscale Gabor functions with similar aspect ratios and size-dependent spatial frequencies. A key finding is that the distribution of receptive-field sizes is scale invariant over a wide range of values, so there is no characteristic receptive field size selected by natural image statistics. The Gabor function aspect ratio is found to be approximately conserved by the learning rules and is therefore not well determined by natural image statistics. This allows for three distinct solutions: a basis of Gabor functions with sharp orientation resolution at the expense of spatial-frequency resolution, a basis of Gabor functions with sharp spatial-frequency resolution at the expense of orientation resolution, or a basis with unit aspect ratio. Arbitrary mixtures of all three cases are also possible. Two parameters controlling the shape of the marginal distributions in a probabilistic generative model fully account for all three solutions. The best-performing probabilistic generative model for sparse coding applications is found to be a gaussian copula with Pareto marginal probability density functions.
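
    A minimal numpy sketch of the two-dimensional Gabor function, with the three learned parameters (size, spatial frequency via wavelength, and aspect ratio) exposed explicitly; the exact parameterization is an assumption of this sketch, not necessarily the paper's.

    import numpy as np

    def gabor_2d(size, wavelength, aspect, theta=0.0, phase=0.0, n=32):
        # Gaussian envelope (scale `size`, elongation `aspect`) times a sinusoid.
        half = n // 2
        y, x = np.mgrid[-half:half, -half:half]
        xr = x * np.cos(theta) + y * np.sin(theta)    # rotate to orientation theta
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr ** 2 + (aspect * yr) ** 2) / (2.0 * size ** 2))
        carrier = np.cos(2.0 * np.pi * xr / wavelength + phase)
        return envelope * carrier

    # Size-dependent spatial frequency, as in the learned bases:
    patch = gabor_2d(size=6.0, wavelength=12.0, aspect=1.0)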

  4. The discrete Laplace exponential family and estimation of Y-STR haplotype frequencies.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2013-07-21

    Estimating haplotype frequencies is important in e.g. forensic genetics, where the frequencies are needed to calculate the likelihood ratio for the evidential weight of a DNA profile found at a crime scene. Estimation is naturally based on a population model, motivating the investigation of the Fisher-Wright model of evolution for haploid lineage DNA markers. An exponential family (a class of probability distributions that is well understood in probability theory such that inference is easily made by using existing software) called the 'discrete Laplace distribution' is described. We illustrate how well the discrete Laplace distribution approximates a more complicated distribution that arises by investigating the well-known population genetic Fisher-Wright model of evolution by a single-step mutation process. It was shown how the discrete Laplace distribution can be used to estimate haplotype frequencies for haploid lineage DNA markers (such as Y-chromosomal short tandem repeats), which in turn can be used to assess the evidential weight of a DNA profile found at a crime scene. This was done by making inference in a mixture of multivariate, marginally independent, discrete Laplace distributions using the EM algorithm to estimate the probabilities of membership of a set of unobserved subpopulations. The discrete Laplace distribution can be used to estimate haplotype frequencies with lower prediction error than other existing estimators. Furthermore, the calculations could be performed on a normal computer. This method was implemented in the freely available open source software R that is supported on Linux, MacOS and MS Windows. Copyright © 2013 Elsevier Ltd. All rights reserved.
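
    The discrete Laplace pmf at the heart of the method is simple enough to state in code; the single-locus example and its parameter values are hypothetical, and the full method mixes products of such terms across loci with EM-estimated subpopulation weights.

    import numpy as np

    def discrete_laplace_pmf(y, p, mu=0):
        # P(Y = y) = (1 - p) / (1 + p) * p**|y - mu|, y integer, 0 < p < 1.
        return (1.0 - p) / (1.0 + p) * p ** np.abs(np.asarray(y) - mu)

    # Probability of allele repeat numbers around a modal allele at 14 repeats:
    alleles = np.arange(10, 19)
    probs = discrete_laplace_pmf(alleles, p=0.3, mu=14)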

  5. The use of spatial dose gradients and probability density function to evaluate the effect of internal organ motion for prostate IMRT treatment planning

    NASA Astrophysics Data System (ADS)

    Jiang, Runqing; Barnett, Rob B.; Chow, James C. L.; Chen, Jeff Z. Y.

    2007-03-01

    The aim of this study is to investigate the effects of internal organ motion on IMRT treatment planning of prostate patients using a spatial dose gradient and probability density function. Spatial dose distributions were generated from a Pinnacle3 planning system using a co-planar, five-field intensity modulated radiation therapy (IMRT) technique. Five plans were created for each patient using equally spaced beams but shifting the angular displacement of the beam by 15° increments. Dose profiles taken through the isocentre in anterior-posterior (A-P), right-left (R-L) and superior-inferior (S-I) directions for IMRT plans were analysed by exporting RTOG file data from Pinnacle. The convolution of the 'static' dose distribution D0(x, y, z) and probability density function (PDF), denoted as P(x, y, z), was used to analyse the combined effect of repositioning error and internal organ motion. Organ motion leads to an enlarged beam penumbra. The amount of percentage mean dose deviation (PMDD) depends on the dose gradient and the organ motion probability density function. Organ motion dose sensitivity was defined by the rate of change in PMDD with the standard deviation of the motion PDF and was found to increase with the maximum dose gradient in the anterior, posterior, left and right directions. Due to common inferior and superior field borders of the field segments, the sharpest dose gradient will occur in the inferior or both superior and inferior penumbrae. Thus, prostate motion in the S-I direction produces the highest dose difference. The PMDD is within 2.5% when the standard deviation is less than 5 mm, but exceeds 2.5% in the inferior direction when the standard deviation there is greater than 5 mm. Verification of prostate organ motion in the inferior direction is therefore essential. The margin of the planning target volume (PTV) significantly impacts the confidence of tumour control probability (TCP) and the level of normal tissue complication probability (NTCP). Smaller margins help to reduce the dose to normal tissues, but may compromise the dose coverage of the PTV. Lower rectal NTCP can be achieved by either a smaller margin or a steeper dose gradient between the PTV and rectum. With the same DVH control points, the rectum has a lower complication probability in the seven-beam technique used in this study because of the steeper dose gradient between the target volume and rectum. The relationship between dose gradient and rectal complication can be used to evaluate IMRT treatment planning. Dose gradient analysis is a powerful tool to improve IMRT treatment plans and can be used for QA checking of treatment plans for prostate patients.
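
    The convolution of the static dose with a motion PDF can be sketched with scipy under the simplifying assumptions named in the abstract (dose-shift invariance, rigid body, Gaussian motion); the function and parameter values here are illustrative.

    import numpy as np
    from scipy.ndimage import gaussian_filter, shift

    def motion_blurred_dose(static_dose, sigma_random_mm, sys_shift_mm, voxel_mm):
        # Blur by the random-motion PDF, then shift by the systematic error.
        sigma_vox = np.asarray(sigma_random_mm) / np.asarray(voxel_mm)
        blurred = gaussian_filter(static_dose, sigma=sigma_vox)
        return shift(blurred, np.asarray(sys_shift_mm) / np.asarray(voxel_mm), order=1)

    # e.g. 5 mm S-I random motion (the threshold noted above) on a 2 mm grid:
    # dose_eff = motion_blurred_dose(D0, (0, 0, 5.0), (0, 0, 0.0), (2.0, 2.0, 2.0))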

  6. The use of spatial dose gradients and probability density function to evaluate the effect of internal organ motion for prostate IMRT treatment planning.

    PubMed

    Jiang, Runqing; Barnett, Rob B; Chow, James C L; Chen, Jeff Z Y

    2007-03-07

    The aim of this study is to investigate the effects of internal organ motion on IMRT treatment planning of prostate patients using a spatial dose gradient and probability density function. Spatial dose distributions were generated from a Pinnacle3 planning system using a co-planar, five-field intensity modulated radiation therapy (IMRT) technique. Five plans were created for each patient using equally spaced beams but shifting the angular displacement of the beam by 15 degree increments. Dose profiles taken through the isocentre in anterior-posterior (A-P), right-left (R-L) and superior-inferior (S-I) directions for IMRT plans were analysed by exporting RTOG file data from Pinnacle. The convolution of the 'static' dose distribution D0(x, y, z) and probability density function (PDF), denoted as P(x, y, z), was used to analyse the combined effect of repositioning error and internal organ motion. Organ motion leads to an enlarged beam penumbra. The amount of percentage mean dose deviation (PMDD) depends on the dose gradient and the organ motion probability density function. Organ motion dose sensitivity was defined by the rate of change in PMDD with the standard deviation of the motion PDF and was found to increase with the maximum dose gradient in the anterior, posterior, left and right directions. Due to common inferior and superior field borders of the field segments, the sharpest dose gradient will occur in the inferior or both superior and inferior penumbrae. Thus, prostate motion in the S-I direction produces the highest dose difference. The PMDD is within 2.5% when the standard deviation is less than 5 mm, but exceeds 2.5% in the inferior direction when the standard deviation there is greater than 5 mm. Verification of prostate organ motion in the inferior direction is therefore essential. The margin of the planning target volume (PTV) significantly impacts the confidence of tumour control probability (TCP) and the level of normal tissue complication probability (NTCP). Smaller margins help to reduce the dose to normal tissues, but may compromise the dose coverage of the PTV. Lower rectal NTCP can be achieved by either a smaller margin or a steeper dose gradient between the PTV and rectum. With the same DVH control points, the rectum has a lower complication probability in the seven-beam technique used in this study because of the steeper dose gradient between the target volume and rectum. The relationship between dose gradient and rectal complication can be used to evaluate IMRT treatment planning. Dose gradient analysis is a powerful tool to improve IMRT treatment plans and can be used for QA checking of treatment plans for prostate patients.

  7. Probabilistic objective functions for margin-less IMRT planning

    NASA Astrophysics Data System (ADS)

    Bohoslavsky, Román; Witte, Marnix G.; Janssen, Tomas M.; van Herk, Marcel

    2013-06-01

    We present a method to implement probabilistic treatment planning of intensity-modulated radiation therapy using custom software plugins in a commercial treatment planning system. Our method avoids the definition of safety-margins by directly including the effect of geometrical uncertainties during optimization when objective functions are evaluated. Because the shape of the resulting dose distribution implicitly defines the robustness of the plan, the optimizer has much more flexibility than with a margin-based approach. We expect that this added flexibility helps to automatically strike a better balance between target coverage and dose reduction for surrounding healthy tissue, especially for cases where the planning target volume overlaps organs at risk. Prostate cancer treatment planning was chosen to develop our method, including a novel technique to include rotational uncertainties. Based on population statistics, translations and rotations are simulated independently following a marker-based IGRT correction strategy. The effects of random and systematic errors are incorporated by first blurring and then shifting the dose distribution with respect to the clinical target volume. For simplicity and efficiency, dose-shift invariance and a rigid-body approximation are assumed. Three prostate cases were replanned using our probabilistic objective functions. To compare clinical and probabilistic plans, an evaluation tool was used that explicitly incorporates geometric uncertainties using Monte-Carlo methods. The new plans achieved similar or better dose distributions than the original clinical plans in terms of expected target coverage and rectum wall sparing. Plan optimization times were only about a factor of two higher than in the original clinical system. In conclusion, we have developed a practical planning tool that enables margin-less probability-based treatment planning with acceptable planning times, achieving the first system that is feasible for clinical implementation.

  8. Hotspots in Hindsight

    NASA Astrophysics Data System (ADS)

    Julian, B. R.; Foulger, G. R.; Hatfield, O.; Jackson, S.; Simpson, E.; Einbeck, J.; Moore, A.

    2014-12-01

    Torsvik et al. [2006] suggest that the original locations of large igneous provinces (LIPs) and kimberlites, and the current locations of melting anomalies (hotspots), lie preferentially above the margins of two "Large Lower-Mantle Shear Velocity Provinces" (LLSVPs) at the base of the mantle, and that the correlation has a high significance level (> 99.9999%). They conclude that the LLSVP margins are Plume-Generation Zones and that deep-mantle plumes cause hotspots and LIPs. This conclusion raises questions about what physical processes could be responsible because, for example, the LLSVPs are likely dense and not abnormally hot [Trampert et al., 2004]. The supposed LIP-hotspot-LLSVP correlations are probably examples of the "Hindsight Heresy" [Acton, 1959]: basing a statistical test upon the same data sample that led to the initial formulation of a hypothesis. In doing this, many competing hypotheses will have been considered and rejected, but this fact is not taken into account in statistical assessments; likewise, probabilities will have been computed for many subsets and combinations of the data, with only the best-correlated cases cited. Tests using independent hotspot catalogs and mantle models suggest that the actual significance levels of the correlations are two or three orders of magnitude smaller than claimed. These tests also show that hotspots correlate well with presumably shallowly rooted features such as spreading plate boundaries. Consideration of the kimberlite dataset in its geological setting suggests that the apparent association with the LLSVP margins results from the fact that the Kaapvaal craton, the site of most of the kimberlites considered, lies in southern Africa. These observations raise questions about the distinction between correlation and causation and underline the necessity of taking geological factors into account. [Figure: Left: cumulative distributions of distances from hotspots to the nearest ridge for five hotspot lists; the heavy red curve is the distribution function for a random point on Earth's surface, showing that hotspots are closer to ridges than expected at random. Right: for each list, the probability of at least as many random points being as close to a ridge; values to the right have higher significance.]

  9. HELP: XID+, the probabilistic de-blender for Herschel SPIRE maps

    NASA Astrophysics Data System (ADS)

    Hurley, P. D.; Oliver, S.; Betancourt, M.; Clarke, C.; Cowley, W. I.; Duivenvoorden, S.; Farrah, D.; Griffin, M.; Lacey, C.; Le Floc'h, E.; Papadopoulos, A.; Sargent, M.; Scudder, J. M.; Vaccari, M.; Valtchanov, I.; Wang, L.

    2017-01-01

    We have developed a new prior-based source extraction tool, XID+, to carry out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. XID+ is developed using a probabilistic Bayesian framework that provides a natural way to include prior information, and uses the Bayesian inference tool Stan to obtain the full posterior probability distribution on flux estimates. In this paper, we discuss the details of XID+ and demonstrate its basic capabilities and performance by running it on simulated SPIRE maps resembling the COSMOS field and comparing to the current prior-based source extraction tool DESPHOT. We show not only that XID+ performs better on metrics such as flux accuracy and flux uncertainty accuracy, but also how obtaining the posterior probability distribution can help overcome some of the issues inherent in maximum-likelihood-based source extraction routines. We run XID+ on the COSMOS SPIRE maps from the Herschel Multi-Tiered Extragalactic Survey using a 24-μm catalogue as a positional prior, and a uniform flux prior ranging from 0.01 to 1000 mJy. We show the marginalized SPIRE colour-colour plot and the marginalized contribution to the cosmic infrared background at the SPIRE wavelengths. XID+ is a core tool arising from the Herschel Extragalactic Legacy Project (HELP), and we discuss how additional work within HELP providing prior information on fluxes can and will be utilized. The software is available at https://github.com/H-E-L-P/XID_plus. We also provide the data product for COSMOS. We believe this is the first time that the full posterior probability distribution of galaxy photometry has been provided as a data product.

  10. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated from a model's marginal likelihood and prior probability, and this heavy computation burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computation burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment with a synthetic groundwater model. Because BMA predictions depend on the model posterior weights (or marginal likelihoods), this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: repeated TIE estimates of a conceptual model's marginal likelihood show significantly less variability than those of the other estimators. In addition, the SG surrogates are efficient in facilitating BMA predictions, especially for BMA-TIE. The number of model executions needed for building the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
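
    Two of the four estimators are simple enough to sketch from Monte Carlo samples; the stabilized variant and thermodynamic integration need more machinery and are omitted. The log-scale trick in the HME sketch is a standard numerical stabilization, not this study's code.

    import numpy as np

    def ame(likelihoods_from_prior):
        # Arithmetic mean estimator: average likelihood over draws from the prior.
        return np.mean(likelihoods_from_prior)

    def log_hme(log_likelihoods_from_posterior):
        # Harmonic mean estimator on the log scale: -log mean(exp(-log L)) over
        # posterior draws; notoriously unstable, motivating SHME and TIE.
        ll = np.asarray(log_likelihoods_from_posterior)
        m = (-ll).max()
        return -(m + np.log(np.mean(np.exp(-ll - m))))

    # Posterior model weight (uniform model prior): w_k = mL_k / sum_j mL_j.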

  11. Radiobiological Impact of Reduced Margins and Treatment Technique for Prostate Cancer in Terms of Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Ingelise, E-mail: inje@rn.d; Carl, Jesper; Lund, Bente

    2011-07-01

    Dose escalation in prostate radiotherapy is limited by normal tissue toxicities. The aim of this study was to assess the impact of margin size on tumor control and side effects for intensity-modulated radiation therapy (IMRT) and 3D conformal radiotherapy (3DCRT) treatment plans with increased dose. Eighteen patients with localized prostate cancer were enrolled. 3DCRT and IMRT plans were compared for a variety of margin sizes. A marker detectable on daily portal images was presupposed for narrow margins. Prescribed dose was 82 Gy within 41 fractions to the prostate clinical target volume (CTV). Tumor control probability (TCP) calculations based on the Poisson model including the linear quadratic approach were performed. Normal tissue complication probability (NTCP) was calculated for bladder, rectum and femoral heads according to the Lyman-Kutcher-Burman method. All plan types presented essentially identical TCP values and very low NTCP for bladder and femoral heads. Mean doses for these critical structures reached a minimum for IMRT with reduced margins. Two endpoints for rectal complications were analyzed. A marked decrease in NTCP for IMRT plans with narrow margins was seen for mild RTOG grade 2/3 as well as for proctitis/necrosis/stenosis/fistula, for which NTCP <7% was obtained. For equivalent TCP values, sparing of normal tissue was demonstrated with the narrow margin approach. The effect was more pronounced for IMRT than 3DCRT, with respect to NTCP for mild, as well as severe, rectal complications.

  12. Neutrino mass priors for cosmology from random matrices

    NASA Astrophysics Data System (ADS)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.

  13. Distribution of a climate-sensitive species at an interior range margin

    USGS Publications Warehouse

    Ray, Chris; Beever, Erik; Rodhouse, Thomas J.

    2016-01-01

    Advances in understanding the factors that limit a species’ range, particularly in the context of climate change, have come disproportionately through investigations at range edges or margins. The margins of a species’ range might often correspond with anomalous microclimates that confer habitat suitability where the species would otherwise fail to persist. We addressed this hypothesis using data from an interior, climatic range margin of the American pika (Ochotona princeps), an indicator of relatively cool, mesic climates in rocky habitats of western North America. Pikas in Lava Beds National Monument, northeastern California, USA, occur at elevations much lower than predicted by latitude and longitude. We hypothesized that pika occurrence within Lava Beds would be associated primarily with features such as “ice caves” in which sub-surface ice persists outside the winter months. We used data loggers to monitor sub-surface temperatures at cave entrances and at non-cave sites, confirming that temperatures were cooler and more stable at cave entrances. We surveyed habitat characteristics and evidence of pika occupancy across a random sample of cave and non-cave sites over a 2-yr period. Pika detection probability was high (~0.97), and the combined occupancy of cave and non-cave sites varied across the 2 yr from 27% to 69%. Contrary to our hypothesis, occupancy was not higher at cave sites. Vegetation metrics were the best predictors of site use by pikas, followed by an edge effect and elevation. The importance of vegetation as a predictor of pika distribution at this interior range margin is congruent with recent studies from other portions of the species’ range. However, we caution that vegetation composition depends on microclimate, which might be the proximal driver of pika distribution. The microclimates available in non-cave crevices accessible to small animals have not been characterized adequately for lava landscapes. We advocate innovation in the acquisition and use of microclimatic data for understanding the distributions of many taxa. Appropriately scaled microclimatic data are increasingly available but rarely used in studies of range dynamics.

  14. Generation of a Multivariate Distribution for Specified Univariate Marginals and Covariance Structure.

    DTIC Science & Technology

    1981-05-28


  15. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    NASA Astrophysics Data System (ADS)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events, we can prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals via copulas. This document presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas for extreme hydroclimatological events such as severe storms. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that in a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, providing more accurate and reliable information on design storms and associated risks. The use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of the joint return period, used to properly represent the needs of hydrological design in frequency analysis.
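
    The joint return period mentioned above follows directly from the copula; a minimal sketch is given below, where the Gumbel-Hougaard example and its parameter value are hypothetical.

    import numpy as np

    def joint_return_periods(u, v, copula_cdf, mu=1.0):
        # u = F_I(i), v = F_D(d); mu = mean interarrival time of events in years.
        c = copula_cdf(u, v)
        t_or = mu / (1.0 - c)            # return period of {I > i or D > d}
        t_and = mu / (1.0 - u - v + c)   # return period of {I > i and D > d}
        return t_or, t_and

    gumbel = lambda u, v, th=2.0: np.exp(-(((-np.log(u)) ** th
                                            + (-np.log(v)) ** th) ** (1.0 / th)))
    t_or, t_and = joint_return_periods(0.9, 0.95, gumbel)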

  16. Do marginalized neighbourhoods have less healthy retail food environments? An analysis using Bayesian spatial latent factor and hurdle models.

    PubMed

    Luan, Hui; Minaker, Leia M; Law, Jane

    2016-08-22

    Findings of whether marginalized neighbourhoods have less healthy retail food environments (RFE) are mixed across countries, in part because inconsistent approaches have been used to characterize RFE 'healthfulness' and marginalization, and because researchers have used non-spatial statistical methods to address this ultimately spatial issue. This study uses in-store features to categorize healthy and less healthy food outlets. Bayesian spatial hierarchical models are applied to explore the association between marginalization dimensions and RFE healthfulness (i.e., relative healthy food access, modelled via a probability distribution) at various geographical scales. Marginalization dimensions are derived from a spatial latent factor model, and zero-inflation occurring at the walkable-distance scale is accounted for with a spatial hurdle model. From a binary 'have' or 'do not have' access perspective, neighbourhoods with higher residential instability, material deprivation, and population density are more likely to have access to healthy food outlets within a walkable distance. At the walkable-distance scale, however, materially deprived neighbourhoods are found to have less healthy RFE (lower relative healthy food access). Food intervention programs should be developed to strike a balance between healthy and less healthy food access in the study region and to improve opportunities for residents to buy and consume foods consistent with dietary recommendations.

  17. Characteristics and features of the submarine landslides in passive and active margin southwestern offshore Taiwan

    NASA Astrophysics Data System (ADS)

    Yeh, Y. C.

    2016-12-01

    In the past decade, numerous multi-channel seismic surveys, as well as near-seafloor high-resolution geophysical investigations, were conducted to explore and estimate the gas hydrate reserves offshore southwestern Taiwan. The previous objective was to search for substitute energy (i.e., gas hydrate) rather than to assess geo-hazards. However, gas hydrate is generally distributed in the slope area offshore southwestern Taiwan, and these slopes may fail when their steady state is disturbed by factors such as sea level or climate change. In addition, dissociation of gas hydrate may induce submarine landslides that can in turn cause devastating tsunamis. It is therefore of great urgency to investigate potential landslide areas, particularly the hydrate-rich continental slopes (active and passive margins) adjacent to populous cities like Kaohsiung. In this study, we collected several high-resolution multi-channel seismic data sets, with a ten-second shooting rate and a streamer with 3.125 m group interval, using R/V ORI and R/V ORV. The seismic data were processed with a conventional strategy: bad-trace editing, geometry setting, band-pass filtering, deconvolution, surface-related multiple rejection, Radon filtering, stacking, Kirchhoff migration and time-to-depth conversion. Combining the results from the MCS data and subbottom profiles, three major observations can be made for the active margin: (1) most surface creep and landslides occurred at water depths shallower than 500 m, which should be related to inter-bedded fluid activity; (2) the landslide distribution is largely affected by the presence of diapirs, suggesting that subsequent mud diapirism may destroy slope stability; (3) submarine landslides at water depths greater than 800 m are distributed in the thrust-fold area, probably reflecting active thrusting. In the passive margin, large-volume mass transport deposits (MTDs) were identified in the deeper stratigraphic section below the BSR, indicating several large former submarine landslide events. In summary, the passive margin shows more typical submarine landslide features than the active margin, driven by gravity.

  18. Stochastic optimal operation of reservoirs based on copula functions

    NASA Astrophysics Data System (ADS)

    Lei, Xiao-hui; Tan, Qiao-feng; Wang, Xu; Wang, Hao; Wen, Xin; Wang, Chao; Zhang, Jing-wen

    2018-02-01

    Stochastic dynamic programming (SDP) has been widely used to derive operating policies for reservoirs considering streamflow uncertainties. In SDP, the transition probability matrix needs to be calculated accurately and efficiently in order to improve the economic benefit of reservoir operation. In this study, we propose a stochastic optimization model for hydropower generation reservoirs in which 1) the transition probability matrix was calculated based on copula functions; and 2) the value function of the last period was calculated by stepwise iteration. Firstly, the marginal distribution of stochastic inflow in each period was built and the joint distributions of adjacent periods were obtained using three members of the Archimedean copula family, from which the conditional probability formula was derived. Then, the value in the last period was calculated by a simple recursive equation with the proposed stepwise iteration method, and the value function was fitted with a linear regression model. These improvements were incorporated into the classic SDP and applied to a case study of the Ertan reservoir, China. The results show that the transition probability matrix can be obtained more easily and accurately by the proposed copula-based method than by conventional methods based on observed or synthetic streamflow series, and that the reservoir operation benefit can also be increased.
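
    A sketch of how a transition probability matrix between inflow classes follows from a fitted copula of adjacent-period inflows is given below; class edges are taken on the uniform (marginal CDF) scale, and the Clayton copula and its parameter are hypothetical.

    import numpy as np

    def transition_matrix(copula_cdf, edges_u):
        # P[i, j] = P(class j next period | class i this period) via rectangle
        # probabilities of the copula, divided by P(class i) = u_{i+1} - u_i.
        k = len(edges_u) - 1
        P = np.empty((k, k))
        C = lambda a, b: copula_cdf(edges_u[a], edges_u[b])
        for i in range(k):
            for j in range(k):
                rect = C(i + 1, j + 1) - C(i, j + 1) - C(i + 1, j) + C(i, j)
                P[i, j] = rect / (edges_u[i + 1] - edges_u[i])
        return P

    clayton = lambda u, v, th=1.5: (np.maximum(u, 1e-12) ** -th
                                    + np.maximum(v, 1e-12) ** -th - 1.0) ** (-1.0 / th)
    P = transition_matrix(clayton, np.linspace(0.0, 1.0, 6))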

  19. Statistical Inference in Hidden Markov Models Using k-Segment Constraints

    PubMed Central

    Titsias, Michalis K.; Holmes, Christopher C.; Yau, Christopher

    2016-01-01

    Hidden Markov models (HMMs) are one of the most widely used statistical methods for analyzing sequence data. However, the reporting of output from HMMs has largely been restricted to the presentation of the most-probable (MAP) hidden state sequence, found via the Viterbi algorithm, or the sequence of most probable marginals using the forward–backward algorithm. In this article, we expand the amount of information we could obtain from the posterior distribution of an HMM by introducing linear-time dynamic programming recursions that, conditional on a user-specified constraint in the number of segments, allow us to (i) find MAP sequences, (ii) compute posterior probabilities, and (iii) simulate sample paths. We collectively call these recursions k-segment algorithms and illustrate their utility using simulated and real examples. We also highlight the prospective and retrospective use of k-segment constraints for fitting HMMs or exploring existing model fits. Supplementary materials for this article are available online. PMID:27226674
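
    For contrast with the k-segment recursions, the standard forward-backward computation of posterior state marginals is sketched below (scaled for numerical stability); pi, A, and B are a generic HMM's initial, transition, and emission parameters, not tied to the article's examples.

    import numpy as np

    def forward_backward(pi, A, B, obs):
        # Returns gamma[t, k] = p(z_t = k | obs) for a discrete-emission HMM.
        T, K = len(obs), len(pi)
        alpha = np.zeros((T, K))
        beta = np.zeros((T, K))
        alpha[0] = pi * B[:, obs[0]]
        alpha[0] /= alpha[0].sum()
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            alpha[t] /= alpha[t].sum()
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
            beta[t] /= beta[t].sum()
        gamma = alpha * beta
        return gamma / gamma.sum(axis=1, keepdims=True)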

  20. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    NASA Astrophysics Data System (ADS)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  1. Ladar range image denoising by a nonlocal probability statistics algorithm

    NASA Astrophysics Data System (ADS)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    Based on the characteristics of coherent ladar range images and on nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF), while NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and form a group. Pixels in the group are analyzed by probability statistics, and the gray value with maximum probability is used as the estimate of the current pixel. Simulated range images of coherent ladar with different carrier-to-noise ratios and a real coherent ladar range image with 8 gray levels are denoised by this algorithm, and the results are compared with those of the median filter, multi-template order mean filter, NLM, median nonlocal mean filter and its incorporation of anatomical side information, and the unsupervised information-theoretic adaptive filter. The range-abnormality noise and Gaussian noise in coherent ladar range images are effectively suppressed by NLPS.
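
    A toy version of the NLPS idea (block matching, then the modal gray value of the most similar centers) can be written directly; patch and window sizes are illustrative, and the sketch assumes an integer-valued image such as the 8-level range image mentioned above. It is a brute-force illustration, not an efficient implementation.

    import numpy as np

    def nlps_denoise(img, patch=3, search=7, n_similar=16):
        # For each pixel: find the most similar patches in a search window and
        # output the mode (maximum-probability gray value) of their centers.
        pad = search // 2 + patch // 2
        padded = np.pad(img, pad, mode="reflect")
        out = np.empty_like(img)
        p, s = patch // 2, search // 2
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                ci, cj = i + pad, j + pad
                ref = padded[ci - p:ci + p + 1, cj - p:cj + p + 1].astype(float)
                dists, centers = [], []
                for di in range(-s, s + 1):
                    for dj in range(-s, s + 1):
                        blk = padded[ci + di - p:ci + di + p + 1,
                                     cj + dj - p:cj + dj + p + 1]
                        dists.append(np.sum((blk - ref) ** 2))
                        centers.append(padded[ci + di, cj + dj])
                best = np.asarray(centers)[np.argsort(dists)[:n_similar]]
                out[i, j] = np.bincount(best.astype(int)).argmax()
        return out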

  2. Radiobiological impact of reduced margins and treatment technique for prostate cancer in terms of tumor control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Jensen, Ingelise; Carl, Jesper; Lund, Bente; Larsen, Erik H; Nielsen, Jane

    2011-01-01

    Dose escalation in prostate radiotherapy is limited by normal tissue toxicities. The aim of this study was to assess the impact of margin size on tumor control and side effects for intensity-modulated radiation therapy (IMRT) and 3D conformal radiotherapy (3DCRT) treatment plans with increased dose. Eighteen patients with localized prostate cancer were enrolled. 3DCRT and IMRT plans were compared for a variety of margin sizes. A marker detectable on daily portal images was presupposed for narrow margins. Prescribed dose was 82 Gy within 41 fractions to the prostate clinical target volume (CTV). Tumor control probability (TCP) calculations based on the Poisson model including the linear quadratic approach were performed. Normal tissue complication probability (NTCP) was calculated for bladder, rectum and femoral heads according to the Lyman-Kutcher-Burman method. All plan types presented essentially identical TCP values and very low NTCP for bladder and femoral heads. Mean doses for these critical structures reached a minimum for IMRT with reduced margins. Two endpoints for rectal complications were analyzed. A marked decrease in NTCP for IMRT plans with narrow margins was seen for mild RTOG grade 2/3 as well as for proctitis/necrosis/stenosis/fistula, for which NTCP <7% was obtained. For equivalent TCP values, sparing of normal tissue was demonstrated with the narrow margin approach. The effect was more pronounced for IMRT than 3DCRT, with respect to NTCP for mild, as well as severe, rectal complications. Copyright © 2011 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
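
    For reference, a compact sketch of the Lyman-Kutcher-Burman NTCP computation named above; the generalized-EUD reduction shown is the standard formulation, and the example parameter values are illustrative placeholders rather than the study's fitted numbers.

    ```python
    import numpy as np
    from scipy.stats import norm

    def lkb_ntcp(doses, volumes, td50, m, n):
        """Lyman-Kutcher-Burman NTCP from a differential DVH.
        doses: bin doses (Gy); volumes: bin volumes (normalized here);
        n: volume-effect parameter; m: slope; td50: dose for 50% complications."""
        v = np.asarray(volumes, float) / np.sum(volumes)
        geud = np.sum(v * np.asarray(doses, float) ** (1.0 / n)) ** n   # gEUD
        return norm.cdf((geud - td50) / (m * td50))

    # illustrative (made-up) rectum-like parameters and DVH bins:
    print(lkb_ntcp([20, 40, 60, 75], [0.4, 0.3, 0.2, 0.1], td50=76.9, m=0.13, n=0.09))
    ```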

  3. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
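
    A quick numerical check (our sketch, not the authors' code) of the gamma-Poisson construction underlying the NB process: marginalizing a Poisson count over a gamma-distributed rate reproduces a negative binomial distribution.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    r, p = 3.0, 0.4
    lam = rng.gamma(shape=r, scale=p / (1 - p), size=200_000)  # gamma rate measure
    counts = rng.poisson(lam)                                  # Poisson given rate
    for k in range(5):                  # empirical pmf vs NB pmf (scipy uses 1-p)
        print(k, round(float(np.mean(counts == k)), 4),
              round(float(stats.nbinom.pmf(k, r, 1 - p)), 4))
    ```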

  4. Isotropic probability measures in infinite dimensional spaces: Inverse problems/prior information/stochastic inversion

    NASA Technical Reports Server (NTRS)

    Backus, George

    1987-01-01

    Let R be the real numbers, R^n the linear space of all real n-tuples, and R^∞ the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n : R^∞ → R^n be the projection operator with P_n(x) = (x_1, ..., x_n). Let p_∞ be a probability measure on the smallest sigma-ring of subsets of R^∞ which includes all of the cylinder sets P_n^{-1}(B_n), where B_n is an arbitrary Borel subset of R^n. Let p_n be the marginal distribution of p_∞ on R^n, so p_n(B_n) = p_∞(P_n^{-1}(B_n)) for each B_n. A measure on R^n is isotropic if it is invariant under all orthogonal transformations of R^n. All members of the set of isotropic probability distributions on R^n are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.

  5. The quantitative assessment of epicardial fat distribution on human hearts: Implications for epicardial electrophysiology.

    PubMed

    Mattson, Alexander R; Soto, Mario J; Iaizzo, Paul A

    2018-07-01

    Epicardial electrophysiological procedures rely on dependable interfacing with the myocardial tissue. For example, epicardial pacing systems must generate sustainable chronic pacing capture, while epicardial ablations must effectively deliver energy to the target hyper-excitable myocytes. The human heart has a significant adipose layer which may impede epicardial procedures. The objective of this study was to quantitatively assess the relative location of epicardial adipose on the human heart, to define locations where epicardial therapies might be performed successfully. We studied perfusion-fixed human hearts (n = 105) in multiple isolated planes including: left ventricular margin, diaphragmatic surface, and anterior right ventricle. Relative adipose distribution was quantitatively assessed via planar images, using a custom-generated image analysis algorithm. In these specimens, 76.7 ± 13.8% of the left ventricular margin, 72.7 ± 11.3% of the diaphragmatic surface, and 92.1 ± 8.7% of the anterior right margin were covered with superficial epicardial adipose layers. Percent adipose coverage significantly increased with age (P < 0.001) and history of coronary artery disease (P < 0.05). No significant relationships were identified between relative percent adipose coverage and gender, body weight or height, BMI, history of hypertension, and/or history of congestive heart failure. Additionally, we describe two-dimensional probability distributions of epicardial adipose coverage for each of the three analysis planes. In this study, we detail the quantitative assessment and probabilistic mapping of the distribution of superficial epicardial adipose on the adult human heart. These findings have implications relative to performing epicardial procedures and/or designing procedures or tools to successfully perform such treatments. Clin. Anat. 31:661-666, 2018. © 2018 Wiley Periodicals, Inc.

  6. Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling

    NASA Astrophysics Data System (ADS)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-04-01

    Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood), the normalizing constant in the denominator of Bayes theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as information criteria do; the larger a model's evidence, the more support the model receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models over the over-fitted ones selected by information criteria that incorporate only the likelihood maximum. Since the evidence is not particularly easy to estimate in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with respect to other commonly used approaches in the literature.
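
    A simplified sketch of the GMIS idea as we read it, using plain importance sampling rather than the paper's bridge-sampling generalization: fit a Gaussian mixture to posterior samples and use it as the proposal for estimating the evidence; the toy one-parameter model and all names are assumptions.

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.mixture import GaussianMixture

    def log_post_unnorm(x):           # toy model: N(0,1) prior, N(1,0.5) likelihood
        return stats.norm.logpdf(x, 0, 1) + stats.norm.logpdf(x, 1, 0.5)

    rng = np.random.default_rng(1)
    post_samples = rng.normal(0.8, 0.45, size=(4000, 1))  # stand-in for DREAM draws
    gm = GaussianMixture(n_components=3, random_state=0).fit(post_samples)

    draws, _ = gm.sample(20_000)                          # importance proposal q
    log_w = log_post_unnorm(draws[:, 0]) - gm.score_samples(draws)
    Z = float(np.exp(log_w - log_w.max()).mean() * np.exp(log_w.max()))
    print("evidence estimate:", Z)    # analytic value: N(1; 0, sqrt(1.25)) ~ 0.24
    ```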

  7. Organic-rich sediments in ventilated deep-sea environments: Relationship to climate, sea level, and trophic changes

    NASA Astrophysics Data System (ADS)

    Bertrand, P.; Pedersen, T. F.; Schneider, R.; Shimmield, G.; Lallier-Verges, E.; Disnar, J. R.; Massias, D.; Villanueva, J.; Tribovillard, N.; Huc, A. Y.; Giraud, X.; Pierre, C.; VéNec-Peyré, M.-T.

    2003-02-01

    Sediments on the Namibian Margin in the SE Atlantic between water depths of ~1000 and ~3600 m are highly enriched in hydrocarbon-prone organic matter. Such sedimentation has occurred for more than 2 million years and is geographically distributed over hundreds of kilometers along the margin, so that the sediments of this region contain a huge concentrated stock of organic carbon. It is shown here that most of the variability in organic content is due to relative dilution by buried carbonates. This reflects both export productivity and diagenetic dissolution, not differences in either water column or bottom water anoxia and related enhanced preservation of organic matter. These observations offer a new mechanism for the formation of potential source rocks in a well-ventilated open ocean, in this case the South Atlantic. The organic richness is discussed in terms of a suite of probable controls including local wind-driven productivity (upwelling), trophic conditions, transfer efficiency, diagenetic processes, and climate-related sea level and deep circulation. The probability of past occurrences of such organic-rich facies in equivalent oceanographic settings at the edge of large oceanic basins should be carefully considered in deep offshore exploration.

  8. A computational framework to empower probabilistic protein design

    PubMed Central

    Fromer, Menachem; Yanover, Chen

    2008-01-01

    Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small-scale subproblems, BP attains results identical to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the predicted distributions differ significantly from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
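
    To make the marginal-computation step concrete, here is a minimal sum-product sketch (ours, far simpler than the paper's protein graphical model) on a chain of positions with pairwise Boltzmann factors; on a chain, message passing returns the exact per-position marginals that BP approximates on loopy graphs.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    L, A = 6, 4                                    # chain length, alphabet size
    psi = np.exp(-rng.normal(size=(L - 1, A, A)))  # pairwise Boltzmann factors

    fwd = [np.ones(A)]                             # left-to-right messages
    for i in range(L - 1):
        m = fwd[-1] @ psi[i]
        fwd.append(m / m.sum())
    bwd = [np.ones(A)]                             # right-to-left messages
    for i in range(L - 2, -1, -1):
        m = psi[i] @ bwd[0]
        bwd.insert(0, m / m.sum())

    marg = np.array([f * b for f, b in zip(fwd, bwd)])
    marg /= marg.sum(axis=1, keepdims=True)        # exact marginals on a chain
    print(np.round(marg, 3))
    ```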

  9. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
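
    A hedged sketch of the probabilistic framing as we understand it, with synthetic data standing in for training plans: estimate the joint density of (signed distance to target, dose) by kernel density estimation, condition on distance, marginalize over a new patient's distance distribution, and integrate the dose density into a DVH.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.integrate import trapezoid

    rng = np.random.default_rng(3)
    # toy training data: dose falls off with signed distance outside the target
    dist = rng.uniform(-10, 30, 5000)
    dose = 70 * np.exp(-np.clip(dist, 0, None) / 8) + rng.normal(0, 2, 5000)
    joint = gaussian_kde(np.vstack([dist, dose]))        # p(distance, dose)

    new_dist = rng.uniform(0, 25, 300)       # new patient's voxel distances
    dgrid = np.linspace(0, 80, 161)
    pdose = np.zeros_like(dgrid)             # p(dose) = E_voxels[p(dose | distance)]
    for x in new_dist:
        col = joint(np.vstack([np.full_like(dgrid, x), dgrid]))
        pdose += col / trapezoid(col, dgrid)
    pdose /= len(new_dist)
    dvh = 1 - np.cumsum(pdose) * (dgrid[1] - dgrid[0])   # fraction of volume >= dose
    print(np.round(dvh[::40], 3))
    ```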

  10. Neutrino mass priors for cosmology from random matrices

    DOE PAGES

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; ...

    2018-02-13

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σm_ν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π(Σm_ν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix M_ν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over M_ν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σm_ν that we interpret as a Bayesian prior probability π(Σm_ν). Assuming a basis-invariant probability distribution on M_ν, also known as the anarchy hypothesis, we find that π(Σm_ν) peaks close to the smallest Σm_ν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π(Σm_ν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. In conclusion, we present fitting functions for π(Σm_ν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
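
    An illustrative sketch (assumptions ours) of the eigenvalue-repulsion effect for the Dirac case: draw mass matrices with basis-invariant Gaussian entries, take singular values as masses, and inspect the implied distribution of the mass sum; conditioning on the measured splittings, which produces the paper's prior, is not attempted here.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    sums = np.empty(20_000)
    for t in range(sums.size):
        M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))  # anarchy draw
        sums[t] = np.linalg.svd(M, compute_uv=False).sum()  # masses = singular values
    # singular-value repulsion disfavors degenerate spectra; the paper additionally
    # conditions on the measured splittings to obtain a prior peaking near the
    # minimal allowed sum (~0.06 eV for normal ordering)
    print(np.percentile(sums, [5, 50, 95]))
    ```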

  12. Under conditions of large geometric miss, tumor control probability can be higher for static gantry intensity-modulated radiation therapy compared to volume-modulated arc therapy for prostate cancer.

    PubMed

    Balderson, Michael; Brown, Derek; Johnson, Patricia; Kirkby, Charles

    2016-01-01

    The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic-based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant of geometric miss errors than VMAT. Copyright © 2016 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  13. Independent Component Analysis of Textures

    NASA Technical Reports Server (NTRS)

    Manduchi, Roberto; Portilla, Javier

    2000-01-01

    A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
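
    A minimal sketch (ours, using generic image patches rather than the authors' steerable-filter space) of the ICA step: FastICA finds a filter basis whose responses are as independent as possible, so the product of response marginals best approximates their joint density.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(5)
    texture = rng.standard_normal((128, 128)).cumsum(axis=1)  # toy correlated image
    patches = np.array([texture[i:i + 9, j:j + 9].ravel()     # 9x9 patch vectors
                        for i in range(0, 119, 3) for j in range(0, 119, 3)])
    ica = FastICA(n_components=16, random_state=0, max_iter=1000)
    ica.fit(patches - patches.mean(axis=0))
    filters = ica.components_.reshape(16, 9, 9)  # ICA-derived "filters"
    print(filters.shape)
    ```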

  14. MO-FG-CAMPUS-TeP2-04: Optimizing for a Specified Target Coverage Probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, A

    2016-06-15

    Purpose: The purpose of this work is to develop a method for inverse planning of radiation therapy margins. When using this method, the user specifies a desired target coverage probability and the system optimizes to meet the demand without any explicit specification of margins to handle setup uncertainty. Methods: The method determines which voxels to include in an optimization function promoting target coverage in order to achieve a specified target coverage probability. Voxels are selected in a way that retains the correlation between them: the target is displaced according to the setup errors and the voxels to include are selected as the union of the displaced target regions under the x% best scenarios according to some quality measure. The quality measure could depend on the dose to the considered structure alone or could depend on the dose to multiple structures in order to take into account correlation between structures. Results: A target coverage function was applied to the CTV of a prostate case with prescription 78 Gy and compared to conventional planning using a DVH function on the PTV. Planning was performed to achieve 90% probability of CTV coverage. The plan optimized using the coverage probability function had P(D98 > 77.95 Gy) = 0.97 for the CTV. The PTV plan using a constraint on minimum DVH 78 Gy at 90% had P(D98 > 77.95 Gy) = 0.44 for the CTV. To match the coverage probability optimization, the DVH volume parameter had to be increased to 97%, which resulted in 0.5 Gy higher average dose to the rectum. Conclusion: Optimizing a target coverage probability is an easily used method to find a margin that achieves the desired coverage probability. It can lead to reduced OAR doses at the same coverage probability compared to planning with margins and DVH functions.

  15. Cenozoic Source-to-Sink of the African margin of the Equatorial Atlantic

    NASA Astrophysics Data System (ADS)

    Rouby, Delphine; Chardon, Dominique; Huyghe, Damien; Guillocheau, François; Robin, Cecile; Loparev, Artiom; Ye, Jing; Dall'Asta, Massimo; Grimaud, Jean-Louis

    2016-04-01

    The objective of the Transform Source to Sink Project (TS2P) is to link the dynamics of erosion of the West African Craton to the offshore sedimentary basins of the African margin of the Equatorial Atlantic at geological time scales. This margin, alternating transform and oblique segments from Guinea to Nigeria, shows strong structural variability in margin width, continental geology and relief, drainage networks, and subsidence/accumulation patterns. We analyzed this system by combining onshore geology and geomorphology with offshore sub-surface data. Mapping and regional correlation of dated lateritic paleo-landscape remnants allow us to reconstruct two physiographic configurations of West Africa during the Cenozoic. We corrected these reconstructions for flexural isostasy related to subsequent erosion. These geometries show that the present-day drainage organization stabilized at least 29 Myr ago (probably by 34 Myr), revealing the antiquity of the Senegambia, Niger and Volta catchments toward the Atlantic as well as of the marginal upwarp currently forming a continental divide. The drainage rearrangement that led to this organization was primarily driven by the topographic growth of the Hoggar swell and caused a major stratigraphic turnover along the Equatorial margin of West Africa. Elevation differences between paleo-landscape remnants give access to the spatial and temporal distribution of denudation for three time increments since 45 Myr. From this, we estimate the volumes of sediments and associated lithologies exported by the West African Craton toward different segments of the margin, taking into account the type of eroded bedrock and the successive drainage reorganizations. We compare these data to Cenozoic accumulation histories in the basins and discuss their stratigraphic expression according to the type of margin segment in which they are preserved.

  16. Stochastic Computations in Cortical Microcircuit Models

    PubMed Central

    Maass, Wolfgang

    2013-01-01

    Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving. PMID:24244126

  17. A Monte Carlo study of the impact of the choice of rectum volume definition on estimates of equivalent uniform doses and the volume parameter

    NASA Astrophysics Data System (ADS)

    Kvinnsland, Yngve; Muren, Ludvig Paul; Dahl, Olav

    2004-08-01

    Calculations of normal tissue complication probability (NTCP) values for the rectum are difficult because it is a hollow, non-rigid organ. Finding the true cumulative dose distribution for a number of treatment fractions requires a CT scan before each treatment fraction. This is labour intensive, and several surrogate distributions have therefore been suggested, such as dose wall histograms, dose surface histograms, and histograms for the solid rectum, with and without margins. In this study, a Monte Carlo method is used to investigate the relationships between the cumulative dose distributions based on all treatment fractions and the above-mentioned histograms that are based on one CT scan only, in terms of equivalent uniform dose. Furthermore, the effect of a specific choice of histogram on estimates of the volume parameter of the probit NTCP model was investigated. It was found that the solid rectum and the rectum wall histograms (without margins) gave equivalent uniform doses with an expected value close to the values calculated from the cumulative dose distributions in the rectum wall. With the number of patients available in this study, the standard deviations of the estimates of the volume parameter were large, and it was not possible to decide which volume gave the best estimates of the volume parameter, but there were distinct differences in the mean values obtained.

  18. Combined Recipe for Clinical Target Volume and Planning Target Volume Margins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stroom, Joep, E-mail: joep.stroom@fundacaochampalimaud.pt; Gilhuijs, Kenneth; Vieira, Sandra

    2014-03-01

    Purpose: To develop a combined recipe for clinical target volume (CTV) and planning target volume (PTV) margins. Methods and Materials: A widely accepted PTV margin recipe is M_geo = aΣ_geo + bσ_geo, with Σ_geo and σ_geo standard deviations (SDs) representing systematic and random geometric uncertainties, respectively. On the basis of histopathology data of breast and lung tumors, we suggest describing the distribution of microscopic islets around the gross tumor volume (GTV) by a half-Gaussian with SD Σ_micro, yielding as possible CTV margin recipe: M_micro = ƒ(N_i) × Σ_micro, with N_i the average number of microscopic islets per patient. To determine ƒ(N_i), a computer model was developed that simulated radiation therapy of a spherical GTV with isotropic distribution of microscopic disease in a large group of virtual patients. The minimal margin that yielded D_min < 95% in maximally 10% of patients was calculated for various Σ_micro and N_i. Because Σ_micro is independent of Σ_geo, we propose they should be added quadratically, yielding for a combined GTV-to-PTV margin recipe: M_GTV-PTV = √([aΣ_geo]² + [ƒ(N_i)Σ_micro]²) + bσ_geo. This was validated by the computer model through numerous simultaneous simulations of microscopic and geometric uncertainties. Results: The margin factor ƒ(N_i) in a relevant range of Σ_micro and N_i can be given by: ƒ(N_i) = 1.4 + 0.8log(N_i). Filling in the other factors found in our simulations (a = 2.1 and b = 0.8) yields for the combined recipe: M_GTV-PTV = √((2.1Σ_geo)² + ([1.4 + 0.8log(N_i)] × Σ_micro)²) + 0.8σ_geo. The average margin difference between the simultaneous simulations and the above recipe was 0.2 ± 0.8 mm (1 SD). Calculating M_geo and M_micro separately and adding them linearly overestimated PTVs by on average 5 mm. Margin recipes based on tumor control probability (TCP) instead of D_min criteria yielded similar results. Conclusions: A general recipe for GTV-to-PTV margins is proposed, which shows that CTV and PTV margins should be added in quadrature instead of linearly.
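
    The combined recipe can be transcribed directly; the sketch below does so under one stated assumption (the abstract's "log" is taken as log10) and with illustrative input values.

    ```python
    import numpy as np

    def combined_margin(Sigma_geo, sigma_geo, Sigma_micro, N_i, a=2.1, b=0.8):
        """M = sqrt((a*Sigma_geo)^2 + (f(N_i)*Sigma_micro)^2) + b*sigma_geo,
        with f(N_i) = 1.4 + 0.8*log10(N_i) (log base assumed)."""
        f = 1.4 + 0.8 * np.log10(N_i)
        return np.sqrt((a * Sigma_geo) ** 2 + (f * Sigma_micro) ** 2) + b * sigma_geo

    # e.g. 2 mm systematic and 3 mm random geometric SDs, 1 mm microscopic SD, 5 islets
    print(round(combined_margin(2.0, 3.0, 1.0, 5), 2), "mm")
    ```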

  19. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka; Slosar, Anze

    2017-01-01

    The Lyman-alpha forest has become a powerful cosmological probe of the underlying matter distribution at high redshift. It is a highly non-linear field with much information present beyond the two-point statistics of the power spectrum. The flux probability distribution function (PDF) in particular has been used as a successful probe of small-scale physics. In addition to the cosmological evolution however, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which lead to possible biased estimators. Here we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over the binned PDF as is commonly done. Since the n-th coefficient can be expressed as a linear combination of the first n moments of the field, this allows for the coefficients to be measured in the presence of noise and allows for a clear route towards marginalization over the mean flux. In addition, we use hydrodynamic cosmological simulations to demonstrate that in the presence of noise, a finite number of these coefficients are well measured with a very sharp transition into noise dominance. This compresses the information into a finite small number of well-measured quantities.
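
    A short sketch of the estimator as we read it: for a PDF supported on [-1, 1], the n-th Legendre coefficient is (2n+1)/2 times the expectation of the n-th Legendre polynomial, hence a linear combination of the first n moments; the toy flux samples are assumptions.

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    rng = np.random.default_rng(6)
    flux = rng.beta(2, 5, 100_000) * 2 - 1   # toy "flux" samples mapped to [-1, 1]

    coeffs = []
    for n in range(8):
        basis = np.zeros(n + 1)
        basis[n] = 1.0                        # coefficient vector selecting P_n
        coeffs.append((2 * n + 1) / 2 * legendre.legval(flux, basis).mean())
    print(np.round(coeffs, 4))                # c_0 = 0.5 recovers the normalization
    ```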

  20. A statistical model for analyzing the rotational error of single isocenter for multiple targets technique.

    PubMed

    Chang, Jenghwa

    2017-06-01

    To develop a statistical model that incorporates the treatment uncertainty from the rotational error of the single isocenter for multiple targets technique, and calculates the extra PTV (planning target volume) margin required to compensate for this error. The random vector for modeling the setup (S) error in the three-dimensional (3D) patient coordinate system was assumed to follow a 3D normal distribution with a zero mean and standard deviations of σ_x, σ_y, σ_z. It was further assumed that the rotation of the clinical target volume (CTV) about the isocenter happens randomly and follows a 3D independent normal distribution with a zero mean and a uniform standard deviation of σ_δ. This rotation leads to a rotational random error (R), which also has a 3D independent normal distribution with a zero mean and a uniform standard deviation σ_R equal to the product of σ_δ·π/180 and d_I⇔T, the distance between the isocenter and the CTV. Both random vectors (S and R) were summed, normalized, and transformed to spherical coordinates to derive the Chi distribution with three degrees of freedom for the radial coordinate of S+R. The PTV margin was determined using the critical value of this distribution for a 0.05 significance level, so that 95% of the time the treatment target would be covered by the prescription dose. The additional PTV margin required to compensate for the rotational error was calculated as a function of σ_R and d_I⇔T. The effect of the rotational error is more pronounced for treatments that require high accuracy/precision like stereotactic radiosurgery (SRS) or stereotactic body radiotherapy (SBRT). With a uniform 2-mm PTV margin (or σ_x = σ_y = σ_z = 0.715 mm), a σ_R = 0.328 mm will decrease the CTV coverage probability from 95.0% to 90.9%, or an additional 0.2-mm PTV margin is needed to prevent this loss of coverage. If we choose 0.2 mm as the threshold, any σ_R > 0.328 mm will lead to an extra PTV margin that cannot be ignored, and the maximal σ_δ that can be ignored is 0.45° (or 0.0079 rad) for d_I⇔T = 50 mm or 0.23° (or 0.004 rad) for d_I⇔T = 100 mm. The rotational error cannot be ignored for high-accuracy/-precision treatments like SRS/SBRT, particularly when the distance between the isocenter and target is large. © 2017 American Association of Physicists in Medicine.
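
    A hedged numerical sketch of the margin rule described above: per-axis setup and rotation-induced SDs add in quadrature, the radial error in units of that SD follows a chi distribution with three degrees of freedom, and the 95% margin is its 0.95 quantile; input values echo the abstract's example.

    ```python
    import numpy as np
    from scipy.stats import chi

    sigma_xyz = 0.715         # per-axis setup SD in mm (the 2-mm-margin example)
    sigma_delta_deg = 0.376   # rotational SD in degrees (gives sigma_R = 0.328 mm)
    d_iso_to_ctv = 50.0       # isocenter-to-CTV distance in mm
    sigma_R = np.deg2rad(sigma_delta_deg) * d_iso_to_ctv  # rotation-induced SD

    sigma_tot = np.hypot(sigma_xyz, sigma_R)   # combined per-axis SD (quadrature)
    margin = sigma_tot * chi.ppf(0.95, df=3)   # 95% coverage radius
    print(round(float(margin), 2), "mm vs",
          round(float(sigma_xyz * chi.ppf(0.95, df=3)), 2), "mm without rotation")
    ```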

  1. Deformation from the 1989 Loma Prieta earthquake near the southwest margin of the Santa Clara Valley, California

    USGS Publications Warehouse

    Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.

    2014-01-01

    To obtain additional measurements of any permanent ground deformation that accompanied this damage, we compiled and conducted post-earthquake surveys along two 5-km lines of horizontal control and a 15-km level line. Measurements of horizontal distortion indicate approximately 0.1 m of shortening in a NE-SW direction across the valley margin, similar to the amount measured in the channel lining. Evaluation of precise leveling by the National Geodetic Survey showed a downwarp, with an amplitude of >0.1 m over a span of >12 km, that resembled regional geodetic models of coseismic deformation. Although the leveling indicates broad, regional warping, abrupt discontinuities characteristic of faulting mark both the broad-scale distribution of damage and the local deformation of the channel lining. Reverse movement, largely along preexisting faults and probably enhanced significantly by warping, combined with strong ground shaking, produced the documented coseismic ground deformation.

  2. Sr-Nd-Pb isotope systematics of the Permian volcanic rocks in the northern margin of the Alxa Block (the Shalazhashan Belt) and comparisons with the nearby regions: Implications for a Permian rift setting?

    NASA Astrophysics Data System (ADS)

    Shi, Guanzhong; Wang, Hua; Liu, Entao; Huang, Chuanyan; Zhao, Jianxin; Song, Guangzeng; Liang, Chao

    2018-04-01

    The petrogenesis of the Permian magmatic rocks in the Shalazhashan Belt helps us understand the tectonic evolution of the Central Asian Orogenic Belt (CAOB) along the northern margin of the Alxa Block. The Permian volcanic rocks in the Shalazhashan Belt include basalts, trachyandesites and trachydacites. Our study shows that two basalt samples have negative εNd(t) values (-5.4 to -1.5) and higher radiogenic Pb values, which are related to the ancient subcontinental lithospheric mantle. One basalt sample has a positive εNd(t) value (+10), representing mafic juvenile crust, and is derived from depleted asthenosphere. The trachyandesites are dated at 284 ± 3 Ma with εNd(t) = +2.7 to +8.0 and ISr = 0.7052 to 0.7057, and they were generated by different degrees of mixing between mafic magmas and crustal melts. The trachydacites have high εNd(t) values and slightly higher ISr contents, suggesting derivation from juvenile sources with crustal contamination. Isotopic comparisons of the Permian magmatic rocks of the Shalazhashan Belt, the Nuru-Langshan Belt (representing the northern margin of the Alxa Block), the Solonker Belt (Mandula area), and the northern margin of the North China Craton (Bayan Obo area) indicate that the radiogenic isotopic compositions become increasingly evolved from the south (the northern margins of the Alxa Block and the North China Craton) to the north (the Shalazhashan Belt and the Solonker Belt). Three end-member components are involved in generating the Permian magmatic rocks: the ancient subcontinental lithospheric mantle; the mafic juvenile crust or newly underplated mafic rocks originating from depleted asthenosphere; and the ancient crust. The rocks correlated with the mafic juvenile crust or newly underplated mafic rocks are predominantly distributed along the Shalazhashan Belt and the Solonker Belt, whereas the rocks derived from ancient, enriched subcontinental lithospheric mantle are mainly distributed along the northern margins of the Alxa Block and the North China Craton. The magmatic rock types, isotopic features, and their temporal and spatial distributions suggest an extensional regime, probably related to rifting.

  3. Maps showing gas-hydrate distribution off the east coast of the United States

    USGS Publications Warehouse

    Dillon, William P.; Fehlhaber, Kristen L.; Coleman, Dwight F.; Lee, Myung W.; Hutchinson, Deborah R.

    1995-01-01

    These maps present the inferred distribution of natural-gas hydrate within the sediments of the eastern United States continental margin (Exclusive Economic Zone) in the offshore region from Georgia to New Jersey (fig. 1). The maps, which were created on the basis of seismic interpretations, represent the first attempt to map volume estimates for gas hydrate. Gas hydrate forms a large reservoir for methane in oceanic sediments. Therefore it potentially may represent a future source of energy and it may influence climate change because methane is a very effective greenhouse gas. Hydrate breakdown probably is a controlling factor for sea-floor landslides, and its presence has significant effect on the acoustic velocity of sea-floor sediments.

  4. Statistical properties of short-selling and margin-trading activities and their impacts on returns in the Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Gao, Yan; Gao, Yao

    2015-11-01

    We investigate the collective behaviors of short-selling and margin-trading among Chinese stocks and their impacts on the co-movements of stock returns by cross-correlation and partial correlation analyses. We find that the collective behaviors of margin-trading are largely attributed to the index cohesive force, while those of short-selling are mainly due to direct interactions between stocks. Interestingly, the dominant role the finance industry plays in the collective behaviors of short-selling could make it more important in shaping the co-movement structure of stock returns by strengthening its relationship with the market index. By examining the volume-return and volume-volatility relationships, we find that investors in the two leverage activities respond positively to individual stock volatility first; then, at the return level, margin-buyers show trend-following properties, while short-sellers are probably informed traders who trade on information specific to particular firms. However, the return predictability of the two leverage trading activities and their impacts on stock volatility are not significant. Moreover, both tails of the cumulative distributions of the two leverage trading activities are found to follow the stretched exponential law better than the power law.
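
    A small sketch (ours, on synthetic series) of the partial-correlation step: correlating two stock-level series after regressing out the market index separates direct interactions from the index cohesive force.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    index = rng.normal(size=1000)                      # market index (toy)
    a = 0.7 * index + rng.normal(size=1000)            # stock-level series A
    b = 0.6 * index + 0.3 * a + rng.normal(size=1000)  # series B, direct link to A

    def residual(x, z):            # regress x on z and keep the residual
        return x - (np.dot(x, z) / np.dot(z, z)) * z

    raw = np.corrcoef(a, b)[0, 1]
    partial = np.corrcoef(residual(a, index), residual(b, index))[0, 1]
    print("raw:", round(raw, 3), "| index removed:", round(partial, 3))
    ```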

  5. N-mixture models for estimating population size from spatially replicated counts

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    Spatial replication is a common theme in count surveys of animals. Such surveys often generate sparse count data from which it is difficult to estimate population size while formally accounting for detection probability. In this article, I describe a class of models (N-mixture models) which allow for estimation of population size from such data. The key idea is to view site-specific population sizes, N, as independent random variables distributed according to some mixing distribution (e.g., Poisson). Prior parameters are estimated from the marginal likelihood of the data, having integrated over the prior distribution for N. Carroll and Lombard (1985, Journal of the American Statistical Association 80, 423-426) proposed a class of estimators based on mixing over a prior distribution for detection probability. Their estimator can be applied in limited settings, but is sensitive to prior parameter values that are fixed a priori. Spatial replication provides additional information regarding the parameters of the prior distribution on N that is exploited by the N-mixture models and which leads to reasonable estimates of abundance from sparse data. A simulation study demonstrates superior operating characteristics (bias, confidence interval coverage) of the N-mixture estimator compared to the Carroll and Lombard estimator. Both estimators are applied to point count data on six species of birds, illustrating the sensitivity to the choice of prior on p and substantially different estimates of abundance as a consequence.
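
    A compact sketch of the N-mixture marginal likelihood described above, with a Poisson mixing distribution and binomial detection, the latent abundances summed out up to a truncation bound, and synthetic replicated counts standing in for survey data.

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(8)
    lam_true, p_true, sites, visits = 4.0, 0.5, 150, 3
    N = rng.poisson(lam_true, sites)                       # latent abundances
    y = rng.binomial(N[:, None], p_true, (sites, visits))  # replicated counts

    def nll(theta, y, Nmax=60):
        lam, p = np.exp(theta[0]), 1.0 / (1.0 + np.exp(-theta[1]))
        Ns = np.arange(Nmax + 1)
        prior = stats.poisson.pmf(Ns, lam)                 # mixing distribution
        like = np.ones((y.shape[0], Nmax + 1))
        for j in range(y.shape[1]):                        # detection model
            like *= stats.binom.pmf(y[:, [j]], Ns[None, :], p)
        return -np.sum(np.log(like @ prior + 1e-300))      # marginal likelihood

    res = optimize.minimize(nll, x0=[np.log(2.0), 0.0], args=(y,))
    print("lambda, p =", np.exp(res.x[0]), 1.0 / (1.0 + np.exp(-res.x[1])))
    ```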

  6. Probabilistic Design of a Mars Sample Return Earth Entry Vehicle Thermal Protection System

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Mitcheltree, Robert A.

    2002-01-01

    The driving requirement for design of a Mars Sample Return mission is to assure containment of the returned samples. Designing to, and demonstrating compliance with, such a requirement requires physics based tools that establish the relationship between engineer's sizing margins and probabilities of failure. The traditional method of determining margins on ablative thermal protection systems, while conservative, provides little insight into the actual probability of an over-temperature during flight. The objective of this paper is to describe a new methodology for establishing margins on sizing the thermal protection system (TPS). Results of this Monte Carlo approach are compared with traditional methods.

  7. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of the mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of the mean, variance, and ACFs of the continuous and discrete components, respectively. To achieve full consistency between variables at the finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to rainfall time series from the real world is shown as a proof of concept.
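
    A toy sketch (ours, deliberately cruder than the paper's method) of the mixed discrete-continuous idea: simulate fine-scale intermittency with a persistent wet/dry chain, multiply by continuous depths, and proportionally rescale each block to match the coarse total; note that the paper's adjustment procedure is designed not to distort the stochastic structure, which a plain rescaling only approximates.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    def disaggregate(coarse_totals, k=24, p_dd=0.8, p_ww=0.6):
        """Split each coarse total into k fine values: persistent wet/dry chain
        times gamma depths, then rescaled to match the coarse total exactly."""
        out = []
        for total in coarse_totals:
            wet = np.zeros(k, dtype=bool)
            wet[0] = rng.random() < 0.5
            for t in range(1, k):                      # occurrence (discrete) part
                wet[t] = rng.random() < (p_ww if wet[t - 1] else 1 - p_dd)
            depths = wet * rng.gamma(0.7, 2.0, k)      # depth (continuous) part
            if total > 0 and depths.sum() == 0:        # guarantee one wet interval
                depths[rng.integers(k)] = 1.0
            scale = total / depths.sum() if total > 0 else 0.0
            out.append(depths * scale)
        return np.concatenate(out)

    fine = disaggregate([12.0, 0.0, 3.5])
    print(fine.reshape(3, -1).sum(axis=1))             # recovers the coarse totals
    ```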

  8. The Geographic Distribution of a Tropical Montane Bird Is Limited by a Tree: Acorn Woodpeckers (Melanerpes formicivorus) and Colombian Oaks (Quercus humboldtii) in the Northern Andes

    PubMed Central

    2015-01-01

    Species distributions are limited by a complex array of abiotic and biotic factors. In general, abiotic (climatic) factors are thought to explain species’ broad geographic distributions, while biotic factors regulate species’ abundance patterns at local scales. We used species distribution models to test the hypothesis that a biotic interaction with a tree, the Colombian oak (Quercus humboldtii), limits the broad-scale distribution of the Acorn Woodpecker (Melanerpes formicivorus) in the Northern Andes of South America. North American populations of Acorn Woodpeckers consume acorns from Quercus oaks and are limited by the presence of Quercus oaks. However, Acorn Woodpeckers in the Northern Andes seldom consume Colombian oak acorns (though may regularly drink sap from oak trees) and have been observed at sites without Colombian oaks, the sole species of Quercus found in South America. We found that climate-only models overpredicted Acorn Woodpecker distribution, suggesting that suitable abiotic conditions (e.g. in northern Ecuador) exist beyond the woodpecker’s southern range margin. In contrast, models that incorporate Colombian oak presence outperformed climate-only models and more accurately predicted the location of the Acorn Woodpecker’s southern range margin in southern Colombia. These findings support the hypothesis that a biotic interaction with Colombian oaks sets Acorn Woodpecker’s broad-scale geographic limit in South America, probably because Acorn Woodpeckers rely on Colombian oaks as a food resource (possibly for the oak’s sap rather than for acorns). Although empirical examples of particular plants limiting tropical birds’ distributions are scarce, we predict that similar biotic interactions may play an important role in structuring the geographic distributions of many species of tropical montane birds with specialized foraging behavior. PMID:26083262

  9. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while in the antapex RIP is slightly larger than average. We present here preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  10. Multi-beam transmitter geometries for free-space optical communications

    NASA Astrophysics Data System (ADS)

    Tellez, Jason A.; Schmidt, Jason D.

    2010-02-01

    Free-space optical communications systems provide the opportunity to take advantage of higher data transfer rates and lower probability of intercept compared to radio-frequency communications. However, propagation through atmospheric turbulence, such as for airborne laser communication over long paths, results in intensity variations at the receiver and a corresponding degradation in bit error rate (BER) performance. Previous literature has shown that two transmitters, when separated sufficiently, can effectively average out the intensity-varying effects of the atmospheric turbulence at the receiver. This research explores the impact of adding more transmitters and the marginal reduction in the probability of signal fades, while minimizing the overall transmitter footprint, an important design factor when considering an airborne communications system. Analytical results for the cumulative distribution function are obtained for the tilt-only case, while wave-optics simulations are used to simulate the effects of scintillation. These models show that the probability of signal fade is reduced as the number of transmitters is increased.
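
    A Monte Carlo sketch under assumptions of ours (independent lognormal intensities per beam, an arbitrary turbulence strength and fade threshold) showing the qualitative effect: averaging over more transmitters lowers the probability of a deep fade.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    sigma = 0.8            # log-intensity turbulence strength (assumed)
    threshold = 0.3        # fade: combined intensity below 30% of its mean

    for n_tx in (1, 2, 4, 8):
        I = rng.lognormal(-sigma**2 / 2, sigma, (200_000, n_tx)).mean(axis=1)
        print(n_tx, "beam(s): P(fade) =", float(np.mean(I < threshold)))
    ```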

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, H; Chen, Z; Nath, R

    Purpose: kV fluoroscopic imaging combined with MV treatment beam imaging has been investigated for intrafractional motion monitoring and correction. It is, however, subject to additional kV imaging dose to normal tissue. To balance tracking accuracy and imaging dose, we previously proposed an adaptive imaging strategy to dynamically decide future imaging type and moments based on motion tracking uncertainty. kV imaging may be used continuously for maximal accuracy or only when the position uncertainty (probability of exceeding a threshold) is high if a preset imaging dose limit is considered. In this work, we propose more accurate methods to estimate tracking uncertainty by analyzing acquired data in real-time. Methods: We simulated the motion tracking process based on a previously developed imaging framework (MV + initial seconds of kV imaging) using real-time breathing data from 42 patients. Motion tracking errors for each time point were collected together with the time point's corresponding features, such as tumor motion speed and the 2D tracking error of previous time points. We tested three methods for error uncertainty estimation based on the features: conditional probability distribution, logistic regression modeling, and support vector machine (SVM) classification to detect errors exceeding a threshold. Results: For the conditional probability distribution, polynomial regressions on three features (previous tracking error, prediction quality, and cosine of the angle between the trajectory and the treatment beam) showed strong correlation with the variation (uncertainty) of the mean 3D tracking error and its standard deviation: R-square = 0.94 and 0.90, respectively. The logistic regression and SVM classification successfully identified about 95% of tracking errors exceeding the 2.5 mm threshold. Conclusion: The proposed methods can reliably estimate the motion tracking uncertainty in real-time, which can be used to guide adaptive additional imaging to confirm the tumor is within the margin or to initiate motion compensation if it is out of the margin.
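
    A hedged sketch of the classification step described in the Methods, with synthetic stand-ins for the three named features and an assumed error model: a logistic regression and an SVM are trained to flag time points whose 3D tracking error exceeds the 2.5 mm threshold.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(11)
    n = 5000
    prev_err = rng.gamma(2.0, 0.6, n)        # previous tracking error (mm), assumed
    pred_qual = rng.random(n)                # prediction quality score, assumed
    cos_angle = rng.uniform(-1, 1, n)        # trajectory/beam angle cosine, assumed
    err3d = (0.8 * prev_err + 1.5 * (1 - pred_qual)
             + 0.5 * np.abs(cos_angle) + rng.normal(0, 0.4, n))  # synthetic 3D error

    X = np.column_stack([prev_err, pred_qual, cos_angle])
    y = (err3d > 2.5).astype(int)            # exceeds the 2.5 mm threshold?

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    for clf in (LogisticRegression(max_iter=1000), SVC()):
        print(type(clf).__name__, "accuracy:", clf.fit(Xtr, ytr).score(Xte, yte))
    ```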

  12. Potential benefits of dosimetric VMAT tracking verified with 3D film measurements.

    PubMed

    Crijns, Wouter; Defraene, Gilles; Van Herck, Hans; Depuydt, Tom; Haustermans, Karin; Maes, Frederik; Van den Heuvel, Frank

    2016-05-01

    To evaluate three different plan adaptation strategies using 3D film-stack dose measurements of both focal boost and hypofractionated prostate VMAT treatments. The adaptation strategies (a couch shift, geometric tracking, and dosimetric tracking) were applied for three realistic intrafraction prostate motions. A focal boost (35 × 2.2 and 35 × 2.7 Gy) and a hypofractionated (5 × 7.25 Gy) prostate VMAT plan were created for a heterogeneous phantom that allows for internal prostate motion. For these plans, geometric tracking and dosimetric tracking were evaluated by ionization chamber (IC) point dose measurements (0D) and measurements using a stack of EBT3 films (3D). The geometric tracking applied translations, rotations, and scaling of the MLC aperture in response to realistic prostate motions. The dosimetric tracking additionally corrected the monitor units to resolve variations due to differences in depth, tissue heterogeneity, and MLC aperture. The tracking was based on the positions of four fiducial points only. The film measurements were compared to the gold standard (i.e., the IC measurements) and the planned dose distribution. Additionally, the 3D measurements were converted to dose volume histograms, tumor control probability, and normal tissue complication probability parameters (DVH/TCP/NTCP) as a direct estimate of the clinical relevance of the proposed tracking. Compared to the planned dose distribution, measurements without prostate motion and tracking already showed a reduced homogeneity of the dose distribution. Adding prostate motion further blurs the DVHs for all treatment approaches. The clinical practice (no tracking) delivered the dose distribution inside the PTV but off target (CTV), resulting in boost dose errors of up to 10%. The geometric and dosimetric tracking corrected the dose distribution's position. Moreover, the dosimetric tracking could achieve the planned boost DVH, but not the DVH of the more homogeneously irradiated prostate. A drawback of both the geometric and dosimetric tracking was a reduced MLC blocking caused by the rotational component of the MLC aperture corrections. Because of the CTV-to-PTV margins used and the high doses in the considered fractionation schemes, the TCP differed by less than 0.02 from the planned value for all targets and all correction methods. The rectal NTCP constraints, however, could not be met using any of these methods. The geometric and dosimetric tracking use only limited input, but they deposit the dose distribution with higher geometric accuracy than the clinical practice, which showed boost dose errors of up to 10%. The increased accuracy has a modest impact [Δ(NT)CP < 0.02] because of the applied margins and the high dose levels used. To allow further margin reduction, tracking methods are vital. The proposed methodology could be further improved by implementing a rotational correction using collimator rotations.

  13. Local response of a glacier to annual filling and drainage of an ice-marginal lake

    USGS Publications Warehouse

    Walder, J.S.; Trabant, D.C.; Cunico, M.; Fountain, A.G.; Anderson, S.P.; Anderson, R. Scott; Malm, A.

    2006-01-01

    Ice-marginal Hidden Creek Lake, Alaska, USA, outbursts annually over the course of 2-3 days. As the lake fills, survey targets on the surface of the 'ice dam' (the glacier adjacent to the lake) move obliquely to the ice margin and rise substantially. As the lake drains, ice motion speeds up, becomes nearly perpendicular to the face of the ice dam, and the ice surface drops. Vertical movement of the ice dam probably reflects growth and decay of a wedge of water beneath the ice dam, in line with established ideas about jökulhlaup mechanics. However, the distribution of vertical ice movement, with a narrow (50-100 m wide) zone where the uplift rate decreases by 90%, cannot be explained by invoking flexure of the ice dam in a fashion analogous to tidal flexure of a floating glacier tongue or ice shelf. Rather, the zone of large uplift-rate gradient is a fault zone: ice-dam deformation is dominated by movement along high-angle faults that cut the ice dam through its entire thickness, with the sense of fault slip reversing as the lake drains. Survey targets spanning the zone of steep uplift gradient move relative to one another in a nearly reversible fashion as the lake fills and drains. The horizontal strain rate also undergoes a reversal across this zone, being compressional as the lake fills, but extensional as the lake drains. Frictional resistance to fault-block motion probably accounts for the fact that lake level falls measurably before the onset of accelerated horizontal motion and vertical downdrop. As the overall fault pattern is the same from year to year, even though ice is lost by calving, the faults must be regularly regenerated, probably by linkage of surface and bottom crevasses as ice is advected toward the lake basin.

  14. Statistics of cosmic density profiles from perturbation theory

    NASA Astrophysics Data System (ADS)

    Bernardeau, Francis; Pichon, Christophe; Codis, Sandrine

    2014-11-01

    The joint probability distribution function (PDF) of the density within multiple concentric spherical cells is considered. It is shown how its cumulant generating function can be obtained at tree order in perturbation theory as the Legendre transform of a function directly built in terms of the initial moments. In the context of the upcoming generation of large-scale structure surveys, it is conjectured that this result correctly models such a function for finite values of the variance. Detailed consequences of this assumption are explored. In particular, the corresponding one-cell density probability distribution at finite variance is computed for realistic power spectra, taking into account its scale variation. It is found to be in agreement with Λ-cold dark matter simulations at the few percent level for a wide range of density values and parameters. Related explicit analytic expansions at the low and high density tails are given. The conditional (at fixed density) and marginal probability of the slope—the density difference between adjacent cells—and its fluctuations are also computed from the two-cell joint PDF; they also compare very well to simulations. It is emphasized that this could prove useful when studying the statistical properties of voids, as it can serve as a statistical indicator to test gravity models and/or probe key cosmological parameters.
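    A compact sketch of the one-cell version of this construction, in generic notation consistent with the abstract rather than the paper's full multi-cell expressions: at tree order the cumulant generating function is the Legendre transform of a function Ψ built from the initial moments,

```latex
\varphi(\lambda) \;=\; \lambda\,\rho - \Psi(\rho),
\qquad \text{with } \frac{\partial \Psi}{\partial \rho} = \lambda ,
```

    and the density PDF then follows from the generating function by an inverse Laplace transform, P(ρ) = ∫ dλ/(2πi) exp[φ(λ) − λρ].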

  15. Maximum magnitude (Mmax) in the central and eastern United States for the 2014 U.S. Geological Survey Hazard Model

    USGS Publications Warehouse

    Wheeler, Russell L.

    2016-01-01

    Probabilistic seismic‐hazard assessment (PSHA) requires an estimate of Mmax, the moment magnitude M of the largest earthquake that could occur within a specified area. Sparse seismicity hinders Mmax estimation in the central and eastern United States (CEUS) and tectonically similar regions worldwide (stable continental regions [SCRs]). A new global catalog of moderate‐to‐large SCR earthquakes is analyzed with minimal assumptions about enigmatic geologic controls on SCR Mmax. An earlier observation that SCR earthquakes of M 7.0 and larger occur in young (250–23 Ma) passive continental margins and associated rifts but not in cratons is not strongly supported by the new catalog. SCR earthquakes of M 7.5 and larger are slightly more numerous and reach slightly higher M in young passive margins and rifts than in cratons. However, overall histograms of M from young margins and rifts and from cratons are statistically indistinguishable. This conclusion is robust under uncertainties in M, the locations of SCR boundaries, and which of two available global SCR catalogs is used. The conclusion stems largely from recent findings that (1) large southeast Asian earthquakes once thought to be SCR were in actively deforming crust and (2) long escarpments in cratonic Australia were formed by prehistoric faulting. The 2014 seismic‐hazard model of the U.S. Geological Survey represents CEUS Mmax as four‐point probability distributions. The distributions have weighted averages of M 7.0 in cratons and M 7.4 in passive margins and rifts. These weighted averages are consistent with Mmax estimates of other SCR PSHAs of the CEUS, southeastern Canada, Australia, and India.

  16. Margin selection to compensate for loss of target dose coverage due to target motion during external‐beam radiation therapy of the lung

    PubMed Central

    Osei, Ernest; Barnett, Rob

    2015-01-01

    The aim of this study is to provide guidelines for the selection of external‐beam radiation therapy target margins to compensate for target motion in the lung during treatment planning. A convolution model was employed to predict the effect of target motion on the delivered dose distribution. The accuracy of the model was confirmed with radiochromic film measurements in both static and dynamic phantom modes. 502 unique patient breathing traces were recorded and used to simulate the effect of target motion on a dose distribution. A 1D probability density function (PDF) representing the position of the target throughout the breathing cycle was generated from each breathing trace obtained during 4D CT. Changes in the target D95 (the minimum dose received by 95% of the treatment target) due to target motion were analyzed and shown to correlate with the standard deviation of the PDF. Furthermore, the amount of target D95 recovered per millimeter of increased field width was also shown to correlate with the standard deviation of the PDF. The sensitivity of changes in dose coverage with respect to target size was also determined. Margin selection recommendations that can be used to compensate for loss of target D95 were generated based on the simulation results. These results are discussed in the context of clinical plans. We conclude that, for PDF standard deviations less than 0.4 cm with target sizes greater than 5 cm, little or no additional margins are required. Targets which are smaller than 5 cm with PDF standard deviations larger than 0.4 cm are most susceptible to loss of coverage. The largest additional required margin in this study was determined to be 8 mm. PACS numbers: 87.53.Bn, 87.53.Kn, 87.55.D‐, 87.55.Gh
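    A hedged sketch of the convolution idea described above: a static 1D dose profile is blurred by a Gaussian motion PDF, and D95 over the target is compared before and after. Field width, target size, and the 0.4 cm PDF standard deviation are illustrative numbers, not the paper's values.

```python
# Illustrative numbers throughout; the convolution blurs a static profile
# with the motion PDF, and D95 over the target is compared before/after.
import numpy as np

dx = 0.05                                    # cm per sample
x = np.arange(-10.0, 10.0, dx)
field_half = 3.0                             # cm, half field width (assumed)
dose = (np.abs(x) <= field_half).astype(float)   # idealized static profile

sigma = 0.4                                  # cm, motion PDF std (example)
pdf = np.exp(-0.5 * (x / sigma) ** 2)
pdf /= pdf.sum()                             # discrete 1D motion PDF

blurred = np.convolve(dose, pdf, mode="same")    # dose seen by moving target

in_target = np.abs(x) <= 2.5                 # 5 cm target (assumed)
d95_static = np.percentile(dose[in_target], 5)
d95_moving = np.percentile(blurred[in_target], 5)
print(f"D95 static {d95_static:.3f} -> with motion {d95_moving:.3f}")
```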

  17. Generation of Stationary Non-Gaussian Time Histories with a Specified Cross-spectral Density

    DOE PAGES

    Smallwood, David O.

    1997-01-01

    The paper reviews several methods for the generation of stationary realizations of sampled time histories with non-Gaussian distributions and introduces a new method which can be used to control the cross-spectral density matrix and the probability density functions (pdfs) of the multiple input problem. Discussed first are two methods for the specialized case of matching the auto (power) spectrum, the skewness, and kurtosis using generalized shot noise and using polynomial functions. It is then shown that the skewness and kurtosis can also be controlled by the phase of a complex frequency domain description of the random process. The general case of matching a target probability density function using a zero memory nonlinear (ZMNL) function is then covered. Next, methods for generating vectors of random variables with a specified covariance matrix for a class of spherically invariant random vectors (SIRV) are discussed. Finally, the general case of matching the cross-spectral density matrix of a vector of inputs with non-Gaussian marginal distributions is presented.
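    A minimal sketch of the ZMNL step for the single-input case: generate a Gaussian series with a prescribed spectrum, then map it through the target inverse CDF (here a Weibull, chosen arbitrarily). The output spectrum is only approximately preserved, which is what the paper's fuller treatment addresses.

```python
# Gaussian series with a prescribed spectrum, pushed through a ZMNL map
# (inverse-CDF transform) to impose a Weibull marginal. Approximate only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 2 ** 14
freq = np.fft.rfftfreq(n, d=1.0)
psd = 1.0 / (1.0 + (freq / 0.05) ** 2)       # example low-pass spectrum
phase = rng.uniform(0.0, 2.0 * np.pi, len(freq))
phase[0] = 0.0                               # keep the DC term real
g = np.fft.irfft(np.sqrt(psd) * np.exp(1j * phase), n)
g = (g - g.mean()) / g.std()                 # standardized Gaussian series

u = stats.norm.cdf(g)                        # uniform scores
xw = stats.weibull_min.ppf(u, c=1.5)         # target non-Gaussian marginal
print("skewness:", stats.skew(xw), " kurtosis:", stats.kurtosis(xw))
```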

  18. Predictions from star formation in the multiverse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bousso, Raphael; Leichenauer, Stefan

    2010-03-15

    We compute trivariate probability distributions in the landscape, scanning simultaneously over the cosmological constant, the primordial density contrast, and spatial curvature. We consider two different measures for regulating the divergences of eternal inflation, and three different models for observers. In one model, observers are assumed to arise in proportion to the entropy produced by stars; in the others, they arise at a fixed time (5 or 10 × 10^9 years) after star formation. The star formation rate, which underlies all our observer models, depends sensitively on the three scanning parameters. We employ a recently developed model of star formation in the multiverse, a considerable refinement over previous treatments of the astrophysical and cosmological properties of different pocket universes. For each combination of observer model and measure, we display all single and bivariate probability distributions, both with the remaining parameter(s) held fixed and marginalized. Our results depend only weakly on the observer model but more strongly on the measure. Using the causal diamond measure, the observed parameter values (or bounds) lie within the central 2σ of nearly all probability distributions we compute, and always within 3σ. This success is encouraging and rather nontrivial, considering the large size and dimension of the parameter space. The causal patch measure gives similar results as long as curvature is negligible. If curvature dominates, the causal patch leads to a novel runaway: it prefers a negative value of the cosmological constant, with the smallest magnitude available in the landscape.

  19. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Producing reliable and accurate hydrologic ensemble forecasts is subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models produce ensemble forecasts for various temporal ranges, but raw products from NWP models are known to be biased in mean and spread. There is therefore a need for methods that can generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, one of the assumptions of the current method is fitting a Gaussian distribution to the marginal distributions of the observed and modeled climate variable. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions express the multivariate joint distribution of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin in the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) with 0.5 × 0.5 deg spatial resolution to reproduce the observations. The verification is conducted on a different period, and the skill of the procedure is compared with the Ensemble Pre-Processor approach currently used by the National Weather Service River Forecast Centers in the USA.
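    A minimal sketch of the conditional-copula idea under simplifying assumptions: a Gaussian copula with gamma marginals, with the ensemble drawn from the conditional distribution given one single-value forecast. All parameter values are made up for illustration.

```python
# Gaussian copula + gamma marginals; ensemble from the conditional
# distribution given a single-value forecast. Parameters are illustrative.
import numpy as np
from scipy import stats

rho = 0.7                                   # assumed copula correlation
obs_marg = stats.gamma(a=2.0, scale=30.0)   # observed precip marginal (mm)
fc_marg = stats.gamma(a=2.2, scale=28.0)    # forecast marginal (mm)

forecast = 85.0                             # single-value forecast, mm
z_f = stats.norm.ppf(fc_marg.cdf(forecast)) # forecast on the Gaussian scale

# Conditional Gaussian copula: z_obs | z_f ~ N(rho * z_f, 1 - rho^2)
rng = np.random.default_rng(3)
z_obs = rng.normal(rho * z_f, np.sqrt(1 - rho ** 2), size=50)
ensemble = obs_marg.ppf(stats.norm.cdf(z_obs))
print("ensemble mean %.1f mm, spread %.1f mm"
      % (ensemble.mean(), ensemble.std()))
```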

  20. How extreme is extreme hourly precipitation?

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos

    2016-04-01

    The importance of accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial in hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behavior of extreme events. In general, and loosely speaking, tails can be categorized in two families: the subexponential and the hyperexponential family, with the first generating more intense and more frequent extremes compared to the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe better the observed hourly rainfall extremes in comparison to lighter tails. Traditional representations of the marginal distribution of hourly rainfall may significantly deviate from observed behaviours of extremes, with direct implications on hydroclimatic variables modelling and engineering design.

  1. Climate Change Impact Assessment in Pacific North West Using Copula based Coupling of Temperature and Precipitation variables

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Rana, A.; Moradkhani, H.

    2014-12-01

    The multi downscaled-scenario products allow us to better assess the uncertainty of the changes/variations of precipitation and temperature in the current and future periods. Joint probability distribution functions (PDFs) of the two climatic variables can help us better understand their interdependence and thus assess the future with confidence. Using the joint distribution of temperature and precipitation is also of significant importance in hydrological applications and climate change studies. In the present study, we have used a multi-modelled, statistically downscaled-scenario ensemble of precipitation and temperature variables from 2 different statistically downscaled climate datasets. The datasets used are 10 Global Climate Model (GCM) downscaled products from the CMIP5 daily dataset, namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, leading to 2 ensemble time series from 20 GCM products. Thereafter, the ensemble PDFs of both precipitation and temperature are evaluated for summer, winter, and yearly periods for all 10 sub-basins across the Columbia River Basin (CRB). Eventually, a copula is applied to establish the joint distribution of the two variables, enabling users to model the joint behavior of the variables with any level of correlation and dependency. Moreover, the probabilistic distribution helps remove the limitations on marginal distributions of the variables in question. The joint distribution is then used to estimate the change trends of the joint precipitation and temperature in the current and future periods, along with estimation of the probabilities of the given change. Results indicate varied change trends of the joint distribution at summer, winter, and yearly time scales in all 10 sub-basins. Probabilities of changes, as estimated from the joint precipitation and temperature, provide useful insights for hydrological and climate change predictions.

  2. Robust optimization based upon statistical theory.

    PubMed

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose distributions that are robust against interfraction and intrafraction motion alike, effectively removing the need for indiscriminate safety margins.
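    A toy sketch of the outcome-distribution idea: evaluate a stand-in coverage metric on many sampled geometry instances, then score each candidate plan by the mean and standard deviation of the resulting distribution rather than a single nominal value. The metric and the mean-minus-2-sigma score are illustrative assumptions, not the authors' EUD machinery.

```python
# Toy outcome-distribution scoring: metric evaluated on sampled geometries,
# plan scored by the distribution's mean and spread. All numbers illustrative.
import numpy as np

rng = np.random.default_rng(4)

def metric_for_shift(margin, shift):
    """Stand-in coverage metric: degrades once motion exceeds the margin."""
    return 1.0 - np.clip(np.abs(shift) - margin, 0.0, None)

shifts = rng.normal(0.0, 0.5, size=2000)        # cm, sampled geometries

for margin in (0.2, 0.5, 1.0):                  # candidate plan parameter
    outcome = metric_for_shift(margin, shifts)  # the outcome distribution
    score = outcome.mean() - 2.0 * outcome.std()  # mean-variance trade-off
    print(f"margin {margin:.1f} cm: mean {outcome.mean():.3f}, "
          f"sd {outcome.std():.3f}, score {score:.3f}")
```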

  3. Point and Condensed Hα Sources in the Interior of M33

    NASA Astrophysics Data System (ADS)

    Moody, J. Ward; Hintz, Eric G.; Roming, Peter; Joner, Michael D.; Bucklein, Brian

    2017-01-01

    A variety of interesting objects such as Wolf-Rayet stars, tight OB associations, planetary nebulae, X-ray binaries, etc. can be discovered as point or condensed sources in Hα surveys. How these objects are distributed through a galaxy sheds light on the galaxy's star formation rate and history, mass distribution, and dynamics. The nearby galaxy M33 is an excellent place to study the distribution of Hα-bright point sources in a flocculent spiral galaxy. We have reprocessed an archived WIYN continuum-subtracted Hα image of the inner 6.5' of M33 and, employing both eye and machine searches, have tabulated sources with a flux greater than 1 × 10^-15 erg cm^-2 s^-1. We have identified 152 unresolved point sources and 122 marginally resolved condensed sources, 38 of which have not been previously cataloged. We present a map of these sources and discuss their probable identifications.

  4. A Directed Acyclic Graph-Large Margin Distribution Machine Model for Music Symbol Classification

    PubMed Central

    Wen, Cuihong; Zhang, Jing; Rebelo, Ana; Cheng, Fanyong

    2016-01-01

    Optical Music Recognition (OMR) has received increasing attention in recent years. In this paper, we propose a classifier based on a new method named Directed Acyclic Graph-Large margin Distribution Machine (DAG-LDM). The DAG-LDM is an improvement of the Large margin Distribution Machine (LDM), which is a binary classifier that optimizes the margin distribution by maximizing the margin mean and minimizing the margin variance simultaneously. We modify the LDM to the DAG-LDM to solve the multi-class music symbol classification problem. Tests are conducted on more than 10000 music symbol images, obtained from handwritten and printed images of music scores. The proposed method provides superior classification capability and achieves much higher classification accuracy than the state-of-the-art algorithms such as Support Vector Machines (SVMs) and Neural Networks (NNs). PMID:26985826
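    A toy gradient-descent sketch of the LDM objective on a plain linear classifier (margin mean pushed up, margin variance shrunk, alongside the hinge loss); it illustrates the objective only, not the DAG-LDM solver, and all data and hyperparameters are made up.

```python
# Toy LDM objective: margin m_i = y_i * (w @ x_i); maximize the margin mean,
# minimize the margin variance, alongside hinge loss and a ridge term.
import numpy as np

rng = np.random.default_rng(5)
n = 200
X = np.vstack([rng.normal(+1.5, 1.0, (n // 2, 2)),
               rng.normal(-1.5, 1.0, (n // 2, 2))])
y = np.hstack([np.ones(n // 2), -np.ones(n // 2)])

w = np.zeros(2)
lam_mean, lam_var, lr = 0.5, 0.5, 0.05
yx = y[:, None] * X                          # d(m_i)/dw for every sample
for _ in range(300):
    m = yx @ w                               # margin of every sample
    grad = w                                 # from the ||w||^2 / 2 term
    grad -= yx[m < 1].sum(axis=0) / n        # hinge-loss subgradient
    grad -= lam_mean * yx.mean(axis=0)       # push the margin mean up
    grad += lam_var * 2 * ((m - m.mean()) @ yx) / n  # shrink margin variance
    w -= lr * grad

print("training accuracy:", ((X @ w > 0) == (y > 0)).mean(),
      "margin mean:", (yx @ w).mean())
```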

  5. A Directed Acyclic Graph-Large Margin Distribution Machine Model for Music Symbol Classification.

    PubMed

    Wen, Cuihong; Zhang, Jing; Rebelo, Ana; Cheng, Fanyong

    2016-01-01

    Optical Music Recognition (OMR) has received increasing attention in recent years. In this paper, we propose a classifier based on a new method named Directed Acyclic Graph-Large margin Distribution Machine (DAG-LDM). The DAG-LDM is an improvement of the Large margin Distribution Machine (LDM), which is a binary classifier that optimizes the margin distribution by maximizing the margin mean and minimizing the margin variance simultaneously. We modify the LDM to the DAG-LDM to solve the multi-class music symbol classification problem. Tests are conducted on more than 10000 music symbol images, obtained from handwritten and printed images of music scores. The proposed method provides superior classification capability and achieves much higher classification accuracy than the state-of-the-art algorithms such as Support Vector Machines (SVMs) and Neural Networks (NNs).

  6. Contemporary movements and tectonics on Canada's west coast: A discussion

    NASA Astrophysics Data System (ADS)

    Riddihough, Robin P.

    1982-06-01

    Evidence from published tidal records and geodetic relevelling data in British Columbia indicates that there is a consistent pattern of contemporary uplift on the outer coast (2 mm/yr) and subsidence on the inner coast (1-2 mm/yr). The zero uplift contour or "hinge-line" runs through Hecate Strait, Georgia Strait and Victoria. This pattern continues southwards into Washington State but is interrupted to the north by considerable uplift in southeastern Alaska. Although glacio-isostatic recovery has dominated vertical movements in the region over the last 10,000 years, the distribution and trend of the observed contemporary movements are not compatible with the pattern to be expected from this source and are most probably tectonic in origin. There is, however, no clear distinction between the movements seen opposite the Queen Charlotte transform margin and the Vancouver Island convergent margin. Comparison with movements observed at other active plate margins show that the pattern is essentially similar to that seen in association with subduction and convergence. The paradox that the vertical movement rates are much too great to explain observed geology and topography may be soluble by assuming that discontinuous lateral shifts of the movement pattern occur on a scale of hundreds of thousands of years.

  7. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    NASA Astrophysics Data System (ADS)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

    Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests, and the area under the ROC curve (AUC) is often used as a measure of the overall performance of the diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while modelling their dependence structure with a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations; it uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula is a joint distribution function whose marginals are all uniformly distributed, and it can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density parametrically, using the maximum likelihood estimator (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss how the method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.
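    A minimal sketch of the parametric MLE step, assuming a Clayton copula and empirical-CDF pseudo-observations; the paired data are simulated stand-ins for two diagnostic test scores.

```python
# MLE of a Clayton copula parameter from pseudo-observations (simulated data).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def clayton_neg_loglik(theta, u, v):
    """Negative log-likelihood of the Clayton copula density."""
    if theta <= 0:
        return np.inf
    s = u ** -theta + v ** -theta - 1.0
    logc = (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s))
    return -logc.sum()

rng = np.random.default_rng(6)
x = rng.normal(size=500)
y = 0.7 * x + rng.normal(scale=0.7, size=500)   # dependent pair (simulated)
u = rankdata(x) / (len(x) + 1)                  # pseudo-observations in (0,1)
v = rankdata(y) / (len(y) + 1)

res = minimize_scalar(clayton_neg_loglik, bounds=(1e-3, 20.0),
                      args=(u, v), method="bounded")
print("Clayton theta MLE:", round(res.x, 2))
```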

  8. Probable influence of early Carboniferous (Tournaisian-early Visean) geography on the development of Waulsortian and Waulsortian-like mounds

    NASA Astrophysics Data System (ADS)

    King, David T., Jr.

    1990-07-01

    All of the known Tournaisian-early Visean (ca. 360-348 Ma) age carbonate mud mounds (Waulsortian and Waulsortian-like mounds) developed in low paleolatitudes on the southern shelf margin of Laurussia and in the Laurussian interior seaway. The Tournaisian-early Visean geography probably prevented hurricanes, tropical storms, and winter storms from crossing the shelf margin or interior seaway where these mounds developed. Implications of the lack of storm energy on mound development are discussed.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crijns, Wouter, E-mail: wouter.crijns@uzleuven.be; Depuydt, Tom; Haustermans, Karin

    Purpose: To evaluate three different plan adaptation strategies using 3D film-stack dose measurements of both focal boost and hypofractionated prostate VMAT treatments. The adaptation strategies (a couch shift, geometric tracking, and dosimetric tracking) were applied for three realistic intrafraction prostate motions. Methods: A focal boost (35 × 2.2 and 35 × 2.7 Gy) and a hypofractionated (5 × 7.25 Gy) prostate VMAT plan were created for a heterogeneous phantom that allows for internal prostate motion. For these plans, geometric tracking and dosimetric tracking were evaluated by ionization chamber (IC) point dose measurements (zero-D) and measurements using a stack of EBT3 films (3D). The geometric tracking applied translations, rotations, and scaling of the MLC aperture in response to realistic prostate motions. The dosimetric tracking additionally corrected the monitor units to resolve variations due to differences in depth, tissue heterogeneity, and MLC aperture. The tracking was based on the positions of four fiducial points only. The film measurements were compared to the gold standard (i.e., IC measurements) and the planned dose distribution. Additionally, the 3D measurements were converted to dose volume histograms, tumor control probability, and normal tissue complication probability parameters (DVH/TCP/NTCP) as a direct estimate of the clinical relevance of the proposed tracking. Results: Compared to the planned dose distribution, measurements without prostate motion and tracking already showed reduced homogeneity of the dose distribution. Adding prostate motion further blurs the DVHs for all treatment approaches. The clinical practice (no tracking) delivered the dose distribution inside the PTV but off target (CTV), resulting in boost dose errors up to 10%. The geometric and dosimetric tracking corrected the dose distribution's position. Moreover, the dosimetric tracking could achieve the planned boost DVH, but not the DVH of the more homogeneously irradiated prostate. A drawback of both the geometric and dosimetric tracking was reduced MLC blocking caused by the rotational component of the MLC aperture corrections. Because of the CTV-to-PTV margins used and the high doses in the considered fractionation schemes, the TCP differed by less than 0.02 from the planned value for all targets and all correction methods. The rectal NTCP constraints, however, could not be realized using any of these methods. Conclusions: The geometric and dosimetric tracking use only limited input, but they deposit the dose distribution with higher geometric accuracy than the clinical practice, which has boost dose errors up to 10%. The increased accuracy has a modest impact [Δ(NT)CP < 0.02] because of the applied margins and the high dose levels used. To allow further margin reduction, tracking methods are vital. The proposed methodology could be further improved by implementing a rotational correction using collimator rotations.

  10. Globalization and the price decline of illicit drugs.

    PubMed

    Costa Storti, Cláudia; De Grauwe, Paul

    2009-01-01

    This study aims at understanding the mechanisms underlying the dramatic decline of the retail prices of major drugs like cocaine and heroin during the past two decades. It also aims at analysing the implications of this decline for drug policies. We use a theoretical model to identify the possible causes of this price decline. This allows us to formulate the hypothesis that the major driving force behind the price decline is a reduction of the intermediation margin (the difference between the retail and producer prices). We also develop the hypothesis that globalization has been an important factor behind the decline of the intermediation margin. We then analyse the statistical information to test these hypotheses. We find that the decline in the retail prices of drugs is related to the strong decline in the intermediation margin in the drug business, and that globalization is the main driving force behind this phenomenon. Globalization has done so by increasing the efficiency of the distribution of drugs, by reducing the risk premium involved in dealing with drugs, and by increasing the degree of competition in the drug markets. We conclude that the cocaine and heroin price declines were due to a sharp fall in the intermediation margin, which was probably influenced by globalization. This phenomenon might have a strong impact on the effectiveness of drug policies, increasing the relative effectiveness of policies aiming at reducing the demand of drugs.

  11. K-feldspar megacryst accumulations formed by mechanical instabilities in magma chamber margins, Asha pluton, NW Argentina

    NASA Astrophysics Data System (ADS)

    Rocher, Sebastián; Alasino, Pablo H.; Grande, Marcos Macchioli; Larrovere, Mariano A.; Paterson, Scott R.

    2018-07-01

    The Asha pluton, the oldest unit of the San Blas intrusive complex (Early Carboniferous), exhibits impressive examples of magmatic structures formed by accumulation of K-feldspar megacrysts, enclaves, and schlieren. Almost all recognized structures are meter-scale, vertically elongate bodies of variable shapes defined as fingers, trails, drips, and blobs. They preferentially developed near the external margin of the Asha pluton and generally are superimposed by chamber-wide magmatic fabrics. They mostly have circular or sub-circular transverse sections with an internal fabric defined by margin-parallel, inward-dipping concentric foliation and steeply plunging lineation at upper parts and flat foliation at lower parts. The concentration of megacrysts usually grades from upper sections, where they appear in a proportion similar to the host granite, to highly packed accumulations of K-feldspar along with grouped flattened enclaves at lower ends. These features suggest an origin by downward localized multiphase magmatic flow, narrowing and 'log jamming', and gravitational sinking of grouped crystals and enclaves, with compaction and filter pressing as main mechanisms of melt removal. Crystal size distribution analysis supports field observations arguing for a mechanical origin of accumulations. The magmatic structures of the Asha pluton represent mechanical instabilities generated by thermal and compositional convection, probably owing to cooling and crystallization near the pluton margins during early stages of construction of the intrusive complex.

  12. Dosimetric treatment course simulation based on a statistical model of deformable organ motion

    NASA Astrophysics Data System (ADS)

    Söhn, M.; Sobotta, B.; Alber, M.

    2012-06-01

    We present a method of modeling dosimetric consequences of organ deformation and correlated motion of adjacent organ structures in radiotherapy. Based on a few organ geometry samples and the respective deformation fields as determined by deformable registration, principal component analysis (PCA) is used to create a low-dimensional parametric statistical organ deformation model (Söhn et al 2005 Phys. Med. Biol. 50 5893-908). PCA determines the most important geometric variability in terms of eigenmodes, which represent 3D vector fields of correlated organ deformations around the mean geometry. Weighted sums of a few dominating eigenmodes can be used to simulate synthetic geometries, which are statistically meaningful inter- and extrapolations of the input geometries, and predict their probability of occurrence. We present the use of PCA as a versatile treatment simulation tool, which allows comprehensive dosimetric assessment of the detrimental effects that deformable geometric uncertainties can have on a planned dose distribution. For this, a set of random synthetic geometries is generated by a PCA model for each simulated treatment course, and the dose of a given treatment plan is accumulated in the moving tissue elements via dose warping. This enables the calculation of average voxel doses, local dose variability, dose-volume histogram uncertainties, marginal as well as joint probability distributions of organ equivalent uniform doses and thus of TCP and NTCP, and other dosimetric and biologic endpoints. The method is applied to the example of deformable motion of prostate/bladder/rectum in prostate IMRT. Applications include dosimetric assessment of the adequacy of margin recipes, adaptation schemes, etc, as well as prospective ‘virtual’ evaluation of the possible benefits of new radiotherapy schemes.
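    A minimal sketch of the PCA step, assuming displacement fields (relative to the mean geometry) stacked as flat vectors; synthetic geometries are then drawn as the mean plus eigenmodes weighted by normally distributed coefficients. Shapes and counts are placeholders.

```python
# PCA eigenmodes of displacement fields, and one synthetic geometry sample.
import numpy as np

rng = np.random.default_rng(7)
n_samples, n_points = 8, 3000          # e.g., organ surface points (assumed)
# Displacement fields, shape (n_samples, 3 * n_points); synthetic placeholder:
fields = rng.normal(0.0, 0.2, (n_samples, 3 * n_points))

mean_field = fields.mean(axis=0)
A = fields - mean_field
U, S, Vt = np.linalg.svd(A, full_matrices=False)
eigvals = S ** 2 / (n_samples - 1)     # variance captured by each mode
modes = Vt                             # rows are eigenmode vector fields

k = 3                                  # dominant modes kept
coeffs = rng.normal(0.0, np.sqrt(eigvals[:k]))   # random mode weights
synthetic = mean_field + coeffs @ modes[:k]      # one synthetic geometry
print("variance explained by first 3 modes:",
      eigvals[:3].sum() / eigvals.sum())
```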

  13. Dosimetric treatment course simulation based on a statistical model of deformable organ motion.

    PubMed

    Söhn, M; Sobotta, B; Alber, M

    2012-06-21

    We present a method of modeling dosimetric consequences of organ deformation and correlated motion of adjacent organ structures in radiotherapy. Based on a few organ geometry samples and the respective deformation fields as determined by deformable registration, principal component analysis (PCA) is used to create a low-dimensional parametric statistical organ deformation model (Söhn et al 2005 Phys. Med. Biol. 50 5893-908). PCA determines the most important geometric variability in terms of eigenmodes, which represent 3D vector fields of correlated organ deformations around the mean geometry. Weighted sums of a few dominating eigenmodes can be used to simulate synthetic geometries, which are statistically meaningful inter- and extrapolations of the input geometries, and predict their probability of occurrence. We present the use of PCA as a versatile treatment simulation tool, which allows comprehensive dosimetric assessment of the detrimental effects that deformable geometric uncertainties can have on a planned dose distribution. For this, a set of random synthetic geometries is generated by a PCA model for each simulated treatment course, and the dose of a given treatment plan is accumulated in the moving tissue elements via dose warping. This enables the calculation of average voxel doses, local dose variability, dose-volume histogram uncertainties, marginal as well as joint probability distributions of organ equivalent uniform doses and thus of TCP and NTCP, and other dosimetric and biologic endpoints. The method is applied to the example of deformable motion of prostate/bladder/rectum in prostate IMRT. Applications include dosimetric assessment of the adequacy of margin recipes, adaptation schemes, etc, as well as prospective 'virtual' evaluation of the possible benefits of new radiotherapy schemes.

  14. Formation and evolution of magma-poor margins, an example of the West Iberia margin

    NASA Astrophysics Data System (ADS)

    Perez-Gussinye, Marta; Andres-Martinez, Miguel; Morgan, Jason P.; Ranero, Cesar R.; Reston, Tim

    2016-04-01

    The West Iberia-Newfoundland (WIM-NF) conjugate margins have been geophysically and geologically surveyed for the last 30 years and have arguably become a paradigm for magma-poor extensional margins. Here we present a coherent picture of the WIM-NF rift-to-drift evolution that emerges from these observations and numerical modeling, and point out important differences that may exist with other magma-poor margins worldwide. The WIM-NF is characterized by a continental crust that thins asymmetrically and a wide and symmetric continent-ocean transition (COT) interpreted to consist of exhumed and serpentinised mantle with magmatic products increasing oceanward. The architectural evolution of these margins is mainly dominated by cooling under very slow extension velocities (<~6 mm/yr half-rate) and a lower crust that most probably was not extremely weak at the start of rifting. These conditions lead to a system where deformation is initially distributed over a broad area and the upper crust, lower crust, and lithosphere are decoupled. As extension progresses, upper crust, lower crust, and mantle become tightly coupled, and deformation localizes due to strengthening and cooling during rifting. Coupling leads to asymmetric asthenospheric uplift and weakening of the hangingwall of the active fault, where a new fault forms. This continued process leads to the formation of an array of sequential faults that dip and become younger oceanward. Here we show that these processes acting in concert: 1) reproduce the margin asymmetry observed at the WIM-NF; 2) explain the fault geometry evolution from planar, to listric, to detachment-like within one common Andersonian framework; 3) lead to the symmetric exhumation of mantle with little magmatism; and 4) explain the younging of the syn-rift towards the basin centre and imply that unconformities separating syn- and post-rift may be diachronous and younger towards the ocean. Finally, we show that different lower crustal rheologies lead to different patterns of extension and to an abrupt transition to oceanic crust, even at magma-poor margins.

  15. Healthy Eating and Leisure-Time Activity: Cross-Sectional Analysis of the Role of Work Environments in the U.S.

    PubMed

    Williams, Jessica A R; Arcaya, Mariana; Subramanian, S V

    2017-11-01

    The aim of this study was to evaluate relationships between work context and two health behaviors, healthy eating and leisure-time physical activity (LTPA), in U.S. adults. Using data from the 2010 National Health Interview Survey (NHIS) and Occupational Information Network (N = 14,863), we estimated a regression model to predict the marginal and joint probabilities of healthy eating and adhering to recommended exercise guidelines. Decision-making freedom was positively related to healthy eating and both behaviors jointly. Higher physical load was associated with a lower marginal probability of LTPA, healthy eating, and both behaviors jointly. Smoke and vapor exposures were negatively related to healthy eating and both behaviors. Chemical exposure was positively related to LTPA and both behaviors. Characteristics associated with marginal probabilities were not always predictive of joint outcomes. On the basis of nationwide occupation-specific evidence, workplace characteristics are important for healthy eating and LTPA.

  16. Risk and utility in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Natoli, Vincent D.

    2003-06-01

    Modern portfolio theory (MPT) addresses the problem of determining the optimum allocation of investment resources among a set of candidate assets. In the original mean-variance approach of Markowitz, volatility is taken as a proxy for risk, conflating uncertainty with risk. There have been many subsequent attempts to alleviate that weakness which, typically, combine utility and risk. We present here a modification of MPT based on the inclusion of separate risk and utility criteria. We define risk as the probability of failure to meet a pre-established investment goal. We define utility as the expectation of a utility function with positive and decreasing marginal value as a function of yield. The emphasis throughout is on long investment horizons for which risk-free assets do not exist. Analytic results are presented for a Gaussian probability distribution. Risk-utility relations are explored via empirical stock-price data, and an illustrative portfolio is optimized using the empirical data.
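    A minimal sketch of the two criteria for a Gaussian yield distribution: risk as the probability of ending below the goal, and utility as the expectation of a concave utility function. The choice u(x) = log(1 + x) is an arbitrary example of positive, decreasing marginal value, and all parameters are illustrative.

```python
# Risk = P(yield < goal); utility = E[u(yield)] under a Gaussian yield model.
import numpy as np
from scipy import stats

mu, sigma = 0.9, 0.6        # mean and sd of terminal yield (assumed)
goal = 0.5                  # pre-established investment goal (assumed)

risk = stats.norm.cdf(goal, loc=mu, scale=sigma)   # P(yield < goal)

x = np.linspace(mu - 6 * sigma, mu + 6 * sigma, 4001)
pdf = stats.norm.pdf(x, mu, sigma)
u = np.log1p(np.clip(x, -0.999, None))             # concave example utility
utility = np.trapz(u * pdf, x)                     # E[u(yield)]
print(f"risk = {risk:.3f}, expected utility = {utility:.3f}")
```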

  17. Two proposed convergence criteria for Monte Carlo solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, R.A.; Pederson, S.P.; Booth, T.E.

    1992-01-01

    The central limit theorem (CLT) can be applied to a Monte Carlo solution if two requirements are satisfied: (1) the random variable has a finite mean and a finite variance; and (2) the number N of independent observations grows large. When these two conditions are satisfied, a confidence interval (CI) based on the normal distribution with a specified coverage probability can be formed. The first requirement is generally satisfied by the knowledge of the Monte Carlo tally being used. The Monte Carlo practitioner has a limited number of marginal methods to assess the fulfillment of the second requirement, such as statistical error reduction proportional to 1/√N with error magnitude guidelines. Two proposed methods are discussed in this paper to assist in deciding if N is large enough: estimating the relative variance of the variance (VOV) and examining the empirical history score probability density function (pdf).
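    A minimal sketch of the first diagnostic: an empirical relative variance of the variance computed from history scores, here drawn from a deliberately heavy-tailed toy distribution. The often-quoted acceptance guideline of VOV below about 0.1 is a convention, not part of this abstract.

```python
# Empirical relative variance of the variance (VOV) for a Monte Carlo tally.
import numpy as np

rng = np.random.default_rng(8)
scores = rng.pareto(3.0, size=100_000)      # heavy-tailed toy history scores

def relative_vov(x):
    """VOV = sum((x - xbar)^4) / sum((x - xbar)^2)^2 - 1/N."""
    d = x - x.mean()
    return (d ** 4).sum() / (d ** 2).sum() ** 2 - 1.0 / len(x)

print("VOV:", relative_vov(scores))
```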

  18. Impact of communities, health, and emotional-related factors on smoking use: comparison of joint modeling of mean and dispersion and Bayes' hierarchical models on Add Health survey.

    PubMed

    Pu, Jie; Fang, Di; Wilson, Jeffrey R

    2017-02-03

    The analysis of correlated binary data is commonly addressed through the use of conditional models with random effects included in the systematic component, as opposed to generalized estimating equations (GEE) models that address the random component. Since the joint distribution of the observations is usually unknown, the conditional distribution is a natural approach. Our objective was to compare the fit of different binary models for correlated data on tobacco use. We advocate that the joint modeling of the mean and dispersion may at times be just as adequate. We assessed the ability of these models to account for the intraclass correlation. In so doing, we concentrated on fitting logistic regression models to address smoking behaviors. Frequentist and Bayes' hierarchical models were used to predict conditional probabilities, and the joint modeling (GLM and GAM) models were used to predict marginal probabilities. These models were fitted to National Longitudinal Study of Adolescent to Adult Health (Add Health) data on tobacco use. We found that people were less likely to smoke if they had higher income, a high school or higher education, and were religious. Individuals were more likely to smoke if they had abused drugs or alcohol, spent more time on TV and video games, or had been arrested. Moreover, individuals who drank alcohol early in life were more likely to be regular smokers. Children who experienced mistreatment from their parents were more likely to use tobacco regularly. The joint modeling of the mean and dispersion offered a flexible and meaningful method of addressing the intraclass correlation. It does not require one to identify random effects nor to distinguish one level of the hierarchy from another. Moreover, once the significant random effects are identified, one can obtain results similar to the random coefficient models. We found that the set of marginal models accounting for extravariation through the additional dispersion submodel produced similar results with regard to inferences and predictions. Moreover, both marginal and conditional models demonstrated similar predictive power.

  19. To accrete or not accrete, that is the question

    USGS Publications Warehouse

    von Huene, Roland E.

    1986-01-01

    Along modern convergent margins tectonic processes span a spectrum from accretion to erosion. The process of accretion is generally recognized because it leaves a geologic record, whereas the process of erosion is generally hypothetical because it produces a geologic hiatus. Major conditions that determine the dominance of accretion or erosion at modern convergent margins are: 1) rate and direction of plate convergence, 2) sediment supply and type in the trench, and 3) topography of the subducting ocean floor. Most change in structure has been ascribed to plate motion, but both erosion and accretion are observed along the same convergent margin. Thus sediment supply and topography are probably of equivalent importance to plate motion because both erosion and accretion are observed under constant conditions of plate convergence. The dominance of accretion or erosion at a margin varies with the thickness of trench sediment. In a sediment flooded trench, the proportions of subducted and accreted sediment are commonly established by the position of a decollement along a weak horizon in the sediment section. Thus, the vertical variation of sediment strength and the distribution of horizontal stress are important factors. Once deformation begins, the original sediment strength is decreased by sediment remolding and where sediment thickens rapidly, increases in pore fluid pressure can be pronounced. In sediment-starved trenches, where the relief of the subducting ocean floor is not smoothed over, the front of the margin must respond to the topography subducted as well as that accreted. The hypothesized erosion by the drag of positive features against the underside of the upper plate (a high stress environment) may alternate with erosion due to the collapse of a margin front into voids such as graben (a low stress environment). © 1986 Ferdinand Enke Verlag Stuttgart.

  20. The Effect of Velocity Correlation on the Spatial Evolution of Breakthrough Curves in Heterogeneous Media

    NASA Astrophysics Data System (ADS)

    Massoudieh, A.; Dentz, M.; Le Borgne, T.

    2017-12-01

    In heterogeneous media, the velocity distribution and the spatial correlation structure of velocity for solute particles determine the breakthrough curves and how they evolve as one moves away from the solute source. The ability to predict such evolution can help relate the spatio-statistical hydraulic properties of the media to the transport behavior and travel time distributions. While commonly used non-local transport models such as anomalous dispersion and the classical continuous time random walk (CTRW) can reproduce breakthrough curves successfully by adjusting model parameter values, they lack the ability to relate model parameters to the spatio-statistical properties of the media. This in turn limits the transferability of these models. In the research to be presented, we express the concentration or flux of solutes as a distribution over their velocity. We then derive an integrodifferential equation that governs the evolution of the particle distribution over velocity at given times and locations for a particle ensemble, based on a presumed velocity correlation structure and an ergodic cross-sectional velocity distribution. This way, the spatial evolution of breakthrough curves away from the source is predicted based on the cross-sectional velocity distribution and the connectivity, which is expressed by the velocity transition probability density. The transition probability is specified via a copula function that can construct a joint distribution with a given correlation and given marginal velocities. Using this approach, we analyze the breakthrough curves depending on the velocity distribution and correlation properties. The model shows how the solute transport behavior evolves from ballistic transport at small spatial scales to Fickian dispersion at large length scales relative to the velocity correlation length.
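    A rough sketch in the spirit of the model: a space-domain random walk whose velocities follow a lognormal cross-sectional distribution, with step-to-step correlation imposed through a Gaussian copula (an AR(1) latent score), and breakthrough times recorded at several control planes. All parameters are illustrative.

```python
# Space-domain random walk with copula-correlated velocities; breakthrough
# times recorded at control planes. All parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_particles = 20_000
step = 1.0                                  # spatial step = correlation scale
planes = [10.0, 50.0, 200.0]                # control-plane distances
rho = 0.8                                   # Gaussian copula correlation

v_marg = stats.lognorm(s=1.0, scale=1.0)    # cross-sectional velocity pdf
z = rng.normal(size=n_particles)            # latent Gaussian score
t = np.zeros(n_particles)
arrivals = {p: np.empty(n_particles) for p in planes}

n_steps = int(max(planes) / step)
for i in range(n_steps):
    v = v_marg.ppf(stats.norm.cdf(z))       # velocity with target marginal
    t += step / v                           # time to cross this step
    x = (i + 1) * step
    for p in planes:
        if np.isclose(x, p):
            arrivals[p][:] = t              # breakthrough times at plane p
    # AR(1) transition keeps the marginal and sets the correlation:
    z = rho * z + np.sqrt(1 - rho ** 2) * rng.normal(size=n_particles)

for p in planes:
    print(f"plane {p:g}: median {np.median(arrivals[p]):.1f}, "
          f"95th pct {np.percentile(arrivals[p], 95):.1f}")
```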

  1. A comparison of exact tests for trend with binary endpoints using Bartholomew's statistic.

    PubMed

    Consiglio, J D; Shan, G; Wilding, G E

    2014-01-01

    Tests for trend are important in a number of scientific fields when trends associated with binary variables are of interest. Implementing the standard Cochran-Armitage trend test requires an arbitrary choice of scores assigned to represent the grouping variable. Bartholomew proposed a test for qualitatively ordered samples using asymptotic critical values, but type I error control can be problematic in finite samples. To our knowledge, use of the exact probability distribution has not been explored, and we study its use in the present paper. Specifically we consider an approach based on conditioning on both sets of marginal totals and three unconditional approaches where only the marginal totals corresponding to the group sample sizes are treated as fixed. While slightly conservative, all four tests are guaranteed to have actual type I error rates below the nominal level. The unconditional tests are found to exhibit far less conservatism than the conditional test and thereby gain a power advantage.

  2. Probabilistic treatment of the uncertainty from the finite size of weighted Monte Carlo data

    NASA Astrophysics Data System (ADS)

    Glüsenkamp, Thorsten

    2018-06-01

    Parameter estimation in HEP experiments often involves Monte Carlo simulation to model the experimental response function. Typical applications are forward-folding likelihood analyses with re-weighting, or time-consuming minimization schemes with a new simulation set for each parameter value. Problematically, the finite size of such Monte Carlo samples carries intrinsic uncertainty that can lead to a substantial bias in parameter estimation if it is neglected and the sample size is small. We introduce a probabilistic treatment of this problem by replacing the usual likelihood functions with novel generalized probability distributions that incorporate the finite statistics via suitable marginalization. These new PDFs are analytic, and can be used to replace the Poisson, multinomial, and sample-based unbinned likelihoods, which covers many use cases in high-energy physics. In the limit of infinite statistics, they reduce to the respective standard probability distributions. In the general case of arbitrary Monte Carlo weights, the expressions involve the fourth Lauricella function FD, for which we find a new finite-sum representation in a certain parameter setting. The result also represents an exact form for Carlson's Dirichlet average Rn with n > 0, and thereby an efficient way to calculate the probability generating function of the Dirichlet-multinomial distribution, the extended divided difference of a monomial, or arbitrary moments of univariate B-splines. We demonstrate the bias reduction of our approach with a typical toy Monte Carlo problem, estimating the normalization of a peak in a falling energy spectrum, and compare the results with previously published methods from the literature.
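    A toy illustration of the underlying issue, not the paper's generalized PDFs: for a bin estimated by k unweighted Monte Carlo events of weight w, marginalizing the Poisson likelihood over a flat-prior Gamma posterior for the rate yields a negative binomial, which is noticeably wider than the naive Poisson for small k and converges to it as k grows.

```python
# Poisson with a point-estimate rate vs. the Gamma-Poisson (negative binomial)
# marginal that accounts for finite MC statistics. Toy numbers throughout.
import numpy as np
from scipy import stats

w = 0.5                  # per-event Monte Carlo weight (assumed, uniform)
for k in (4, 40, 4000):  # number of MC events in the bin
    mu = k * w           # naive point estimate of the expectation
    pois = stats.poisson(mu)
    # Flat prior on the rate -> Gamma(k+1, scale=w) posterior; the
    # Gamma-Poisson mixture is negative binomial with r=k+1, p=1/(1+w).
    nb = stats.nbinom(k + 1, 1.0 / (1.0 + w))
    print(f"k={k:5d}: Poisson sd {pois.std():7.2f}   "
          f"marginalized sd {nb.std():7.2f}")
```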

  3. Stochastic static fault slip inversion from geodetic data with non-negativity and bound constraints

    NASA Astrophysics Data System (ADS)

    Nocquet, J.-M.

    2018-07-01

    Although surface displacements observed by geodesy are linear combinations of slip at faults in an elastic medium, determining the spatial distribution of fault slip remains an ill-posed inverse problem. A widely used approach to circumvent the ill-posedness of the inversion is to add regularization constraints in terms of smoothing and/or damping so that the linear system becomes invertible. However, the choice of regularization parameters is often arbitrary and sometimes leads to significantly different results. Furthermore, the resolution analysis is usually empirical and cannot be made independently of the regularization. The stochastic approach to inverse problems provides a rigorous framework where the a priori information about the searched parameters is combined with the observations in order to derive posterior probabilities of the unknown parameters. Here, I investigate an approach where the prior probability density function (pdf) is a multivariate Gaussian function, with single truncation to impose positivity of slip, or double truncation to impose positivity and upper bounds on slip for interseismic modelling. I show that the joint posterior pdf is similar to the linear untruncated Gaussian case and can be expressed as a truncated multivariate normal (TMVN) distribution. The TMVN form can then be used to obtain semi-analytical formulae for the single, 2-D or n-D marginal pdf. The semi-analytical formula involves the product of a Gaussian by an integral term that can be evaluated using recent developments in TMVN probability calculations. Posterior mean and covariance can also be efficiently derived. I show that the maximum posterior (MAP) can be obtained using a non-negative least-squares algorithm for the single truncated case, or using the bounded-variable least-squares algorithm for the double truncated case. I show that the case of independent uniform priors can be approximated using TMVN. The numerical equivalence to Bayesian inversions using Markov chain Monte Carlo (MCMC) sampling is shown for a synthetic example and a real case of interseismic modelling in Central Peru. The TMVN method overcomes several limitations of the Bayesian approach using MCMC sampling. First, the need for computer power is largely reduced. Second, unlike the Bayesian MCMC-based approach, marginal pdfs, means, variances, and covariances are obtained independently of one another. Third, the probability and cumulative density functions can be obtained with any density of points. Finally, determining the MAP is extremely fast.
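    A minimal sketch of the single-truncation MAP, assuming Gaussian data errors and an independent zero-mean Gaussian prior, solved with a non-negative least-squares routine on the prior-augmented system; G, d, and all scales are synthetic stand-ins.

```python
# MAP of a positivity-truncated Gaussian posterior via non-negative least
# squares on a prior-augmented system. Synthetic Green's functions and data.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(10)
n_obs, n_patch = 40, 12
G = rng.normal(size=(n_obs, n_patch))        # Green's functions (synthetic)
m_true = np.clip(rng.normal(1.0, 1.0, n_patch), 0, None)  # non-negative slip
d = G @ m_true + rng.normal(0, 0.1, n_obs)   # observed displacements

sigma_d = 0.1                                # data standard deviation
sigma_m = 2.0                                # prior sd on slip, zero mean
# Stack the quadratic prior as extra rows of the least-squares system:
A = np.vstack([G / sigma_d, np.eye(n_patch) / sigma_m])
b = np.concatenate([d / sigma_d, np.zeros(n_patch)])

m_map, _ = nnls(A, b)                        # MAP under the truncation m >= 0
print("true:", np.round(m_true, 2))
print("MAP :", np.round(m_map, 2))
```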

  4. Structured Set Intra Prediction With Discriminative Learning in a Max-Margin Markov Network for High Efficiency Video Coding

    PubMed Central

    Dai, Wenrui; Xiong, Hongkai; Jiang, Xiaoqian; Chen, Chang Wen

    2014-01-01

    This paper proposes a novel model on intra coding for High Efficiency Video Coding (HEVC), which simultaneously predicts blocks of pixels with optimal rate distortion. It utilizes the spatial statistical correlation for the optimal prediction based on 2-D contexts, in addition to formulating the data-driven structural interdependences to make the prediction error coherent with the probability distribution, which is desirable for successful transform and coding. The structured set prediction model incorporates a max-margin Markov network (M3N) to regulate and optimize multiple block predictions. The model parameters are learned by discriminating the actual pixel value from other possible estimates to maximize the margin (i.e., decision boundary bandwidth). Compared to existing methods that focus on minimizing prediction error, the M3N-based model adaptively maintains the coherence for a set of predictions. Specifically, the proposed model concurrently optimizes a set of predictions by associating the loss for individual blocks to the joint distribution of succeeding discrete cosine transform coefficients. When the sample size grows, the prediction error is asymptotically upper bounded by the training error under the decomposable loss function. As an internal step, we optimize the underlying Markov network structure to find states that achieve the maximal energy using expectation propagation. For validation, we integrate the proposed model into HEVC for optimal mode selection on rate-distortion optimization. The proposed prediction model obtains up to 2.85% bit rate reduction and achieves better visual quality in comparison to the HEVC intra coding. PMID:25505829

  5. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    NASA Astrophysics Data System (ADS)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at a regional and probabilistic evaluation of bivariate drought characteristics to assess both past and future drought duration and severity in Bangladesh. The procedures involve applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. These results suggest that, compared with bivariate drought frequency analysis, standard univariate frequency analysis underestimates or overestimates the frequency and severity of droughts, depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed in the west side of the country. Future drought trends based on four climate models and two scenarios show the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
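    For a concrete sense of the joint versus univariate return periods discussed above, the sketch below evaluates a Gumbel-Hougaard copula and the standard "AND"/"OR" return-period formulas; the probabilities, copula parameter, and mean drought inter-arrival time are hypothetical, not the paper's fitted values.

        import numpy as np

        def gumbel_copula(u, v, theta):
            # Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence.
            return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1.0/theta))

        u, v = 0.9, 0.9       # non-exceedance probabilities of duration, severity
        theta, mu = 2.0, 2.0  # copula parameter and mean inter-arrival time (yr)

        C = gumbel_copula(u, v, theta)
        T_and = mu / (1 - u - v + C)  # both duration AND severity exceeded
        T_or = mu / (1 - C)           # either duration OR severity exceeded
        T_uni = mu / (1 - u)          # univariate return period (duration alone)
        print(f"T_and={T_and:.1f} yr, T_or={T_or:.1f} yr, T_uni={T_uni:.1f} yr")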

  6. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka; Slosar, Anze

    2018-01-01

    The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small-scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which can bias the estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values, as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, the coefficients can be measured in the presence of noise, and there is a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured, with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars at high signal-to-noise recovers more information.
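    The estimator behind this expansion is simple in practice: since p(x) = sum_n c_n P_n(x) with int P_m P_n dx = 2 delta_mn/(2n+1) on [-1, 1], each coefficient is c_n = (2n+1)/2 * E[P_n(X)], a linear combination of the first n moments. A minimal sketch with a synthetic stand-in for the normalized flux (a beta distribution, purely illustrative):

        import numpy as np
        from scipy.special import eval_legendre

        rng = np.random.default_rng(1)

        flux = rng.beta(5, 2, size=100_000)   # stand-in for flux on [0, 1]
        x = 2.0 * flux - 1.0                  # map to [-1, 1]

        # c_n = (2n+1)/2 * E[P_n(X)]; estimated directly from sample averages.
        nmax = 6
        c = [(2*n + 1) / 2.0 * eval_legendre(n, x).mean() for n in range(nmax + 1)]

        # Reconstruct the truncated PDF expansion on a coarse grid.
        grid = np.linspace(-0.9, 0.9, 5)
        pdf_rec = sum(cn * eval_legendre(n, grid) for n, cn in enumerate(c))
        print("coefficients:", np.round(c, 3))
        print("reconstructed PDF on grid:", np.round(pdf_rec, 3))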

  7. Household expansion linked to agricultural intensification during emergence of Hawaiian archaic states

    PubMed Central

    Field, Julie S.; Ladefoged, Thegn N.; Kirch, Patrick V.

    2011-01-01

    The Leeward Kohala Field System (LKFS) covering ∼60 km2 on Hawai‘i Island is one of the world's best-studied archaeological examples of preindustrial agricultural intensification. Archaeological correlates for households over a 400-y period of intensification of the LKFS (A.D. 1400–1800) indicate that household age, number, and distribution closely match the expansion of agricultural features at both macro- and microscales. We excavated and dated residential complexes within portions of five traditional Hawaiian land units (ahupua‘a), two in the central core of the field system and three in the southern margins. Forty-eight radiocarbon dates from 43 residential features indicate an overall pattern of exponential increase in the numbers of households over time. Spatial distribution of these dates suggests that the core of the LKFS may have reached a population saturation point earlier than in the southern margins. Bayesian statistical analysis of radiocarbon dates from residential features in the core region, combined with spatial analysis of agricultural and residential construction sequences, demonstrates that the progressive subdivision of territories into smaller socioeconomic units was matched by addition of new residences, probably through a process of household fissioning. These results provide insights into the economic processes underlying the sociopolitical transformation from chiefdom to archaic state in precontact Hawai‘i. PMID:21502516

  8. Eolian intracrater deposits on Mars - Physical properties and global distribution

    NASA Technical Reports Server (NTRS)

    Christensen, P. R.

    1983-01-01

    It is noted that more than one-fourth of all craters larger than 25 km in diameter between 50 deg S and 50 deg N have localized deposits of coarse material on the floor which are associated with the dark 'splotches' that are seen visually. If homogeneous, unconsolidated materials are assumed, the measured thermal inertias of these deposits imply effective grain sizes that range from 0.1 mm to 1 cm, with a modal value of 0.9 mm. Even though these deposits are coarser and darker than the surrounding terrains and the greater part of the Martian surface, they are not compositionally distinct from materials with similar albedos. It is thought most likely that these features were formed by entrapment of marginally mobile material that can be transported into, but not out of, crater depressions by the wind. Most of the 'splotch' deposits are coarser than the dune-forming materials occurring in the north polar region and inside extreme southern latitude craters; they probably form low, broad zibar dunes or lag deposits. The distribution of intracrater deposits is seen as suggesting that the intracrater features have been buried in the interior of Arabia and that the dust deposit is less extensive at the margins and may currently be expanding.

  9. [Influence of different designs of marginal preparation on stress distribution in the mandibular premolar restored with endocrown].

    PubMed

    Guo, Jing; Wang, Xiao-Yu; Li, Xue-Sheng; Sun, Hai-Yang; Liu, Lin; Li, Hong-Bo

    2016-02-01

    To evaluate the effect of different designs of marginal preparation on stress distribution in the mandibular premolar restored with an endocrown, using the three-dimensional finite element method. Four models with different designs of marginal preparation, including the flat margin, 90° shoulder, 135° shoulder and chamfer shoulder, were established to imitate a mandibular first premolar restored with an endocrown. A load of 100 N was applied to the intersection of the long axis and the occlusal surface, either parallel to or at an angle of 45° to the long axis of the tooth. The maximum values of Von Mises stress and the stress distribution around the cervical region of the abutment and the endocrown with different designs of marginal preparation were analyzed. The load parallel to the long axis of the tooth caused obvious stress concentration in the lingual portions of both the cervical region of the tooth tissue and the restoration. The stress distribution characteristics on the cervical region of the models with a flat margin and a 90° shoulder were more uniform than those in the models with a 135° shoulder and chamfer shoulder. Loading at 45° to the long axis caused stress concentration mainly on the buccal portion of the cervical region, and the model with a flat margin showed the most favorable stress distribution pattern, albeit with a greater maximum Von Mises stress than under parallel loading. Irrespective of the loading direction, the stress value was the lowest in the flat margin model, where the stress value in the cervical region of the endocrown was greater than that in the counterpart of the tooth tissue. The stress level on the enamel was higher than that on the dentin nearby in the flat margin model. From the stress distribution point of view, endocrowns with a flat margin, followed by a 90° shoulder, are recommended.

  10. SU-D-16A-04: Accuracy of Treatment Plan TCP and NTCP Values as Determined Via Treatment Course Delivery Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J; Xu, H; Gordon, J

    2014-06-01

    Purpose: To determine if tumor control probability (TCP) and normal tissue complication probability (NTCP) values computed on the treatment planning image are representative of TCP/NTCP distributions resulting from probable positioning variations encountered during external-beam radiotherapy. Methods: We compare TCP/NTCP as typically computed on the planning PTV/OARs with distributions of those parameters computed for CTV/OARs via treatment delivery simulations which include the effect of patient organ deformations for a group of 19 prostate IMRT pseudocases. Planning objectives specified 78 Gy to PTV1=prostate CTV+5 mm margin, 66 Gy to PTV2=seminal vesicles+8 mm margin, and multiple bladder/rectum OAR objectives to achieve typical clinical OAR sparing. TCP were computed using the Poisson model while NTCPs used the Lyman-Kutcher-Burman model. For each patient, 1000 30-fraction virtual treatment courses were simulated, with each fractional pseudo-time-of-treatment anatomy sampled from a principal component analysis patient deformation model. Dose for each virtual treatment course was determined via deformable summation of dose from the individual fractions. CTV-TCP/OAR-NTCP values were computed for each treatment course, statistically analyzed, and compared with the planning PTV-TCP/OAR-NTCP values. Results: Mean TCP from the simulations differed by <1% from planned TCP for 18/19 patients; 1/19 differed by 1.7%. Mean bladder NTCP differed from the planned NTCP by >5% for 12/19 patients and >10% for 4/19 patients. Similarly, mean rectum NTCP differed by >5% for 12/19 patients, >10% for 4/19 patients. Both mean bladder and mean rectum NTCP differed by >5% for 10/19 patients and by >10% for 2/19 patients. For several patients, planned NTCP was less than the minimum or more than the maximum from the treatment course simulations. Conclusion: Treatment course simulations yield TCP values that are similar to planned values, while OAR NTCPs differ significantly, indicating the need for probabilistic methods or PRVs for OAR risk assessment. Presenting author receives support from Philips Medical Systems.
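    The two outcome models named above have compact closed forms. The sketch below shows a Poisson TCP under the linear-quadratic model and a Lyman-Kutcher-Burman NTCP from a differential DVH; all radiobiological parameters and DVH numbers are illustrative, not the study's values.

        import numpy as np
        from scipy.stats import norm

        def poisson_tcp(D, d, alpha=0.2, alpha_beta=10.0, N0=1e7):
            # Poisson TCP for total dose D delivered in fractions of size d.
            sf = np.exp(-alpha * D * (1.0 + d / alpha_beta))  # LQ cell survival
            return np.exp(-N0 * sf)

        def lkb_ntcp(dose, vol, TD50=80.0, m=0.15, n=0.1):
            # LKB NTCP from a differential DVH (dose bins in Gy, fractional volumes).
            geud = np.sum(vol * dose**(1.0 / n))**n           # generalized EUD
            return norm.cdf((geud - TD50) / (m * TD50))

        print("TCP  =", round(poisson_tcp(D=78.0, d=2.0), 3))
        dvh_dose = np.array([30.0, 50.0, 70.0])
        dvh_vol = np.array([0.5, 0.3, 0.2])
        print("NTCP =", round(lkb_ntcp(dvh_dose, dvh_vol), 3))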

  11. Joint distribution of temperature and precipitation in the Mediterranean, using the Copula method

    NASA Astrophysics Data System (ADS)

    Lazoglou, Georgia; Anagnostopoulou, Christina

    2018-03-01

    This study analyses the temperature and precipitation dependence among stations in the Mediterranean. The first station group is located in the eastern Mediterranean (EM) and includes two stations, Athens and Thessaloniki, while the western (WM) one includes Malaga and Barcelona. The data were organized into two five-month periods, a hot-dry period and a cold-wet one. The analysis is based on a statistical technique that is relatively new to climatology: the copula method. Firstly, the calculation of the Kendall tau correlation index showed that temperatures among stations are dependent during both time periods, whereas precipitation shows dependence only between the stations located in the EM or the WM, and only during the cold-wet period. Accordingly, the marginal distributions were calculated for each studied station, as they are required by the copula method. Finally, several copula families, both Archimedean and elliptical, were tested in order to choose the most appropriate one to model the relation of the studied data sets. This study thus models the dependence of the main climate parameters (temperature and precipitation) with the copula method. The Frank copula was identified as the best family to describe the joint distribution of temperature for the majority of station groups. For precipitation, the best copula families are BB1 and Survival Gumbel. Using the probability distribution diagrams, the probability of a combination of temperature and precipitation values between stations is estimated.

  12. The PIT-trap-A "model-free" bootstrap procedure for inference about regression models with discrete, multivariate responses.

    PubMed

    Warton, David I; Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
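    A minimal sketch of the central construction, for the simplest case of independent Poisson responses with known means (in an application the means would come from a fitted regression model): randomized probability integral transform residuals are computed, resampled, and pushed back through the model's quantile function. Everything here is illustrative.

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(2)

        mu = rng.uniform(1.0, 5.0, size=200)   # fitted means (stand-ins)
        y = rng.poisson(mu)                    # observed counts

        # Randomized PIT residuals: exactly U(0,1) under a correctly specified model.
        v = rng.uniform(size=y.shape)
        u = poisson.cdf(y - 1, mu) + v * poisson.pmf(y, mu)

        # One PIT-trap resample: bootstrap the (asymptotically pivotal) residuals,
        # then invert them through the quantile function. For multivariate data,
        # whole rows of residuals would be resampled to preserve correlation.
        idx = rng.integers(0, len(u), size=len(u))
        y_star = poisson.ppf(u[idx], mu).astype(int)
        print("original mean:", y.mean(), "resampled mean:", y_star.mean())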

  13. The PIT-trap—A “model-free” bootstrap procedure for inference about regression models with discrete, multivariate responses

    PubMed Central

    Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071

  14. Logistic regression of family data from retrospective study designs.

    PubMed

    Whittemore, Alice S; Halpern, Jerry

    2003-11-01

    We wish to study the effects of genetic and environmental factors on disease risk, using data from families ascertained because they contain multiple cases of the disease. To do so, we must account for the way participants were ascertained, and for within-family correlations in both disease occurrences and covariates. We model the joint probability distribution of the covariates of ascertained family members, given family disease occurrence and pedigree structure. We describe two such covariate models: the random effects model and the marginal model. Both models assume a logistic form for the distribution of one person's covariates that involves a vector beta of regression parameters. The components of beta in the two models have different interpretations, and they differ in magnitude when the covariates are correlated within families. We describe ascertainment assumptions needed to consistently estimate the parameters beta(RE) in the random effects model and the parameters beta(M) in the marginal model. Under the ascertainment assumptions for the random effects model, we show that conditional logistic regression (CLR) of matched family data gives a consistent estimate of beta(RE) and a consistent estimate of its covariance matrix. Under the ascertainment assumptions for the marginal model, we show that unconditional logistic regression (ULR) gives a consistent estimate for beta(M), and we give a consistent estimator for its covariance matrix. The random effects/CLR approach is simple to use and to interpret, but it can use data only from families containing both affected and unaffected members. The marginal/ULR approach uses data from all individuals, but its variance estimates require special computations. A C program to compute these variance estimates is available at http://www.stanford.edu/dept/HRP/epidemiology. We illustrate these pros and cons by application to data on the effects of parity on ovarian cancer risk in mother/daughter pairs, and use simulations to study the performance of the estimates. Copyright 2003 Wiley-Liss, Inc.

  15. Late Quaternary uplift rate across the Shimokita peninsula, northeastern Japan forearc

    NASA Astrophysics Data System (ADS)

    Matsu'Ura, T.

    2009-12-01

    I estimated the late Quaternary uplift rate across the northeastern Japan forearc (Shimokita peninsula) by using the height distribution of MIS 5.5 marine terraces as determined from tephra and cryptotephra stratigraphy. The heights of the inner margins (shoreline angles) of the MIS 5.5 marine terrace surface were previously reported to be 43-45 m around Shiriyazaki and 30 m around Gamanosawa. These heights decrease westward, possibly owing to a west-dipping offshore fault. In some places, however, the heights of terrace inner margins are probably overestimated because of thick overlying sediments. By drilling at Shiriyazaki, I found the MIS 5.5 wave-cut platform, which is overlain by gravels and loess deposits containing a basal Toya tephra horizon (MIS 5.4). The MIS 5.5 wave-cut platform (paleo sea level) is about 25 m above sea level, nearly half the reported height of the terrace inner margin. My result shows that the late Quaternary uplift rate across the Shimokita peninsula should be reconsidered. Further studies are also required to determine whether the intra-plate (offshore) fault is a factor in the forearc uplift at the peninsula. This research project has been conducted under a research contract with the Nuclear and Industrial Safety Agency (NISA).

  16. A maximum entropy thermodynamics of small systems.

    PubMed

    Dixit, Purushottam D

    2013-05-14

    We present a maximum entropy approach to analyze the state space of a small system in contact with a large bath, e.g., a solvated macromolecular system. For the solute, the fluctuations around the mean values of observables are not negligible and the probability distribution P(r) of the state space depends on the intricate details of the interaction of the solute with the solvent. Here, we employ a superstatistical approach: P(r) is expressed as a marginal distribution summed over the variation in β, the inverse temperature of the solute. The joint distribution P(β, r) is estimated by maximizing its entropy. We also calculate the first-order system-size corrections to the canonical ensemble description of the state space. We test the approach on a simple harmonic oscillator interacting with two baths with very different chemical identities, viz., (a) Lennard-Jones particles and (b) water molecules. In both cases, our method captures the state space of the oscillator sufficiently well. Future directions and connections with traditional statistical mechanics are discussed.
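    In the abstract's notation, the superstatistical marginalization can be written compactly; the Hamiltonian H(r) and partition function Z(β) below are standard notation assumed here rather than taken from the paper:

        P(r) = \int_0^{\infty} P(\beta, r)\,\mathrm{d}\beta
             = \int_0^{\infty} P(\beta)\,\frac{e^{-\beta H(r)}}{Z(\beta)}\,\mathrm{d}\beta,

    with the joint distribution P(β, r) chosen to maximize its entropy subject to the measured constraints.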

  17. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for evaluating the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
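    A minimal sketch of the blinded re-estimation step itself (the paper's contribution, the exact distribution of the final t statistic, is not reproduced here): the pooled variance is computed without unblinding, then plugged into the usual normal-approximation sample-size formula. All numbers are illustrative.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)

        delta, sigma = 1.0, 2.0   # assumed effect and true SD (illustrative)
        pilot = np.concatenate([rng.normal(0.0, sigma, 30),
                                rng.normal(delta, sigma, 30)])

        # One-sample (lumped) variance: computed on the pooled data, ignoring
        # treatment labels, so blinding is preserved; it overestimates the
        # within-arm variance by about delta**2 / 4.
        s2_blinded = pilot.var(ddof=1)

        alpha, power = 0.05, 0.9
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        n_per_arm = int(np.ceil(2 * s2_blinded * z**2 / delta**2))
        print(f"blinded variance {s2_blinded:.2f} -> re-estimated n/arm {n_per_arm}")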

  18. Improving experimental phases for strong reflections prior to density modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uervirojnangkoorn, Monarin; Hilgenfeld, Rolf; Terwilliger, Thomas C.

    Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D 61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. Lastly, a computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.

  19. Improving experimental phases for strong reflections prior to density modification

    DOE PAGES

    Uervirojnangkoorn, Monarin; Hilgenfeld, Rolf; Terwilliger, Thomas C.; ...

    2013-09-20

    Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D 61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. Lastly, a computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.

  20. To kill a kangaroo: understanding the decision to pursue high-risk/high-gain resources.

    PubMed

    Jones, James Holland; Bird, Rebecca Bliege; Bird, Douglas W

    2013-09-22

    In this paper, we attempt to understand hunter-gatherer foraging decisions about prey that vary in both the mean and variance of energy return using an expected utility framework. We show that for skewed distributions of energetic returns, the standard linear variance discounting (LVD) model for risk-sensitive foraging can produce quite misleading results. In addition to creating difficulties for the LVD model, the skewed distributions characteristic of hunting returns create challenges for estimating probability distribution functions required for expected utility. We present a solution using a two-component finite mixture model for foraging returns. We then use detailed foraging returns data based on focal follows of individual hunters in Western Australia hunting for high-risk/high-gain (hill kangaroo) and relatively low-risk/low-gain (sand monitor) prey. Using probability densities for the two resources estimated from the mixture models, combined with theoretically sensible utility curves characterized by diminishing marginal utility for the highest returns, we find that the expected utility of the sand monitors greatly exceeds that of kangaroos despite the fact that the mean energy return for kangaroos is nearly twice as large as that for sand monitors. We conclude that the decision to hunt hill kangaroos does not arise simply as part of an energetic utility-maximization strategy and that additional social, political or symbolic benefits must accrue to hunters of this highly variable prey.
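    The argument can be made concrete with a small simulation: a two-component mixture (a point mass at zero for failed hunts plus a lognormal for successful returns) and a concave utility u(x) = x**gamma with gamma < 1 for diminishing marginal utility. The prey parameters below are invented for illustration and merely mimic the qualitative pattern reported (kangaroo mean roughly twice the sand monitor's, yet lower expected utility).

        import numpy as np

        rng = np.random.default_rng(4)

        def mean_and_eu(p_fail, mean_log, sd_log, gamma=0.5, n=200_000):
            # Two-component mixture of returns: zero with probability p_fail,
            # else lognormal; expected utility uses u(x) = x**gamma (concave).
            success = rng.uniform(size=n) > p_fail
            x = np.where(success, rng.lognormal(mean_log, sd_log, n), 0.0)
            return x.mean(), np.mean(x**gamma)

        m_k, eu_k = mean_and_eu(p_fail=0.95, mean_log=11.6, sd_log=0.5)  # kangaroo
        m_s, eu_s = mean_and_eu(p_fail=0.20, mean_log=8.1, sd_log=0.5)   # sand monitor
        print(f"kangaroo:     mean={m_k:.0f}  EU={eu_k:.1f}")
        print(f"sand monitor: mean={m_s:.0f}  EU={eu_s:.1f}")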

  1. Statistical analysis of mesoscale rainfall: Dependence of a random cascade generator on large-scale forcing

    NASA Technical Reports Server (NTRS)

    Over, Thomas M.; Gupta, Vijay K.

    1994-01-01

    Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets. The data show that this dependence can be captured by a one-to-one function. Since the large-scale average rain rate can be diagnosed from the large-scale dynamics, this relationship demonstrates an important linkage between the large-scale atmospheric dynamics and the statistical cascade theory of mesoscale rainfall. Potential application of this research to parameterization of runoff from the land surface and regional flood frequency analysis is briefly discussed, and open problems for further research are presented.
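    A minimal sketch of the kind of one-parameter cascade invoked above: a dyadic multiplicative cascade in one dimension whose mean-one lognormal generator has a single parameter sigma. The construction (and sigma's value) is illustrative, not the paper's fitted generator.

        import numpy as np

        rng = np.random.default_rng(5)

        def dyadic_cascade(levels, sigma):
            # Repeatedly split each cell in two and multiply by i.i.d. mean-one
            # lognormal weights W = exp(sigma*N - sigma**2/2).
            field = np.ones(1)
            for _ in range(levels):
                w = rng.lognormal(-0.5 * sigma**2, sigma, size=2 * field.size)
                field = np.repeat(field, 2) * w
            return field

        field = dyadic_cascade(levels=10, sigma=0.4)   # 2**10 = 1024 cells
        print("mean (conserved on average):", field.mean().round(3))
        print("max/mean (intermittency):", (field.max() / field.mean()).round(1))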

  2. Statistical physics of medical diagnostics: Study of a probabilistic model.

    PubMed

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  3. Statistical physics of medical diagnostics: Study of a probabilistic model

    NASA Astrophysics Data System (ADS)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  4. Modeling Spatial Dependence of Rainfall Extremes Across Multiple Durations

    NASA Astrophysics Data System (ADS)

    Le, Phuong Dong; Leonard, Michael; Westra, Seth

    2018-03-01

    Determining the probability of a flood event in a catchment given that another flood has occurred in a nearby catchment is useful in the design of infrastructure such as road networks that have multiple river crossings. These conditional flood probabilities can be estimated by calculating conditional probabilities of extreme rainfall and then transforming rainfall to runoff through a hydrologic model. Each catchment's hydrological response times are unlikely to be the same, so in order to estimate these conditional probabilities one must consider the dependence of extreme rainfall both across space and across critical storm durations. To represent these types of dependence, this study proposes a new approach for combining extreme rainfall across different durations within a spatial extreme value model using max-stable process theory. This is achieved in a stepwise manner. The first step defines a set of common parameters for the marginal distributions across multiple durations. The parameters are then spatially interpolated to develop a spatial field. Storm-level dependence is represented through the max-stable process for rainfall extremes across different durations. The dependence model shows a reasonable fit between the observed pairwise extremal coefficients and the theoretical pairwise extremal coefficient function across all durations. The study demonstrates how the approach can be applied to develop conditional maps of the return period and return level across different durations.

  5. Ensemble learning of inverse probability weights for marginal structural modeling in large observational datasets.

    PubMed

    Gruber, Susan; Logan, Roger W; Jarrín, Inmaculada; Monge, Susana; Hernán, Miguel A

    2015-01-15

    Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized weights: super learning (SL), an ensemble machine learning approach that relies on V-fold cross validation, and an ensemble learner (EL) that creates a single partition of the data into training and validation sets. Longitudinal data from two multicenter cohort studies in Spain (CoRIS and CoRIS-MD) were analyzed to estimate the mortality hazard ratio for initiation versus no initiation of combined antiretroviral therapy among HIV positive subjects. Both ensemble approaches produced hazard ratio estimates further away from the null, and with tighter confidence intervals, than logistic regression modeling. Computation time for EL was less than half that of SL. We conclude that ensemble learning using a library of diverse candidate algorithms offers an alternative to parametric modeling of inverse probability weights when fitting marginal structural models. With large datasets, EL provides a rich search over the solution space in less time than SL with comparable results. Copyright © 2014 John Wiley & Sons, Ltd.
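    A single-time-point sketch of the stabilized-weight construction that both records above build on (their setting is longitudinal, and the point of the paper is to replace the logistic model below with an ensemble learner such as the super learner); the data-generating numbers are invented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(6)

        n = 5000
        L = rng.normal(size=(n, 1))                       # confounder
        p_treat = 1 / (1 + np.exp(-(0.5 * L[:, 0] - 0.2)))
        A = rng.binomial(1, p_treat)                      # treatment indicator

        # Denominator model P(A | L); numerator is the marginal P(A) (stabilization).
        p_hat = LogisticRegression().fit(L, A).predict_proba(L)
        p_denom = p_hat[np.arange(n), A]                  # prob of observed treatment
        p_num = np.where(A == 1, A.mean(), 1 - A.mean())

        sw = p_num / p_denom                              # stabilized weights
        print("mean weight:", sw.mean().round(3), "max weight:", sw.max().round(2))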

  6. Ensemble learning of inverse probability weights for marginal structural modeling in large observational datasets

    PubMed Central

    Gruber, Susan; Logan, Roger W.; Jarrín, Inmaculada; Monge, Susana; Hernán, Miguel A.

    2014-01-01

    Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized weights: super learning (SL), an ensemble machine learning approach that relies on V-fold cross validation, and an ensemble learner (EL) that creates a single partition of the data into training and validation sets. Longitudinal data from two multicenter cohort studies in Spain (CoRIS and CoRIS-MD) were analyzed to estimate the mortality hazard ratio for initiation versus no initiation of combined antiretroviral therapy among HIV positive subjects. Both ensemble approaches produced hazard ratio estimates further away from the null, and with tighter confidence intervals, than logistic regression modeling. Computation time for EL was less than half that of SL. We conclude that ensemble learning using a library of diverse candidate algorithms offers an alternative to parametric modeling of inverse probability weights when fitting marginal structural models. With large datasets, EL provides a rich search over the solution space in less time than SL with comparable results. PMID:25316152

  7. General simulation algorithm for autocorrelated binary processes.

    PubMed

    Serinaldi, Francesco; Lombardo, Federico

    2017-02-01

    The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
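    A toy version of the change of paradigm described above, assuming none of the paper's specifics: simulate a continuous autocorrelated parent process with beta marginals (here a Gaussian AR(1) pushed through the probability integral transform, rather than the paper's spectrum-based iterative scheme), then draw the binary signal from it.

        import numpy as np
        from scipy.stats import beta, norm

        rng = np.random.default_rng(7)

        n, rho, a, b = 10_000, 0.95, 2.0, 5.0
        z = np.empty(n)
        z[0] = rng.normal()
        for t in range(1, n):                  # Gaussian AR(1) parent process
            z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.normal()

        p = beta.ppf(norm.cdf(z), a, b)        # beta-marginal success probabilities
        x = rng.binomial(1, p)                 # autocorrelated binary signal
        r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
        print("P(x=1) ~", x.mean().round(3), " lag-1 autocorrelation:", r1.round(3))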

  8. Eigenvalue statistics for the sum of two complex Wishart matrices

    NASA Astrophysics Data System (ADS)

    Kumar, Santosh

    2014-09-01

    The sum of independent Wishart matrices, taken from distributions with unequal covariance matrices, plays a crucial role in multivariate statistics, and has applications in the fields of quantitative finance and telecommunication. However, analytical results concerning the corresponding eigenvalue statistics have remained unavailable, even for the sum of two Wishart matrices. This can be attributed to the complicated and rotationally noninvariant nature of the matrix distribution that makes extracting the information about eigenvalues a nontrivial task. Using a generalization of the Harish-Chandra-Itzykson-Zuber integral, we find an exact solution to this problem for the complex Wishart case when one of the covariance matrices is proportional to the identity matrix, while the other is arbitrary. We derive exact and compact expressions for the joint probability density and marginal density of eigenvalues. The analytical results are compared with numerical simulations and we find perfect agreement.

  9. Simulated big sagebrush regeneration supports predicted changes at the trailing and leading edges of distribution shifts

    USGS Publications Warehouse

    Schlaepfer, Daniel R.; Taylor, Kyle A.; Pennington, Victoria E.; Nelson, Kellen N.; Martin, Trace E.; Rottler, Caitlin M.; Lauenroth, William K.; Bradford, John B.

    2015-01-01

    Many semi-arid plant communities in western North America are dominated by big sagebrush. These ecosystems are being reduced in extent and quality due to economic development, invasive species, and climate change. These pervasive modifications have generated concern about the long-term viability of sagebrush habitat and sagebrush-obligate wildlife species (notably greater sage-grouse), highlighting the need for better understanding of the future big sagebrush distribution, particularly at the species' range margins. These leading and trailing edges of potential climate-driven sagebrush distribution shifts are likely to be areas most sensitive to climate change. We used a process-based regeneration model for big sagebrush, which simulates potential germination and seedling survival in response to climatic and edaphic conditions, and tested expectations about current and future regeneration responses at trailing and leading edges that were previously identified using traditional species distribution models. Our results confirmed expectations of increased probability of regeneration at the leading edge and decreased probability of regeneration at the trailing edge below current levels. Our simulations indicated that soil water dynamics at the leading edge became more similar to the typical seasonal ecohydrological conditions observed within the current range of big sagebrush ecosystems. At the trailing edge, an increased winter and spring dryness represented a departure from conditions typically supportive of big sagebrush. Our results highlighted that minimum and maximum daily temperatures as well as soil water recharge and summer dry periods are important constraints for big sagebrush regeneration. Overall, our results confirmed previous predictions, i.e., we see consistent changes in areas identified as trailing and leading edges; however, we also identified potential local refugia within the trailing edge, mostly at sites at higher elevation. Decreasing regeneration probability at the trailing edge underscores the potential futility of efforts to preserve and/or restore big sagebrush in these areas. Conversely, increasing regeneration probability at the leading edge suggests a growing potential for conflicts in management goals between maintaining existing grasslands by preventing sagebrush expansion versus accepting a shift in plant community composition to sagebrush dominance.

  10. Workers on the margin: who drops health coverage when prices rise?

    PubMed

    Okeke, Edward N; Hirth, Richard A; Grazier, Kyle

    2010-01-01

    We revisit the question of price elasticity of employer-sponsored insurance (ESI) take-up by directly examining changes in the take-up of ESI at a large firm in response to exogenous changes in employee premium contributions. We find that, on average, a 10% increase in the employee's out-of-pocket premium increases the probability of dropping coverage by approximately 1%. More importantly, we find heterogeneous impacts: married workers are much more price-sensitive than single employees, and lower-paid workers are disproportionately more likely to drop coverage than higher-paid workers. Elasticity estimates for employees below the 25th percentile of salary distribution in our sample are nearly twice the average.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balderson, Michael, E-mail: michael.balderson@rmp.uhn.ca; Brown, Derek; Johnson, Patricia

    The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic–based model for TCP was used to compare mean TCP values for a population of patients who experience a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT.

  12. A Bayesian-frequentist two-stage single-arm phase II clinical trial design.

    PubMed

    Dong, Gaohong; Shih, Weichung Joe; Moore, Dirk; Quan, Hui; Marcella, Stephen

    2012-08-30

    It is well-known that both frequentist and Bayesian clinical trial designs have their own advantages and disadvantages. To inherit the better properties of these two types of designs, we developed a Bayesian-frequentist two-stage single-arm phase II clinical trial design. This design allows both early acceptance and early rejection of the null hypothesis (H0). Measures of the design properties (for example, the probability of early trial termination and the expected sample size) are derived under both frequentist and Bayesian settings. Moreover, under the Bayesian setting, the upper and lower boundaries are determined from the predictive probability of a successful trial outcome. Given a beta prior and a sample size for stage I, based on the marginal distribution of the responses at stage I, we derived Bayesian Type I and Type II error rates. By controlling both frequentist and Bayesian error rates, the Bayesian-frequentist two-stage design has special features compared with other two-stage designs. Copyright © 2012 John Wiley & Sons, Ltd.
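    The stage-1 decision quantity is easy to sketch: with a beta prior, the predictive distribution of future responses is beta-binomial, so the predictive probability of trial success has a closed form. The design numbers below are hypothetical, not the paper's.

        from scipy.stats import betabinom

        a, b = 1.0, 1.0            # Beta prior on the response rate
        n1, x1 = 15, 6             # stage-1 sample size and observed responders
        n2, r_total = 15, 12       # stage-2 size and final success cutoff

        # Stage-2 responders X2 ~ beta-binomial(n2, a + x1, b + n1 - x1);
        # predictive probability of success = P(X2 >= r_total - x1).
        need = max(r_total - x1, 0)
        pred = 1.0 - betabinom.cdf(need - 1, n2, a + x1, b + n1 - x1)
        print(f"predictive probability of trial success: {pred:.3f}")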

  13. Early Carboniferous (Tournasian-early Visean) global paleogeography, Paleostorm tracts, and the distribution of Waulsortian and Waulsortian-like carbonate mud mounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, D.T. Jr.

    1990-05-01

    Tournasian-early Visean mud mounds (i.e., Waulsortian and Waulsortian-like mounds) are unlike other carbonate buildups in the stratigraphic record because they lack an identifiable frame-building organism. Waulsortian mounds are composed mainly of carbonate mud; Waulsortian-like mounds are mud-rich and contain a significant percentage of skeletal grains, especially crinoids and bryozoa. This study has revealed that all of the reported Waulsortian and Waulsortian-like mounds developed in low paleolatitudes, either on the southern shelf margin of the Laurussian paleocontinent or in the Laurussian interior seaway. Waulsortian and Waulsortian-like mounds are specifically not present in low-latitude regions of other paleocontinents. As Tournasian-early Visean carbonate deposition was widespread in the range of 30°N to 10°S, the very restricted paleogeographic distribution of Waulsortian and Waulsortian-like mound locations suggests a mechanism or set of conditions that effectively limited the distribution of mud mounds. Considering the Tournasian-early Visean distribution of paleocontinents and the principles that govern the movement of modern hurricanes, tropical storms, and winter storms, the tracts of hurricanes, tropical storms, and winter storms probably crossed all main submerged paleocontinental areas except the southern Laurussian shelf margin and the Laurussian interior seaway, the two areas where mud mounds developed. The lack of storm energy in these two large areas of Laurussia provided long-term stability and thus enhanced the growth prospects of the frame-deficient Waulsortian and Waulsortian-like mud mounds. Lack of extensive periodic wave reworking and other storm-induced devastation helps to account for enigmatic features such as general mound symmetry, great size, high depositional relief (as much as 220 m), and side steepness (as steep as 50°).

  14. WE-AB-209-08: Novel Beam-Specific Adaptive Margins for Reducing Organ-At-Risk Doses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsang, H; Kamerling, CP; Ziegenhein, P

    2016-06-15

    Purpose: Current practice of using 3D margins in radiotherapy with high-energy photon beams provides larger-than-required target coverage. According to the photon depth-dose curve, target displacements in the beam direction result in minute changes in dose delivered. We exploit this behavior by generating margins on a per-beam basis which simultaneously account for the relative distance of the target and adjacent organs-at-risk (OARs). Methods: For each beam, we consider only geometrical uncertainties of the target location perpendicular to the beam direction. By weighting voxels based on their proximity to an OAR, we generate adaptive margins that yield similar overall target coverage probability and reduced OAR dose-burden, at the expense of increased target volume. Three IMRT plans, using 3D margins and 2D per-beam margins with and without adaptation, were generated for five prostate patients with a prescription dose Dpres of 78 Gy in 2 Gy fractions using identical optimisation constraints. Systematic uncertainties of 1.1, 1.1, 1.5 mm in the LR, SI, and AP directions, respectively, and 0.9, 1.1, 1.0 mm for the random uncertainties, were assumed. A verification tool was employed to simulate the effects of systematic and random errors using a population size of 50,000. The fraction of the population that satisfies or violates a given DVH constraint was used for comparison. Results: We observe similar target coverage across all plans, with at least 97.5% of the population meeting the D98%>95%Dpres constraint. When looking at the probability of the population receiving D5<70 Gy for the rectum, we observed median absolute increases of 23.61% (range, 2.15%–27.85%) and 6.97% (range, 0.65%–17.76%) using per-beam margins with and without adaptation, respectively, relative to using 3D margins. Conclusion: We observed sufficient and similar target coverage using per-beam margins. By adapting each per-beam margin away from an OAR, we can further reduce OAR dose, at the cost of irradiating more of the less critical tissue, without significantly lowering target coverage probability. This work is supported by Cancer Research UK under Programme C33589/A19908. Research at ICR is also supported by Cancer Research UK under Programme C33589/A19727 and NHS funding to the NIHR Biomedical Research Centre at RMH and ICR.

  15. Robust versus consistent variance estimators in marginal structural Cox models.

    PubMed

    Enders, Dirk; Engel, Susanne; Linder, Roland; Pigeot, Iris

    2018-06-11

    In survival analyses, inverse-probability-of-treatment (IPT) and inverse-probability-of-censoring (IPC) weighted estimators of parameters in marginal structural Cox models are often used to estimate treatment effects in the presence of time-dependent confounding and censoring. In most applications, a robust variance estimator of the IPT and IPC weighted estimator is calculated leading to conservative confidence intervals. This estimator assumes that the weights are known rather than estimated from the data. Although a consistent estimator of the asymptotic variance of the IPT and IPC weighted estimator is generally available, applications and thus information on the performance of the consistent estimator are lacking. Reasons might be a cumbersome implementation in statistical software, which is further complicated by missing details on the variance formula. In this paper, we therefore provide a detailed derivation of the variance of the asymptotic distribution of the IPT and IPC weighted estimator and explicitly state the necessary terms to calculate a consistent estimator of this variance. We compare the performance of the robust and consistent variance estimators in an application based on routine health care data and in a simulation study. The simulation reveals no substantial differences between the 2 estimators in medium and large data sets with no unmeasured confounding, but the consistent variance estimator performs poorly in small samples or under unmeasured confounding, if the number of confounders is large. We thus conclude that the robust estimator is more appropriate for all practical purposes. Copyright © 2018 John Wiley & Sons, Ltd.

  16. Relative source allocation of TDI to drinking water for derivation of a criterion for chloroform: a Monte-Carlo and multi-exposure assessment.

    PubMed

    Niizuma, Shun; Matsui, Yoshihiko; Ohno, Koichi; Itoh, Sadahiko; Matsushita, Taku; Shirasaki, Nobutaka

    2013-10-01

    Drinking water quality standard (DWQS) criteria for chemicals for which there is a threshold for toxicity are derived by allocating a fraction of tolerable daily intake (TDI) to exposure from drinking water. We conducted physiologically based pharmacokinetic model simulations for chloroform and have proposed an equation for total oral-equivalent potential intake via three routes (oral ingestion, inhalation, and dermal exposure), the biologically effective doses of which were converted to oral-equivalent potential intakes. The probability distributions of total oral-equivalent potential intake in Japanese people were estimated by Monte Carlo simulations. Even when the chloroform concentration in drinking water equaled the current DWQS criterion, there was sufficient margin between the intake and the TDI: the probability that the intake exceeded the TDI was below 0.1%. If a criterion that the 95th percentile estimate equals the TDI is regarded as both providing protection to highly exposed persons and leaving a reasonable margin of exposure relative to the TDI, then the chloroform drinking water criterion could be a concentration of 0.11 mg/L. This implies a daily intake equal to 34% of the TDI allocated to the oral intake (2 L/d) of drinking water for typical adults. For highly exposed persons, inhalation exposure via evaporation from water contributed 53% of the total intake, whereas dermal absorption contributed only 3%. Copyright © 2013 Elsevier Inc. All rights reserved.
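    The population-level comparison against the TDI can be sketched as a simple Monte Carlo over per-route oral-equivalent intakes; the lognormal parameters and TDI value below are invented placeholders, not the paper's PBPK-derived quantities.

        import numpy as np

        rng = np.random.default_rng(8)

        n = 100_000   # simulated individuals; intakes in arbitrary units
        oral = rng.lognormal(0.0, 0.4, n)
        inhalation = rng.lognormal(0.3, 0.6, n)   # dominant route at the high end
        dermal = rng.lognormal(-2.5, 0.5, n)      # minor route

        total = oral + inhalation + dermal
        tdi = 12.9                                # illustrative TDI in the same units
        p95 = np.percentile(total, 95)
        print(f"95th percentile intake: {p95:.2f}")
        print(f"P(intake > TDI): {(total > tdi).mean():.4f}")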

  17. The stretching amplitude and thermal regime of the lithosphere in the nonvolcanic passive margin of Antarctica in the Mawson Sea region

    NASA Astrophysics Data System (ADS)

    Galushkin, Yu. I.; Leitchenkov, G. L.; Guseva, Yu. B.; Dubinin, E. P.

    2018-01-01

    The burial history and thermal evolution of the lithosphere within the passive nonvolcanic Antarctic margin in the region of the Mawson Sea are numerically reconstructed for the margin areas along seismic profile 5909 with the use of the GALO basin modeling system. The amplitudes of lithosphere stretching at the different stages of continental rifting, which took place from 160 to 90 Ma ago, are calculated from geophysical estimates of the thickness of the consolidated crust and from tectonic analysis of the variations in the thickness of the sedimentary cover and sea depths during the evolution of the basin. It is hypothesized that the formation of the present sedimentary sequence in the studied region of the Antarctic margin began 140 Ma ago on a basement that was thinned by a factor of 1.6 to 4.5 during the first episode of margin stretching (160-140 Ma) under a fairly high heat flux. The reconstruction of the thermal regime of the lithosphere has shown that the mantle rocks could occur within the temperature interval of serpentinization and simultaneously within the time interval of lithospheric stretching (−160 Ma < t < −90 Ma) only within separate segments of profile 5909 in the Mawson Sea. Calculations of the rock strength distribution with depth, taking the section at pseudowell 4 as an example, show that a significant part of the crust and uppermost mantle falls in the region of brittle deformation during the most recent period of lithosphere stretching (−104 to −90 Ma ago). The younger basin segments of profile 5909 in the region of pseudowells 5 and 6 are characterized by a high heat flux, and the formation of through-thickness brittle fractures in these zones is less probable. However, serpentinization could take place in these areas, as in the other margin segments, at the stage of pre-sedimentation ultra-slow basement stretching.

  18. Realistic respiratory motion margins for external beam partial breast irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conroy, Leigh; Quirk, Sarah; Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4

    Purpose: Respiratory margins for partial breast irradiation (PBI) have largely been based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from 28 clinical 3D conformal radiotherapy (3DCRT) PBI treatment plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With the patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was described. It was found that the currently used respiratory margin of 5 mm in partial breast irradiation may be overly conservative for many 3DCRT PBI patients. Amplitude alone was found to be insufficient to determine patient-specific margins: individual respiratory trace shape and baseline drift both contributed to the dosimetric target coverage. With respiratory coaching, individualized respiratory margins smaller than the full extent of motion could reduce planning target volumes while ensuring adequate coverage under respiratory motion.
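    The convolution step described in the Methods lends itself to a compact numerical sketch. The profile, motion trace, and amplitude below are invented stand-ins for the clinical AP dose profiles and volunteer respiratory data, so the printed margin will not reproduce the paper's population values; only the procedure (blur the profile with a motion PDF, then measure the shift of the 95% isodose edge) follows the abstract.

    ```python
    import numpy as np

    # Hypothetical 1-D AP dose profile (relative dose vs. position in mm)
    x = np.arange(-50, 50, 0.5)                      # mm
    dose = 1.0 / (1.0 + np.exp((np.abs(x) - 30) / 2))  # flat field with sigmoid penumbra

    # Idealized sinusoidal respiratory motion -> probability density of position
    A = 5.0                                          # mm, peak-to-peak amplitude
    t = np.linspace(0, 2 * np.pi, 10_000)
    pos = 0.5 * A * np.cos(t)
    pdf, _ = np.histogram(pos, bins=x.size, range=(x[0], x[-1]), density=True)
    pdf /= pdf.sum()                                 # normalize so dose scale is preserved

    blurred = np.convolve(dose, pdf, mode="same")    # dose profile blurred by motion

    # Required margin: shift of the 95% isodose edge between original and blurred
    iso = 0.95 * dose.max()
    edge_orig = x[dose >= iso].max()
    edge_blur = x[blurred >= iso].max()
    print("margin at 95% isodose:", edge_orig - edge_blur, "mm")
    ```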

  19. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    NASA Astrophysics Data System (ADS)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by a probability distribution, which is usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given the available data and other information, and thus map out regions of high or low probability in model-parameter space. However, such methods converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detecting convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems that requires no sampling at all. The method uses a 2-D hidden Markov model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only, an assumption referred to as 'localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location, as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image, which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of the posterior marginal probability distributions over each variable, so these do not need to be estimated from samples as in MC methods. On a 2-D test example the method is shown to outperform previous methods significantly, and at a fraction of the computational cost. In many foreseeable applications there are therefore no serious impediments to extending the method to 3-D spatial models.
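    As a small illustration of sampling-free marginal posteriors, the sketch below runs the forward-backward algorithm on a 1-D chain of cells; the paper's method generalizes this idea to a 2-D hidden Markov model, and all transition and likelihood numbers here are invented.

    ```python
    import numpy as np

    K, N = 3, 6                                   # facies types, cells along a line
    P = np.array([[0.80, 0.15, 0.05],
                  [0.10, 0.80, 0.10],
                  [0.05, 0.15, 0.80]])            # prior spatial-correlation (transition) matrix
    prior0 = np.full(K, 1 / K)

    # Localized likelihoods: p(observation in cell i | facies k), one row per cell
    L = np.array([[0.70, 0.20, 0.10],
                  [0.60, 0.30, 0.10],
                  [0.20, 0.50, 0.30],
                  [0.10, 0.60, 0.30],
                  [0.10, 0.30, 0.60],
                  [0.05, 0.25, 0.70]])

    alpha = np.zeros((N, K)); beta = np.ones((N, K))
    alpha[0] = prior0 * L[0]
    for i in range(1, N):                          # forward pass
        alpha[i] = L[i] * (alpha[i - 1] @ P)
    for i in range(N - 2, -1, -1):                 # backward pass
        beta[i] = P @ (L[i + 1] * beta[i + 1])

    post = alpha * beta
    post /= post.sum(axis=1, keepdims=True)        # exact marginal posteriors, no sampling
    print(np.round(post, 3))
    ```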

  20. A Bivariate Mixed Distribution with a Heavy-tailed Component and its Application to Single-site Daily Rainfall Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chao; Singh, Vijay P.; Mishra, Ashok K.

    2013-02-06

    This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. The contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, its utility for single-site daily rainfall simulation is presented here. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. The analyses establish that, overall, the developed generator is capable of reproducing the characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of the available observations. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in hydrologic planning situations where rare rainfall events are of great importance.
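    A stripped-down sketch of the generator's core idea follows: daily rainfall as a first-order Markov process whose day-to-day dependence enters through a copula and whose marginal is a mixed (zero-inflated) distribution. For brevity this uses a Gaussian copula and a gamma body, whereas the paper selects among 10 copula families and uses a hybrid, heavy-tailed marginal; all parameter values are invented.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    p_wet, rho, ndays = 0.35, 0.6, 365
    shape, scale = 0.7, 8.0                        # gamma parameters (invented)

    def marginal_ppf(u):
        """Inverse CDF of the mixed marginal: P(X = 0) = 1 - p_wet, gamma beyond."""
        wet = u > (1 - p_wet)
        out = np.zeros_like(u)
        out[wet] = stats.gamma.ppf((u[wet] - (1 - p_wet)) / p_wet, shape, scale=scale)
        return out

    # Gaussian-copula AR(1) in latent space -> uniform scores -> mixed marginal
    z = np.empty(ndays); z[0] = rng.standard_normal()
    for t in range(1, ndays):
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    rain = marginal_ppf(stats.norm.cdf(z))
    print("wet-day fraction:", np.mean(rain > 0), " max (mm):", rain.max().round(1))
    ```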

  1. Integrated drought risk assessment of multi-hazard-affected bodies based on copulas in the Taoerhe Basin, China

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Zhang, Jiquan; Guo, Enliang; Alu, Si; Li, Danjun; Ha, Si; Dong, Zhenhua

    2018-02-01

    Along with global warming, drought disasters are occurring more frequently and are seriously affecting normal life and food security in China. Drought risk assessments are necessary to provide support for local governments. This study aimed to establish an integrated drought risk model based on the relation curve between drought joint probabilities and the drought losses of multi-hazard-affected bodies. First, drought characteristics, including duration and severity, were identified from the 1953-2010 precipitation anomaly series in the Taoerhe Basin using run theory, and their marginal distributions were fitted by exponential and gamma distributions, respectively. Then, drought duration and severity were coupled to construct a joint probability distribution based on a copula function. We used the EPIC (Environmental Policy Integrated Climate) model to simulate maize yield, and historical data to calculate the loss rates of agriculture, industry, and animal husbandry in the study area. Next, we constructed vulnerability curves. Finally, the spatial distributions of drought risk for 10-, 20-, and 50-year return periods were mapped using inverse distance weighting. Our results indicate that the spatial distributions for the three return periods are consistent. The highest drought risk is in Ulanhot, where both the duration and the severity were highest. This means that higher drought risk corresponds to longer drought duration and larger drought severity, providing useful information for drought and water resource management. For the 10-, 20-, and 50-year return periods, the drought risk values ranged from 0.41 to 0.53, 0.45 to 0.59, and 0.50 to 0.67, respectively. Therefore, as the return period increases, the drought risk increases.
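    The joint return period computation implied by the abstract can be written in a few lines. The marginal families match those named above (exponential duration, gamma severity) joined by a Gumbel-Hougaard copula, but every parameter value below is an invented placeholder, not a fitted value from the Taoerhe Basin.

    ```python
    import numpy as np
    from scipy import stats

    lam = 1 / 4.0        # exponential rate for duration (invented, 1/months)
    a, b = 1.5, 2.0      # gamma shape/scale for severity (invented)
    theta = 2.0          # Gumbel-Hougaard dependence parameter (invented)
    mu = 1.8             # mean interarrival time of droughts in years (invented)

    def gumbel_copula(u, v, theta):
        return np.exp(-((-np.log(u))**theta + (-np.log(v))**theta)**(1 / theta))

    def joint_return_period(d, s):
        u = stats.expon.cdf(d, scale=1 / lam)           # F_D(d)
        v = stats.gamma.cdf(s, a, scale=b)              # F_S(s)
        p_and = 1 - u - v + gumbel_copula(u, v, theta)  # P(D > d, S > s)
        return mu / p_and

    print("T(d = 6 months, s = 5):", round(joint_return_period(6, 5), 1), "years")
    ```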

  2. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.

    PubMed

    Yiu, Sean; Tom, Brian Dm

    2017-01-01

    Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. In practice, however, the high-dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicate model fitting, so only non-standard, computationally intensive procedures based on simulating the marginal likelihood have been proposed so far. In this paper, we describe an efficient method of implementation by demonstrating how the high-dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high-dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and for when it is of interest to directly model the overall marginal mean. The methodology is applied to a psoriatic arthritis data set concerning functional disability.
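    The computational core of the proposal, replacing a high-dimensional integral over a Gaussian with a single multivariate normal CDF evaluation, can be illustrated directly; the dimension and covariance below are arbitrary.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(2)
    d = 8
    A = rng.standard_normal((d, d))
    cov = A @ A.T + d * np.eye(d)                 # arbitrary positive-definite covariance
    upper = rng.uniform(0.5, 1.5, d)

    # P(Z_1 <= u_1, ..., Z_d <= u_d) evaluated directly as a multivariate normal CDF
    p_cdf = multivariate_normal(mean=np.zeros(d), cov=cov).cdf(upper)

    # Naive simulation of the same quantity, for comparison
    z = rng.multivariate_normal(np.zeros(d), cov, size=200_000)
    p_sim = np.mean(np.all(z <= upper, axis=1))
    print("CDF evaluation:", p_cdf, " simulation:", p_sim)
    ```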

  3. Stochastic static fault slip inversion from geodetic data with non-negativity and bounds constraints

    NASA Astrophysics Data System (ADS)

    Nocquet, J.-M.

    2018-04-01

    Although the surface displacements observed by geodesy are linear combinations of slip at faults in an elastic medium, determining the spatial distribution of fault slip remains an ill-posed inverse problem. A widely used approach to circumvent the ill-posedness of the inversion is to add regularization constraints, in terms of smoothing and/or damping, so that the linear system becomes invertible. However, the choice of regularization parameters is often arbitrary and sometimes leads to significantly different results. Furthermore, the resolution analysis is usually empirical and cannot be made independently of the regularization. The stochastic approach to inverse problems (Tarantola & Valette 1982; Tarantola 2005) provides a rigorous framework in which the a priori information about the sought parameters is combined with the observations in order to derive posterior probabilities of the unknown parameters. Here, I investigate an approach where the prior probability density function (pdf) is a multivariate Gaussian function, with single truncation to impose positivity of slip, or double truncation to impose positivity and upper bounds on slip for interseismic modeling. I show that the joint posterior pdf is similar to the linear untruncated Gaussian case and can be expressed as a truncated multivariate normal (TMVN) distribution. The TMVN form can then be used to obtain semi-analytical formulas for the single, two-dimensional, or n-dimensional marginal pdfs. The semi-analytical formula involves the product of a Gaussian with an integral term that can be evaluated using recent developments in TMVN probability calculations (e.g. Genz & Bretz 2009). The posterior mean and covariance can also be efficiently derived. I show that the maximum posterior (MAP) model can be obtained using a non-negative least-squares algorithm (Lawson & Hanson 1974) for the single truncated case, or using the bounded-variable least-squares algorithm (Stark & Parker 1995) for the double truncated case. I also show that the case of independent uniform priors can be approximated using TMVN. Numerical equivalence to Bayesian inversions using Monte Carlo Markov Chain (MCMC) sampling is demonstrated for a synthetic example and for a real case of interseismic modeling in Central Peru. The TMVN method overcomes several limitations of the Bayesian approach using MCMC sampling. First, the required computer power is greatly reduced. Second, unlike Bayesian MCMC-based approaches, the marginal pdfs, mean, variance, and covariance are obtained independently of one another. Third, the probability and cumulative density functions can be obtained at any density of points. Finally, determining the MAP model is extremely fast.
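    For the single-truncation case, the abstract's observation that the MAP solves a non-negative least-squares problem can be sketched as follows; the Green's functions, data, and prior scales are synthetic stand-ins for a real geodetic problem.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(3)
    n_obs, n_patch = 40, 12
    G = rng.standard_normal((n_obs, n_patch))           # stand-in elastic Green's functions
    true_slip = np.maximum(rng.standard_normal(n_patch), 0.0)
    d = G @ true_slip + 0.05 * rng.standard_normal(n_obs)

    sigma_d, sigma_m = 0.05, 1.0                        # data / prior standard deviations
    # Stack the data misfit and a zero-mean Gaussian prior into one LS system;
    # NNLS then enforces slip >= 0, giving the MAP of the truncated posterior.
    A = np.vstack([G / sigma_d, np.eye(n_patch) / sigma_m])
    b = np.concatenate([d / sigma_d, np.zeros(n_patch)])
    map_slip, _ = nnls(A, b)
    print(np.round(map_slip, 2))
    ```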

  4. Monte-Carlo model development for evaluation of current clinical target volume definition for heterogeneous and hypoxic glioblastoma.

    PubMed

    Moghaddasi, L; Bezak, E; Harriss-Phillips, W

    2016-05-07

    Clinical target volume (CTV) determination may be complex and subjective. In this work, a microscopic-scale tumour model was developed to evaluate current CTV practices in glioblastoma multiforme (GBM) external radiotherapy. Previously, a Geant4 cell-based dosimetry model was developed to calculate the dose deposited in individual GBM cells. Microscopic extension probability (MEP) models were then developed using Matlab-2012a. The results of the cell-based dosimetry model and the MEP models were combined to calculate survival fractions (SF) for CTV margins of 2.0 and 2.5 cm. In the current work, oxygenation and heterogeneous radiosensitivity profiles were incorporated into the GBM model. Genetic heterogeneity was modelled using a range of α/β values (linear-quadratic model parameters) associated with different GBM cell lines. These values were distributed among the cells randomly, drawn from a Gaussian-weighted sample of α/β values. Cellular oxygen pressure was distributed randomly, drawn from a sample weighted according to profiles obtained from the literature. Three types of GBM models were analysed: homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic. The SF in different regions of the tumour model and the effect of extending the CTV margin from 2.0 to 2.5 cm on the SF were investigated for three MEP models. The SF within the beam increased by up to three and two orders of magnitude following the incorporation of heterogeneous radiosensitivities and hypoxia, respectively, in the GBM model. However, the total SF was shown to be dominated by the presence of tumour cells in the penumbra region and, to a lesser extent, by genetic heterogeneity and hypoxia. CTV extension by 0.5 cm reduced the SF by a maximum of 78.6 ± 3.3%, 78.5 ± 3.3%, and 77.7 ± 3.1% for homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic GBMs, respectively. A Monte Carlo model was thus developed to quantitatively evaluate the SF for genetically heterogeneous and hypoxic GBM with two CTV margins and three MEP distributions. The results suggest that photon therapy may not provide a cure for hypoxic and genetically heterogeneous GBM. However, extending the CTV margin by 0.5 cm could be beneficial in delaying the recurrence time for this tumour type, owing to the significant increase in tumour cell irradiation.
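    A minimal sketch of the heterogeneous linear-quadratic survival calculation is given below; the α/β sample, hypoxic fraction, and oxygen-enhancement scaling are invented placeholders rather than the model's literature-derived profiles.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_cells, dose = 1_000_000, 2.0           # number of cells; dose per fraction (Gy)

    # Gaussian-weighted radiosensitivity sample per cell (values invented)
    alpha = np.clip(rng.normal(0.25, 0.08, n_cells), 0.05, None)    # Gy^-1
    ab_ratio = np.clip(rng.normal(10.0, 3.0, n_cells), 2.0, None)   # Gy
    beta = alpha / ab_ratio                                         # Gy^-2

    # Simple oxygen-enhancement scaling: hypoxic cells are less radiosensitive
    hypoxic = rng.random(n_cells) < 0.2      # assumed hypoxic fraction
    oer = np.where(hypoxic, 2.5, 1.0)
    sf_cell = np.exp(-(alpha / oer) * dose - (beta / oer**2) * dose**2)

    print("mean survival fraction per 2 Gy fraction:", sf_cell.mean())
    ```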

  5. Functional anatomy controls ion distribution in banana leaves: significance of Na+ seclusion at the leaf margins.

    PubMed

    Shapira, Or; Khadka, Sudha; Israeli, Yair; Shani, Uri; Schwartz, Amnon

    2009-05-01

    Typical salt stress symptoms appear in banana (Musa sp., cv. 'Grand Nain' AAA) only along the leaf margins. Mineral analysis of the dry matter of plants treated with increasing concentrations of KCl or NaCl revealed significant accumulation of Na+, but not of K+ or Cl-, in the affected leaf margins. The differential distribution of the three ions suggests that water and ion movement out of the xylem is mostly symplastic and that, in contrast to K+ and Cl-, there exists considerable resistance to the flow of Na+ from the xylem to the adjacent mesophyll and epidermis. The parallel veins of the lamina are enclosed by several layers of bundle sheath parenchyma; in contrast, the large vascular bundle that encircles the entire lamina, and into which the parallel veins merge, lacks a complete bundle sheath. Xylem sap containing a high concentration of Na+ is 'pulled' by water tension from the marginal vein back into the adjacent mesophyll without having to cross a layer of parenchyma tissue. When the marginal vein was dissected from the lamina, the pattern of Na+ distribution in the margins changed markedly. The distinct anatomy of the marginal vein plays a major role in the accumulation of Na+ in the margins, with the latter serving as a 'dumping site' for toxic molecules.

  6. INTEGRATION OF RELIABILITY WITH MECHANISTIC THERMALHYDRAULICS: REPORT ON APPROACH AND TEST PROBLEM RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. S. Schroeder; R. W. Youngblood

    The Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability Program is developing simulation-based methods and tools for analyzing safety margin from a modern perspective [1]. There are multiple definitions of 'margin.' One class of definitions defines margin in terms of the distance between a point estimate of a given performance parameter (such as peak clad temperature) and a point-value acceptance criterion defined for that parameter (such as 2200 F). The present perspective is that margin relates to the probability of failure, and not just to the distance between a nominal operating point and a criterion. In this work, margin is characterized through a probabilistic analysis of the 'loads' imposed on systems, structures, and components, and their 'capacity' to resist those loads without failing. Given the probabilistic load and capacity spectra, one can assess the probability that load exceeds capacity, leading to component failure. Within the project, we refer to a plot of these probabilistic spectra as 'the logo.' Refer to Figure 1 for a notional illustration. The implications of referring to 'the logo' are (1) that RISMC is focused on being able to analyze loads and capacities probabilistically, and (2) that calling it 'the logo' tacitly acknowledges that it is a highly simplified picture: meaningful analysis of a given component failure mode may require the development of probabilistic spectra for multiple physical parameters, and in many practical cases 'load' and 'capacity' will not vary independently.
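    The load-versus-capacity picture ('the logo') reduces to a short calculation once distributions are assumed; the lognormal load and normal capacity below are illustrative choices only, and the semi-analytic check applies only under the independence assumption that the abstract itself cautions may not hold in practice.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 1_000_000
    # Illustrative spectra (values invented): e.g. peak clad temperature in F
    load = rng.lognormal(mean=np.log(1800), sigma=0.08, size=n)
    capacity = rng.normal(loc=2200, scale=60, size=n)

    p_fail = np.mean(load > capacity)
    print("P(load exceeds capacity) =", p_fail)

    # Assuming independence, the same quantity is the integral
    # P(fail) = integral of f_load(x) * (1 - F_capacity(x)) dx
    x = np.linspace(1200, 2800, 4001)
    f_load = stats.lognorm.pdf(x, s=0.08, scale=1800)
    p_int = np.trapz(f_load * (1 - stats.norm.cdf(x, 2200, 60)), x)
    print("semi-analytic check:", p_int)
    ```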

  7. Gaussianization for fast and accurate inference from cosmological data

    NASA Astrophysics Data System (ADS)

    Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.

    2016-06-01

    We present a method to transform multivariate unimodal non-Gaussian posterior probability densities into approximately Gaussian ones via non-linear mappings, such as Box-Cox transformations and generalizations thereof. This permits an analytical reconstruction of the posterior from a point sample, such as a Markov chain, and simplifies the subsequent joint analysis with other experiments. In this way, a multivariate posterior density can be reported efficiently, by compressing the information contained in Markov Chain Monte Carlo samples. Further, the model evidence integral (i.e., the marginal likelihood) can be computed analytically. This method is analogous to the search for normal parameters in the cosmic microwave background, but is more general. The search for the optimally Gaussianizing transformation is performed computationally through a maximum-likelihood formalism; its quality can be judged by how well the credible regions of the posterior are reproduced. We demonstrate that our method outperforms kernel density estimates in this respect. Further, we select marginal posterior samples from Planck data with several distinct, strongly non-Gaussian features, and verify the reproduction of the marginal contours. To demonstrate evidence computation, we Gaussianize the joint distribution of data from weak lensing and baryon acoustic oscillations, for different cosmological models, and find a preference for flat Λ cold dark matter (ΛCDM). Comparing to values computed with the Savage-Dickey density ratio and with Population Monte Carlo, we find good agreement of our method within the spread of the other two.
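    A one-dimensional sketch of the Gaussianization idea, using the plain Box-Cox transform with its maximum-likelihood λ, is shown below; the paper works with multivariate posteriors and generalized transformations, and the gamma "posterior" sample here is synthetic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    sample = rng.gamma(shape=2.0, scale=1.5, size=20_000)   # skewed "posterior" draws

    transformed, lam = stats.boxcox(sample)                  # ML estimate of lambda
    print("optimal lambda:", round(lam, 3))
    print("skewness before/after:", round(stats.skew(sample), 3),
          round(stats.skew(transformed), 3))

    # The Gaussian approximation in transformed space gives an analytic density;
    # credible intervals map back through the inverse Box-Cox transform.
    mu, sd = transformed.mean(), transformed.std()
    lo, hi = stats.norm.ppf([0.025, 0.975], mu, sd)
    inv = lambda y: (lam * y + 1)**(1 / lam) if lam != 0 else np.exp(y)
    print("95% interval in original space:", inv(lo).round(2), inv(hi).round(2))
    ```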

  8. Cluster Adjusted Regression for Displaced Subject Data (CARDS): Marginal Inference under Potentially Informative Temporal Cluster Size Profiles

    PubMed Central

    Bible, Joe; Beck, James D.; Datta, Somnath

    2016-01-01

    Ignorance of the mechanisms responsible for the availability of information presents an unusual problem for analysts, as it is often the case that the availability of information depends on the outcome. In the analysis of clustered data, we say that a condition of informative cluster size (ICS) exists when the inference drawn from the analysis of hypothetical balanced data differs from the inference drawn from the observed data. Much work has been done to address the analysis of clustered data with informative cluster size; examples include Inverse Probability Weighting (IPW), Cluster Weighted Generalized Estimating Equations (CWGEE), and Doubly Weighted Generalized Estimating Equations (DWGEE). When cluster size changes with time, i.e., the data set possesses temporally varying cluster sizes (TVCS), these methods may produce biased inference for the underlying marginal distribution of interest. We propose a new marginalization that may be appropriate for addressing clustered longitudinal data with TVCS. The principal motivation for the present work is the analysis of the periodontal data collected by Beck et al. (1997, Journal of Periodontal Research 6, 497-505). Longitudinal periodontal data often exhibit both ICS and TVCS, as the number of teeth possessed by participants at the onset of the study is not constant, and teeth as well as individuals may be displaced throughout the study. PMID:26682911

  9. Detrital Zircons Split Sibumasu in East Gondwana

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Chung, S. L.

    2017-12-01

    It is widely accepted that Sibumasu developed as a united terrane and originated from the NW Australian margin in East Gondwana. Here we report new detrital zircon U-Pb-Hf isotopic data from Sumatra that, in combination with literature data, challenge and refute this long-held view. In particular, the East and West Sumatra terranes share nearly identical Precambrian to Paleozoic detrital zircon age distributions and Hf isotopes, indicating a common provenance and origin. The Sumatra detrital zircons exhibit a prominent population of ca. 1170-1070 Ma, indistinguishable from those of the Lhasa and West Burma terranes, with detritus most probably sourced from western Australia. By contrast, Sibuma (Sibumasu excluding Sumatra) detrital zircons display a prevailing population of ca. 980-935 Ma, strongly resembling those of the western Qiangtang terrane, with detrital materials most likely derived from Greater India and the Himalayas. Such markedly distinct detrital zircon age profiles between Sumatra and Sibuma require disparate sources and origins, prompting the disintegration of the widely adopted, but outdated, term Sibumasu and inviting a new configuration of East Gondwana in the early Paleozoic, with Sumatra and West Burma lying outboard of the Lhasa terrane on the NW Australian margin and Sibuma situated on the northern Greater Indian margin. Further investigations are needed to establish the precise rifting and drifting histories of Sumatra and Sibuma, as two separate terranes, during the breakup of Gondwana.

  10. Estimation of the marginal effect of regular drug use on multiple sclerosis in the Iranian population.

    PubMed

    Abdollahpour, Ibrahim; Nedjat, Saharnaz; Mansournia, Mohammad Ali; Schuster, Tibor

    2018-01-01

    There are only a few reports on the role of lifetime drug or substance use in multiple sclerosis (MS) etiology. In this study, we investigated the potential effect of drug or substance exposure on the onset of MS diagnosis. We conducted a population-based incident case-control study in Tehran. Cases (n = 547) were persons aged 15-50 years with MS, identified from the Iranian Multiple Sclerosis Society (IMSS) register between August 7, 2013, and November 17, 2015. Population-based controls (n = 1057) were aged 15-50 years and were recruited by random-digit telephone dialing. Inverse-probability-of-treatment weighting (IPTW) using two sets of propensity scores (PSs) was used to estimate marginal incidence odds ratios (ORs) for MS, contrasting pre-specified substance use. The estimated marginal OR was 6.03 (95% confidence interval: 3.54-10.3, using weights trimmed at the 95th percentile of the stabilized weight distribution) in both IPTW analyses, comparing lifetime substance use (opioids, cannabis, inhalants, hallucinogens, and stimulants) at least once monthly during a period of six months or longer vs. no such history of drug use. Subject to the limitations of causal claims based on case-control studies, this study suggests that monthly drug or substance use over a period of at least six consecutive months may increase the risk of MS by a factor of 3.5 or higher.
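    The IPTW recipe described here (propensity score, stabilized weights, trimming at the 95th percentile, weighted outcome model) can be sketched on synthetic data as follows; the data-generating numbers are invented and statsmodels is used purely for convenience.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 2000
    x = rng.standard_normal((n, 2))                        # confounders
    ps_true = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.3 * x[:, 1])))
    a = rng.binomial(1, ps_true)                           # exposure (substance use)
    logit_y = -2 + 1.2 * a + 0.6 * x[:, 0]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_y)))        # outcome (MS)

    # Propensity score model, stabilized weights, trimming at the 95th percentile
    ps = sm.Logit(a, sm.add_constant(x)).fit(disp=0).predict()
    w = np.where(a == 1, a.mean() / ps, (1 - a.mean()) / (1 - ps))
    w = np.minimum(w, np.percentile(w, 95))

    # Weighted marginal structural (outcome) model; weights passed as freq_weights
    msm = sm.GLM(y, sm.add_constant(a), family=sm.families.Binomial(),
                 freq_weights=w).fit()
    print("marginal OR:", np.exp(msm.params[1]).round(2))
    ```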

  11. Quantifying uncertainty in geoacoustic inversion. II. Application to broadband, shallow-water data.

    PubMed

    Dosso, Stan E; Nielsen, Peter L

    2002-01-01

    This paper applies the new method of fast Gibbs sampling (FGS) to estimate the uncertainties of seabed geoacoustic parameters in a broadband, shallow-water acoustic survey, with the goal of interpreting the survey results and validating the method for experimental data. FGS applies a Bayesian approach to geoacoustic inversion based on sampling the posterior probability density to estimate marginal probability distributions and parameter covariances. This requires knowledge of the statistical distribution of the data errors, including both measurement and theory errors, which is generally not available. Invoking the simplifying assumption of independent, identically distributed Gaussian errors allows a maximum-likelihood estimate of the data variance and leads to a practical inversion algorithm. However, it is necessary to validate these assumptions, i.e., to verify that the parameter uncertainties obtained represent meaningful estimates. To this end, FGS is applied to a geoacoustic experiment carried out at a site off the west coast of Italy where previous acoustic and geophysical studies have been performed. The parameter uncertainties estimated via FGS are validated by comparison with: (i) the variability in the results of inverting multiple independent data sets collected during the experiment; (ii) the results of FGS inversion of synthetic test cases designed to simulate the experiment and data errors; and (iii) the available geophysical ground truth. Comparisons are carried out for a number of different source bandwidths, ranges, and levels of prior information, and indicate that FGS provides reliable and stable uncertainty estimates for the geoacoustic inverse problem.

  12. Volcanic and nonvolcanic rifted margins of the Red Sea and Gulf of Aden: Crustal cooling and margin evolution in Yemen

    NASA Astrophysics Data System (ADS)

    Menzies, Martin; Gallagher, Kerry; Yelland, Andrew; Hurford, Anthony J.

    1997-06-01

    New apatite fission track (AFT) data from the southern Red Sea volcanic margin and the Gulf of Aden nonvolcanic margin provide important constraints on the timing of crustal cooling relative to periods of volcanism and lithosphere extension. The AFT data define several regions of extension immediately adjacent to the Red Sea margin with AFT ages < 25 Ma and track-length distributions consistent with rapid cooling. Elevated Precambrian basement highs on the rift shoulder have AFT ages ≫ 100 Ma and track-length distributions indicative of a complex pre-rift history. An intervening area along the Red Sea and Gulf of Aden margins, and inland along the Balhaf graben (a Jurassic rift), has AFT ages of 25-100 Ma and track-length distributions indicative of rapid cooling. Elevated Precambrian basement highs are juxtaposed against topographically lower extended coastal terranes, with sharp contrasts in AFT ages and track-length distributions pointing to possible Tertiary reactivation of lineaments of Precambrian and Jurassic age. Integration of field observations with the AFT and 40Ar/39Ar data indicates that, on the Red Sea volcanic margin, surface uplift was initiated immediately prior to volcanism and that cooling was synchronous with widespread extension and an apparent hiatus in voluminous volcanic activity.

  13. Tectonic types of marginal and inner seas; their place in the development of the crust

    NASA Astrophysics Data System (ADS)

    Khain, V. E.; Levin, L. E.

    1980-12-01

    Inner and marginal deep seas are of considerable interest not only for their genesis but also as 'micromodels' of oceans. In the latter respect it must be noted that some of them differ essentially from oceans in several parameters: a shorter period of development, a thicker sedimentary cover, less distinct linear magnetic anomalies or their absence, and high heat-flow values and seismic activity over their whole area. Consequently, the analogy with the oceans has certain limitations, as the deep structure of such seas is not homogeneous and they probably vary in genesis. Only a few marginal seas are cut off from the principal areas of the oceans by island arcs formed, most probably, along transform faults. An origin of this type is more or less reliably demonstrated for the Bering Sea. Other types of marginal seas are more numerous. Some of them (such as the Gulf of Aden and the Gulf of California) are embryonic apophyses connected with the oceans. Others (the Tasman and Labrador seas) are atrophied small oceans. Even more numerous is the group of marginal and inner seas that lie in the inner zone of mature or young island arcs. Only a few basins of this group resulted from linear spreading imprinted in a system of magnetic anomalies (the Shikoku-Parece Vela basin); the rest are supposed to have formed by diffuse or polyaxial spreading in recent time, as in Afar. The majority of inner and marginal seas are younger than the recent oceans. They are formed by rifting, oriented crosswise to continental margins of the Atlantic type or along the strike of margins of the Andean type. More ancient basins of marginal and inner seas have been involved in Phanerozoic orogens or, more rarely, became parts of platforms (the Ciscaspian syneclise).

  14. Small-target leak detection for a closed vessel via infrared image sequences

    NASA Astrophysics Data System (ADS)

    Zhao, Ling; Yang, Hongjiu

    2017-03-01

    This paper focuses on a leak diagnosis and localization method based on infrared image sequences. The problems of a high probability of false warnings and of the negative effect of marginal information are addressed in leak detection. An experimental model is established for leak diagnosis and localization on infrared image sequences. Differential background prediction, based on a kernel regression method, is presented to eliminate the negative effect of marginal information from the test vessel. A pipeline filter based on layered voting is designed to reduce the probability of false warnings at leak points. A combined leak diagnosis and localization algorithm is then proposed for infrared image sequences. The effectiveness and potential of the developed techniques are shown through experimental results.

  15. Plasma Oscillation Characterization of NASA's HERMeS Hall Thruster via High Speed Imaging

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Kamhawi, Hani; Haag, Thomas W.

    2016-01-01

    The performance and facility effect characterization tests of NASA's 12.5-kW Hall Effect Rocket with Magnetic Shielding had been completed. As a part of these tests, three plasma oscillation characterization studies were performed to help determine operation settings and quantify margins. The studies included the magnetic field strength variation study, background pressure effect study, and cathode flow fraction study. Separate high-speed videos of the thruster including the cathode and of only the cathode were recorded. Breathing mode at 10-15 kHz and cathode gradient-driven mode at 60-75 kHz were observed. An additional high frequency (40-70 kHz) global oscillation mode with sinusoidal probability distribution function was identified.

  16. Stochastic investigation of wind process for climatic variability identification

    NASA Astrophysics Data System (ADS)

    Deligiannis, Ilias; Tyrogiannis, Vassilis; Daskalou, Olympia; Dimitriadis, Panayiotis; Markonis, Yannis; Iliopoulou, Theano; Koutsoyiannis, Demetris

    2016-04-01

    The wind process is considered one of the hydrometeorological processes that generates and drives the climate dynamics. We use a dataset comprising hourly wind records to identify statistical variability with emphasis on the last period. Specifically, we investigate the occurrence of mean, maximum and minimum values and we estimate statistical properties such as marginal probability distribution function and the type of decay of the climacogram (i.e., mean process variance vs. scale) for various time periods. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
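    A climacogram, the variance of the time-averaged process as a function of averaging scale, takes only a few lines to compute; the synthetic series below stands in for the hourly wind records.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    # Synthetic persistent series standing in for one year of hourly wind speeds
    x = rng.standard_normal(8760).cumsum() * 0.01 + rng.standard_normal(8760)

    def climacogram(x, scales):
        out = []
        for k in scales:
            n = len(x) // k
            means = x[:n * k].reshape(n, k).mean(axis=1)   # block means at scale k
            out.append(means.var(ddof=1))
        return np.array(out)

    scales = np.array([1, 2, 4, 8, 16, 32, 64, 128])
    g = climacogram(x, scales)
    # The log-log slope characterizes the type of decay of variance with scale
    slope = np.polyfit(np.log(scales), np.log(g), 1)[0]
    print("climacogram slope:", round(slope, 2))
    ```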

  17. Stochastic investigation of precipitation process for climatic variability identification

    NASA Astrophysics Data System (ADS)

    Sotiriadou, Alexia; Petsiou, Amalia; Feloni, Elisavet; Kastis, Paris; Iliopoulou, Theano; Markonis, Yannis; Tyralis, Hristos; Dimitriadis, Panayiotis; Koutsoyiannis, Demetris

    2016-04-01

    The precipitation process is important not only to hydrometeorology but also to renewable energy resources management. We use a dataset consisting of daily and hourly records around the globe to identify statistical variability with emphasis on the last period. Specifically, we investigate the occurrence of mean, maximum and minimum values and we estimate statistical properties such as marginal probability distribution function and the type of decay of the climacogram (i.e., mean process variance vs. scale). Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.

  18. Evaluating the Potential of Marginal Land for Cellulosic Feedstock Production and Carbon Sequestration in the United States.

    PubMed

    Emery, Isaac; Mueller, Steffen; Qin, Zhangcai; Dunn, Jennifer B

    2017-01-03

    Land availability for growing feedstocks at scale is a crucial concern for the bioenergy industry. Feedstock production on land not well-suited to growing conventional crops, or marginal land, is often promoted as ideal, although there is a poor understanding of the qualities, quantity, and distribution of marginal lands in the United States. We examine the spatial distribution of land complying with several key marginal land definitions at the United States county, agro-ecological zone, and national scales, and compare the ability of both marginal land and land cover data sets to identify regions for feedstock production. We conclude that very few land parcels comply with multiple definitions of marginal land. Furthermore, to examine possible carbon-flow implications of feedstock production on land that could be considered marginal per multiple definitions, we model soil carbon changes upon transitions from marginal cropland, grassland, and cropland-pastureland to switchgrass production for three marginal land-rich counties. Our findings suggest that total soil organic carbon changes per county are small, and generally positive, and can influence life-cycle greenhouse gas emissions of switchgrass ethanol.

  19. Copula-based assessment of the relationship between flood peaks and flood volumes using information on historical floods by Bayesian Monte Carlo Markov Chain simulations

    NASA Astrophysics Data System (ADS)

    Gaál, Ladislav; Szolgay, Ján; Bacigál, Tomáš; Kohnová, Silvia

    2010-05-01

    Copula-based estimation methods for hydro-climatological extremes have been gaining increasing attention from researchers and practitioners in the last couple of years. Unlike traditional estimation methods, which are based on bivariate cumulative distribution functions (CDFs), copulas are a relatively flexible statistical tool that allows dependencies between two or more variables, such as flood peaks and flood volumes, to be modelled without strict assumptions on the marginal distributions. The dependence structure and the reliability of the joint estimates of hydro-climatological extremes, mainly in the right tail of the joint CDF, depend not only on the particular copula adopted but also on the data available for estimating the marginal distributions of the individual variables. Generally, data samples for frequency modelling have limited temporal extent, which is a considerable drawback of frequency analyses in practice. It is therefore advisable to use statistical methods that improve any part of the copula construction process and so yield more reliable design values of hydrological variables. The scarcity of data, mostly in the extreme tail of the joint CDF, can be bypassed, e.g., by using a considerably larger amount of data simulated by rainfall-runoff analysis, or by including historical information on the variables under study. The latter approach to data extension is used here to make the quantile estimates of the individual marginals of the copula more reliable. In this paper it is proposed to use historical information in the frequency analysis of the marginal distributions in the framework of Bayesian Monte Carlo Markov Chain (MCMC) simulations. Generally, a Bayesian approach allows a straightforward combination of different sources of information on floods (e.g. flood data from systematic measurements and historical flood records) in terms of a product of the corresponding likelihood functions, while the MCMC algorithm is a numerical approach for sampling from the likelihood distributions. Bayesian MCMC methods therefore provide an attractive way to estimate the uncertainty in the parameters and quantile metrics of frequency distributions. The applicability of the method is demonstrated in a case study of the hydroelectric power station Orlík on the Vltava River. This site plays a key role in the flood prevention of Prague, the capital of the Czech Republic. The record length of the available flood data is 126 years, from the period 1877-2002, and the flood event observed in 2002, which caused extensive damage and numerous casualties, is treated as a historical flood. To estimate the joint probabilities of flood peaks and volumes, different copulas are fitted and their goodness of fit is evaluated by bootstrap simulations. Finally, selected quantiles of flood volumes conditioned on given flood peaks are derived and compared with those obtained by the traditional method used in water management practice on the Vltava River.
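    The marginal-distribution step with historical information can be sketched with a small random-walk Metropolis sampler: the systematic record and the historical flood enter the posterior as a product of likelihoods, with the historical event treated as the largest value in h years. The Gumbel marginal, data, priors, and tuning below are all invented placeholders.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    sys_peaks = stats.gumbel_r.rvs(loc=1000, scale=300, size=60, random_state=rng)
    hist_peak = 2900.0          # a historical flood, assumed largest in h years
    h = 126

    def log_post(loc, scale):
        if scale <= 0:
            return -np.inf
        ll = stats.gumbel_r.logpdf(sys_peaks, loc, scale).sum()
        # Historical information: density of the maximum of h years at hist_peak
        F = stats.gumbel_r.cdf(hist_peak, loc, scale)
        ll += stats.gumbel_r.logpdf(hist_peak, loc, scale) + (h - 1) * np.log(F)
        return ll                # flat priors

    chain, cur = [], (1000.0, 300.0)
    lp = log_post(*cur)
    for _ in range(20_000):      # random-walk Metropolis
        prop = (cur[0] + rng.normal(0, 30), cur[1] + rng.normal(0, 15))
        lp_prop = log_post(*prop)
        if np.log(rng.random()) < lp_prop - lp:
            cur, lp = prop, lp_prop
        chain.append(cur)

    loc_s, scale_s = np.array(chain[5000:]).T
    q100 = stats.gumbel_r.ppf(1 - 1 / 100, loc_s, scale_s)   # posterior of 100-yr peak
    print("100-yr flood (5/50/95%):", np.percentile(q100, [5, 50, 95]).round(0))
    ```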

  20. Evolution of consumption distribution and model of wealth distribution in China between 1995 and 2012

    NASA Astrophysics Data System (ADS)

    Gao, Li

    2015-07-01

    We study the evolution of the distribution of consumption of individuals in the majority population in China during the period 1995-2012 and find that its probability density functions (PDFs) obey the rule P_c(x) = K(x − μ) e^{−(x − μ)²/(2σ²)}. We also find (i) that the functional forms of the consumption PDFs and the individual income distribution appear to be identical, (ii) that the peaks of the PDFs of the individual consumption distribution are consistently on the low side of the PDFs of the income distribution, and (iii) that the average marginal propensity to consume (MPC) is large, with mean MPC = 0.77, indicating that in the majority population individual consumption is low and strongly dependent on income. The long right tail of the consumption PDFs indicates that few people in China participate in the high-consumption economy and that consumption inequality is high. After comparing the PDFs of consumption with the PDFs of income, we obtain the PDFs of residual wealth during the period 1995-2012, which exhibit a Gaussian distribution. We use an agent-based kinetic wealth-exchange model (KWEM) to simulate this evolutionary process and find that the Gaussian distribution indicates a strong propensity to save rather than spend. This may be due to the anticipation of large potential outlays such as housing, education, and health care in the context of an inadequate welfare support system.
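    A minimal KWEM sketch with a saving propensity is given below; the agent count, number of trades, and saving parameter are invented, and the point is only that a high saving propensity drives the equilibrium wealth distribution toward a narrow, Gaussian-like shape.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    n_agents, n_steps, s = 10_000, 200_000, 0.8   # s = saving propensity (invented)
    w = np.ones(n_agents)                          # initially equal wealth

    i = rng.integers(0, n_agents, n_steps)
    j = rng.integers(0, n_agents, n_steps)
    eps = rng.random(n_steps)
    for k in range(n_steps):
        a, b = i[k], j[k]
        if a == b:
            continue
        # Two agents pool their non-saved wealth and split it randomly
        pool = (1 - s) * (w[a] + w[b])
        w[a], w[b] = s * w[a] + eps[k] * pool, s * w[b] + (1 - eps[k]) * pool

    print("mean:", w.mean().round(3), " std:", w.std().round(3))
    ```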

  1. Joint radius-length distribution as a measure of anisotropic pore eccentricity: an experimental and analytical framework.

    PubMed

    Benjamini, Dan; Basser, Peter J

    2014-12-07

    In this work, we present an experimental design and analytical framework to measure the nonparametric joint radius-length (R-L) distribution of an ensemble of parallel, finite cylindrical pores, and more generally, the eccentricity distribution of anisotropic pores. Employing a novel 3D double pulsed-field gradient acquisition scheme, we first obtain both the marginal radius and length distributions of a population of cylindrical pores and then use these to constrain and stabilize the estimate of the joint radius-length distribution. Using the marginal distributions as constraints allows the joint R-L distribution to be reconstructed from an underdetermined system (i.e., more variables than equations), which requires a relatively small and feasible number of MR acquisitions. Three simulated representative joint R-L distribution phantoms corrupted by different noise levels were reconstructed to demonstrate the process, using this new framework. As expected, the broader the peaks in the joint distribution, the less stable and more sensitive to noise the estimation of the marginal distributions. Nevertheless, the reconstruction of the joint distribution is remarkably robust to increases in noise level; we attribute this characteristic to the use of the marginal distributions as constraints. Axons are known to exhibit local compartment eccentricity variations upon injury; the extent of the variations depends on the severity of the injury. Nonparametric estimation of the eccentricity distribution of injured axonal tissue is of particular interest since generally one cannot assume a parametric distribution a priori. Reconstructing the eccentricity distribution may provide vital information about changes resulting from injury or that occurred during development.
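    The stabilizing role of the marginals can be imitated in a toy linear-inverse setting: appending (heavily weighted) row-sum and column-sum constraints to an underdetermined non-negative least-squares system. The measurement kernels and the true joint distribution below are synthetic, and this is only an analogy to the MR estimation problem, not the authors' reconstruction code.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(11)
    nr, nl = 8, 8
    r = np.linspace(1, 8, nr)
    l = np.linspace(5, 40, nl)

    # Synthetic "true" joint radius-length histogram (flattened row-major)
    true = np.outer(np.exp(-(r - 4)**2 / 2), np.exp(-(l - 20)**2 / 50))
    true /= true.sum()

    n_meas = 20                               # fewer measurements than 64 unknowns
    A = rng.random((n_meas, nr * nl))         # stand-in measurement kernels
    y = A @ true.ravel()

    # Row-sum / column-sum operators: joint histogram -> marginals
    Mr = np.kron(np.eye(nr), np.ones(nl))     # sums over length -> radius marginal
    Ml = np.kron(np.ones(nr), np.eye(nl))     # sums over radius -> length marginal

    W = 10.0                                  # weight making the marginals near-hard constraints
    A_aug = np.vstack([A, W * Mr, W * Ml])
    y_aug = np.concatenate([y, W * true.sum(axis=1), W * true.sum(axis=0)])

    est, _ = nnls(A_aug, y_aug)               # non-negative joint estimate
    print("L1 reconstruction error:", round(np.abs(est - true.ravel()).sum(), 4))
    ```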

  2. Evaluating Oilseed Biofuel Production Feasibility in California’s San Joaquin Valley Using Geophysical and Remote Sensing Techniques

    PubMed Central

    Corwin, Dennis L.; Yemoto, Kevin; Clary, Wes; Banuelos, Gary; Skaggs, Todd H.; Lesch, Scott M.

    2017-01-01

    Though more costly than petroleum-based fuels and a minor component of overall military fuel sources, biofuels are nonetheless strategically valuable to the military because of an intentional reliance on multiple, reliable, secure fuel sources. A significant reduction in oilseed biofuel cost occurs when the feedstock is grown on the marginally productive saline-sodic soils plentiful in California's San Joaquin Valley (SJV). The objective is to evaluate the feasibility of oilseed production on marginal soils in the SJV to support a 115 ML yr−1 biofuel conversion facility. The feasibility evaluation involves: (1) development of an Ida Gold mustard oilseed yield model for marginal soils; (2) identification of marginally productive soils; (3) development of a spatial database of edaphic factors influencing oilseed yield; and (4) Monte Carlo simulations of potential biofuel production on marginally productive SJV soils. The model indicates that oilseed yield is related to boron, salinity, leaching fraction, and water content at field capacity. Monte Carlo simulations for the entire SJV fit a shifted gamma probability density function, Q = 68.986 + Gamma(6.134, 5.285), where Q is biofuel production in ML yr−1. The shifted gamma cumulative density function indicates a 0.15-0.17 probability of meeting the target biofuel-production level of 115 ML yr−1, making adequate biofuel production unlikely. PMID:29036925
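    The abstract's fitted production distribution can be evaluated directly, assuming the second Gamma parameter is a scale (an assumption consistent with the quoted 0.15-0.17 probability):

    ```python
    from scipy import stats

    # Q = 68.986 + Gamma(shape=6.134, scale=5.285), in ML/yr; the scale
    # interpretation of the second parameter is an assumption.
    shift, shape, scale = 68.986, 6.134, 5.285
    p_meet = stats.gamma.sf(115 - shift, shape, scale=scale)   # P(Q >= 115)
    print("P(Q >= 115 ML/yr) =", round(p_meet, 3))             # ~0.15-0.17 per the abstract
    ```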

  3. Contrasting upper-mantle shear wave anisotropy across the transpressive Queen Charlotte margin

    NASA Astrophysics Data System (ADS)

    Cao, Lingmin; Kao, Honn; Wang, Kelin

    2017-10-01

    In order to investigate upper mantle and crustal anisotropy along the transpressive Queen Charlotte margin between the Pacific (PA) and North America (NA) plates, we conducted shear wave splitting analyses using 17 seismic stations in and around the island of Haida Gwaii, Canada. Despite the limited station coverage at present, our reconnaissance study does reveal a systematic pattern of mantle anisotropy in this region. Fast directions derived from teleseismic SKS-phase splitting are mostly margin-parallel (NNW-SSE) near the plate boundary but transition to predominantly E-W-trending farther away. We propose that the former is associated with the absolute motion of PA, and the latter reflects a transition from this direction to that of the absolute motion of NA. The broad width of the zone of transition from the PA to NA direction is probably caused by the very obliquely subducting PA slab that travels primarily in the margin-parallel direction. Anisotropy of Haida Gwaii based on local earthquakes features a fast direction that cannot be explained with regional stresses and is probably associated with local structural fabric within the overriding crust. Our preliminary shear wave splitting measurements and working hypotheses based on them will serve to guide more refined future studies to unravel details of the geometry and kinematics of the subducted PA slab, as well as the viscous coupling between the slab and upper mantle in other transpressive margins.

  4. Changes in Arctic Sea Ice Floe Size Distribution in the Marginal Ice Zone in a Thickness and Floe Size Distribution Model

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Stern, H. L., III; Hwang, P. B.; Schweiger, A. J. B.; Stark, M.; Steele, M.

    2015-12-01

    To better describe the state of sea ice in the marginal ice zone (MIZ), with floes of varying thicknesses and sizes, both an ice thickness distribution (ITD) and a floe size distribution (FSD) are needed. We have developed an FSD theory [Zhang et al., 2015] that is coupled to the ITD theory of Thorndike et al. [1975] in order to simulate the joint evolution of the FSD and ITD explicitly. The FSD theory includes an FSD function and an FSD conservation equation in parallel with the ITD equation. The FSD equation accounts for changes in the FSD due to ice advection, thermodynamic growth, and lateral melting. It also includes changes in the FSD due to the mechanical redistribution of floe size by ice opening, ridging and, particularly, ice fragmentation induced by stochastic ocean surface waves. The floe size redistribution due to ice fragmentation is based on the assumption that wave-induced breakup is a random process, such that when an ice floe is broken, floes of any smaller size have an equal opportunity to form, being neither favored nor excluded. It is also based on the assumption that larger floes are easier to break, because they are subject to larger flexure-induced stresses and strains than smaller floes, which more easily ride the waves with little bending; larger floes also have larger areal coverage and therefore a higher probability of breaking. These assumptions, with the corresponding formulations, ensure that the simulated FSD follows a power law, as observed by satellites and airborne surveys. The FSD theory has been tested in the Pan-arctic Ice/Ocean Modeling and Assimilation System (PIOMAS). The existing PIOMAS has 12 categories each for ice thickness, ice enthalpy, and snow depth. With the implementation of the FSD theory, PIOMAS is able to represent 12 categories of floe sizes, ranging from 0.1 m to ~3000 m. The simulated 12-category FSD is found to agree reasonably well with FSDs derived from SAR and MODIS images. In this study, we examine PIOMAS-estimated variability and changes in the Arctic FSD over the period 1979-present. Thorndike, A. S., D. A. Rothrock, G. A. Maykut, and R. Colony (1975), The thickness distribution of sea ice, J. Geophys. Res., 80. Zhang, J., A. Schweiger, M. Steele, and H. Stern (2015), Sea ice floe size distribution in the marginal ice zone: Theory and numerical experiments, J. Geophys. Res., 120.
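    The two breakup assumptions (breakage probability increasing with floe area; a uniformly random split point) can be exercised in a toy fragmentation experiment; the numbers below are arbitrary, and the experiment only illustrates qualitatively how such rules generate a broad, heavy-tailed size distribution, not the FSD theory's actual redistribution equations.

    ```python
    import numpy as np

    rng = np.random.default_rng(12)
    floes = [3000.0]                           # start with one large floe (m)
    for _ in range(10_000):
        sizes = np.array(floes)
        p = sizes**2 / (sizes**2).sum()        # bigger floes break more readily
        k = rng.choice(len(floes), p=p)
        s = floes.pop(k)
        if s < 0.2:                            # stop breaking near the 0.1 m floor
            floes.append(s); continue
        f = rng.uniform(0.05, 0.95)            # uniformly random split point
        floes += [f * s, (1 - f) * s]

    sizes = np.array(floes)
    hist, edges = np.histogram(sizes, bins=np.logspace(-1, 3.5, 20))
    centers = np.sqrt(edges[1:] * edges[:-1])
    mask = hist > 0
    slope = np.polyfit(np.log(centers[mask]), np.log(hist[mask]), 1)[0]
    print("log-log slope of the number distribution:", round(slope, 2))
    ```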

  5. Non-Gaussian probabilistic MEG source localisation based on kernel density estimation

    PubMed Central

    Mohseni, Hamid R.; Kringelbach, Morten L.; Woolrich, Mark W.; Baker, Adam; Aziz, Tipu Z.; Probert-Smith, Penny

    2014-01-01

    There is strong evidence to suggest that data recorded from magnetoencephalography (MEG) follows a non-Gaussian distribution. However, existing standard methods for source localisation model the data using only second order statistics, and therefore use the inherent assumption of a Gaussian distribution. In this paper, we present a new general method for non-Gaussian source estimation of stationary signals for localising brain activity from MEG data. By providing a Bayesian formulation for MEG source localisation, we show that the source probability density function (pdf), which is not necessarily Gaussian, can be estimated using multivariate kernel density estimators. In the case of Gaussian data, the solution of the method is equivalent to that of widely used linearly constrained minimum variance (LCMV) beamformer. The method is also extended to handle data with highly correlated sources using the marginal distribution of the estimated joint distribution, which, in the case of Gaussian measurements, corresponds to the null-beamformer. The proposed non-Gaussian source localisation approach is shown to give better spatial estimates than the LCMV beamformer, both in simulations incorporating non-Gaussian signals, and in real MEG measurements of auditory and visual evoked responses, where the highly correlated sources are known to be difficult to estimate. PMID:24055702
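    The non-Gaussian ingredient, estimating a pdf by kernel density estimation where a Gaussian (second-order) model would fail, can be seen in one dimension; the bimodal sample below is synthetic and stands in for beamformer source estimates.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde, norm

    rng = np.random.default_rng(13)
    # Synthetic bimodal "source amplitude" sample
    data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

    kde = gaussian_kde(data)               # kernel density estimate of the pdf
    grid = np.linspace(-4, 4, 201)
    pdf_kde = kde(grid)

    # A Gaussian fit from second-order statistics misses the bimodality entirely
    pdf_gauss = norm.pdf(grid, data.mean(), data.std())

    peaks = grid[pdf_kde > 0.9 * pdf_kde.max()]
    print("KDE mass concentrates near:", peaks.min().round(2), "and", peaks.max().round(2))
    print("Gaussian fit peaks near:", grid[np.argmax(pdf_gauss)].round(2))
    ```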

  6. Essays on refining markets and environmental policy

    NASA Astrophysics Data System (ADS)

    Oladunjoye, Olusegun Akintunde

    This thesis comprises three essays. The first two essays examine empirically the relationship between the crude oil price and wholesale gasoline prices in the U.S. petroleum refining industry, while the third essay determines the optimal combination of an emissions tax and an environmental research and development (ER&D) subsidy when firms organize ER&D either competitively or as a research joint venture (RJV). In the first essay, we estimate an error correction model to determine the effects of market structure on the speed of adjustment of wholesale gasoline prices to crude oil price changes. The results indicate that market structure does not have a strong effect on the dynamics of price adjustment in the three regional markets examined. In the second essay, we allow for inventories to affect the relationship between crude oil and wholesale gasoline prices by allowing them to affect the probability of regime change in a Markov-switching model of the refining margin. We find that low gasoline inventory increases the probability of switching from the low-margin regime to the high-margin regime and also increases the probability of staying in the high-margin regime. This is consistent with the predictions of the competitive storage theory. In the third essay, we extend industrial organization R&D theory to the determination of optimal environmental policies. We find that RJVs are socially desirable: in comparison with competitive ER&D, we suggest that regulators should encourage RJVs with a lower emissions tax and a higher subsidy, as these will lead to the coordination of ER&D activities and eliminate duplication of effort while firms internalize their technological spillover externality.

  7. Area-Specific Marginal Costing for Electric Utilities: a Case Study of Transmission and Distribution Costs

    NASA Astrophysics Data System (ADS)

    Orans, Ren

    1990-10-01

    Existing procedures used to develop marginal costs for electric utilities were not designed for applications in an increasingly competitive market for electric power. The utility's value of receiving power, or the cost of selling power, depends on the exact location of the buyer or seller, the magnitude of the power, and the period of time over which the power is used. Yet no electric utility in the United States has disaggregate marginal costs that reflect differences in costs due to the time, size, or location of the load associated with its power or energy transactions. The existing marginal costing methods used by electric utilities were developed in response to the Public Utilities Regulatory Policy Act (PURPA) in 1978. The 'ratemaking standards' (Title 1) established by PURPA were primarily concerned with the appropriate segmentation of total revenues among the various classes of service, the design of time-of-use rating periods, and the promotion of efficient long-term resource planning. By design, the methods were very simple and inexpensive to implement. Now, more than a decade later, the costing issues facing electric utilities are becoming increasingly complex, and in many cases the benefits of developing more specific marginal costs will outweigh the costs of developing this information. This research develops a framework for estimating total marginal costs that vary by the size, timing, and location of changes in loads within an electric distribution system. To complement the existing work at the Electric Power Research Institute (EPRI) and Pacific Gas and Electric Company (PG&E) on estimating disaggregate generation and transmission capacity costs, this dissertation focuses on the estimation of distribution capacity costs. While the costing procedure is suitable for the estimation of total (generation, transmission, and distribution) marginal costs, the empirical work focuses on the geographic disaggregation of marginal costs related to electric utility distribution investment. The study makes use of data from an actual distribution planning area, located within PG&E's service territory, to demonstrate the important characteristics of this new costing approach. The most significant result of this empirical work is that geographic differences in the cost of capacity in distribution systems can be as much as four times larger than the current system-average utility estimates. Furthermore, lumpy capital investment patterns can lead to significant cost differences over time.

  8. Coupling of Waves, Turbulence and Thermodynamics Across the Marginal Ice Zone

    DTIC Science & Technology

    2015-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. [Truncated record; the recoverable fragments are the report title "Coupling of Waves, Turbulence and Thermodynamics across ... developing Thermodynamically Forced Marginal Ice Zone" (submitted to JGR) and related theses: Heiles, A. S., NPS thesis, Sep. 2014; Schmidt, B. K., NPS thesis, March 2012; Shaw ...]

  9. The Meandering Margin of the Meteorological Moist Tropics

    NASA Astrophysics Data System (ADS)

    Mapes, Brian E.; Chung, Eui Seok; Hannah, Walter M.; Masunaga, Hirohiko; Wimmers, Anthony J.; Velden, Christopher S.

    2018-01-01

    Bimodally distributed column water vapor (CWV) indicates a well-defined moist regime in the Tropics, above a margin value near 48 kg m^-2 in the current climate (about 80% of column saturation). Maps reveal this margin as a meandering, sinuous synoptic contour bounding broad plateaus of the moist regime. Within these plateaus, convective storms of distinctly smaller convective and mesoscales occur sporadically. Satellite data composites across the most poleward margin reveal its sharpness, despite the crude averaging: precipitation doubles within 100 km, marked by both enhancement and deepening of cloudiness. Transported patches and filaments of the moist regime cause consequential precipitation events within and beyond the Tropics. Distinguishing synoptic flows that cross the margin from flows that move the margin is made possible by a novel satellite-based Lagrangian CWV tendency estimate. Climate models do not reliably reproduce the observed bimodal distribution, so studying the moist mode's maintenance processes and the margin-zone air mass transformations, guided by the Lagrangian tendency product, might importantly constrain model moist process treatments.

  10. Provenance of Carboniferous sedimentary rocks in the northern margin of Dabie Mountains, central China and the tectonic significance: constraints from trace elements, mineral chemistry and SHRIMP dating of zircons

    NASA Astrophysics Data System (ADS)

    Li, Renwei; Li, Shuangying; Jin, Fuquan; Wan, Yusheng; Zhang, Shukun

    2004-04-01

    A suite of slightly metamorphosed Carboniferous sedimentary strata occurs in the northern margin of the Dabie Mountains, central China. It consists, in ascending order, of the upper Huayuanqiang Formation (C1), the Yangshan Formation (C1), the Daorenchong Formation (C1-2), the most widely distributed Huyoufang Formation (C2) and the Yangxiaozhuang Formation (C2). The provenance of the Carboniferous sedimentary rocks is constrained by the integration of trace elements, detrital mineral chemistry and sensitive high resolution ion microprobe (SHRIMP) dating of detrital zircons, which can help to understand the connection between the provenance and the Paleozoic tectonic evolution of the Qinling-Dabie Orogen. The trace element compositions indicate that the source terrain was probably a continental island arc. Detrital tourmalines were mainly derived from aluminous and Al-poor metapelites and metapsammites, and some are sourced from Li-poor granitoids, pegmatites and aplites. Detrital garnets, found only in the uppermost Huyoufang Formation, are almandine and Mn-almandine garnets, indicating probable sources mainly from garnetiferous schists, and partly from granitoid rocks. The detrital white K-micas are muscovitic in the Huayuanqiang, Daorenchong and Huyoufang Formations, and phengitic, with Si contents (p.f.u.) from 3.20 up to a maximum of 3.47-3.53, in the uppermost Huyoufang and the Yangxiaozhuang Formations, indicating a meta-sedimentary source. Major components in the detrital zircon age structure for the Huyoufang Formation range from 506 to 363 Ma, centering on ~400 and ~480 Ma, which is characteristic of the Qinling and Erlangping Groups in the Qinling and Tongbai Mountains, central China. Evidently, the major source of the Carboniferous sedimentary rocks in the northern margin of the Dabie Mountains was the southern margin of the Sino-Korean Craton represented by the Qinling and Erlangping Groups. The source area was an island-arc system during the Early Paleozoic that collided with the Sino-Korean plate towards the end of the Early Paleozoic or during the Devonian. A prominent feature in the detrital zircon age structure of the Huyoufang Formation is the Neoproterozoic detritus, which could be derived only from the Yangtze Craton. A reasonable interpretation of the two distinct source materials for the Huyoufang Formation is that the two plates were juxtaposed through collision before the Late Carboniferous.

  11. Carboniferous paleogeographic, phytogeographic, and paleoclimatic reconstructions

    USGS Publications Warehouse

    Rowley, D.B.; Raymond, A.; Parrish, Judith T.; Lottes, A.L.; Scotese, C.R.; Ziegler, A.M.

    1985-01-01

    Two revised paleogeographic reconstructions of the Visean and Westphalian C-D stages are presented based on recent paleomagnetic, phytogeographic, stratigraphic, and tectonic data. These data change the positions of some continental blocks, and allow the definition of several new ones. The most important modifications that have been incorporated in these reconstructions are: (1) a proposed isthmus linking North America and Siberia across the Bering Strait; and (2) the separation of China and Southeast Asia into six major blocks, including the South China, North China, Shan Thai-Malaya, Indochina, Qangtang, and Tarim blocks. Evidence is presented that suggests that at least the South China, Shan Thai-Malaya, and Qangtang blocks were derived from the northern margin of Gondwana. Multivariate statistical analysis of phytogeographic data from the middle and late Paleozoic allows definition of a number of different phytogeographic units for four time intervals: (1) the Early Devonian, (2) Tournaisian-early Visean, (3) Visean, and (4) late Visean-early Namurian A. Pre-late Visean-early Namurian A floral assemblages from South China show affinities with northern Gondwana floras, suggesting a southerly position and providing additional support for our reconstruction of South China against the northern margin of Gondwana. There is a marked decrease in the diversity of phytogeographic units in the Namurian and younger Carboniferous. This correlates closely with the time of assembly of most of Pangaea. The general pattern of Carboniferous phytogeographic units corresponds well with the global distribution of continents shown on our paleogeographic reconstructions. In addition, we have constructed paleoclimatic maps for the two Carboniferous time intervals. These maps stress the distribution of rainfall, as this should be strongly correlated with the floras. There is a marked change in the rainfall patterns between the Visean and Westphalian C-D. This change corresponds with the closing of the Appalachian-Ouachita ocean between Laurussia and Gondwana, and reflects the removal of a low-latitude moisture source that probably gave rise to monsoonal conditions along the northern margin of Gondwana in the Visean and earlier times. As well, the presence of a substantial heat source at high elevation in the Late Carboniferous significantly influenced the distribution of climatic belts. © 1986.

  12. Probability modeling of the number of positive cores in a prostate cancer biopsy session, with applications.

    PubMed

    Serfling, Robert; Ogola, Gerald

    2016-02-10

    Among men, prostate cancer (CaP) is the most common newly diagnosed cancer and the second leading cause of death from cancer. A major issue of very large scale is avoiding both over-treatment and under-treatment of CaP cases. The central challenge is deciding clinical significance or insignificance when the CaP biopsy results are positive but only marginally so. A related concern is deciding how to increase the number of biopsy cores for larger prostates. As a foundation for improved choice of number of cores and improved interpretation of biopsy results, we develop a probability model for the number of positive cores found in a biopsy, given the total number of cores, the volumes of the tumor nodules, and - very importantly - the prostate volume. Also, three applications are carried out: guidelines for the number of cores as a function of prostate volume, decision rules for insignificant versus significant CaP using number of positive cores, and, using prior distributions on total tumor size, Bayesian posterior probabilities for insignificant CaP and posterior median CaP. The model-based results have generality of application, take prostate volume into account, and provide attractive tradeoffs of specificity versus sensitivity. Copyright © 2015 John Wiley & Sons, Ltd.
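
    As a rough illustration of this kind of model, the sketch below treats each core as independently sampling a uniformly random location, so a core is positive with probability equal to the tumor-to-prostate volume ratio and the count of positive cores is binomial. This is a simplified stand-in for the authors' model; the core count and volumes are hypothetical.

        import numpy as np
        from scipy import stats

        def positive_core_pmf(n_cores, tumor_vol_cc, prostate_vol_cc):
            """PMF of the number of positive cores under the simplifying
            assumption that each core independently hits tumor with
            p = tumor volume / prostate volume (uniform random sampling)."""
            p = min(tumor_vol_cc / prostate_vol_cc, 1.0)
            k = np.arange(n_cores + 1)
            return k, stats.binom.pmf(k, n_cores, p)

        # Example: a 12-core session, 0.5 cc of tumor in a 40 cc prostate.
        k, pmf = positive_core_pmf(12, 0.5, 40.0)
        print(dict(zip(k.tolist(), pmf.round(4).tolist())))
        # A larger prostate dilutes p, which motivates taking more cores to
        # keep the same chance of detecting a tumor of a given size.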

  13. Misfit and fracture load of implant-supported monolithic crowns in zirconia-reinforced lithium silicate.

    PubMed

    Gomes, Rafael Soares; Souza, Caroline Mathias Carvalho de; Bergamo, Edmara Tatiely Pedroso; Bordin, Dimorvan; Del Bel Cury, Altair Antoninha

    2017-01-01

    In this study, the marginal and internal misfit and the fracture load, with and without thermal-mechanical aging (TMA), of monolithic zirconia-reinforced lithium silicate (ZLS) and lithium disilicate (LDS) crowns were evaluated. Crowns were milled using a computer-aided design/computer-aided manufacturing system. Marginal gaps (MGs), absolute marginal discrepancy (AMD), axial gaps, and occlusal gaps were measured by X-ray microtomography (n=8). For fracture load testing, crowns were cemented onto a universal abutment and divided into four groups: ZLS without TMA, ZLS with TMA, LDS without TMA, and LDS with TMA (n=10). TMA groups were subjected to 10,000 thermal cycles (5-55°C) and 1,000,000 mechanical cycles (200 N, 3.8 Hz). All groups were subjected to compressive strength testing in a universal testing machine at a crosshead speed of 1 mm/min until failure. Student's t-test was used to examine misfit, two-way analysis of variance was used to analyze fracture load, and Pearson's correlation coefficients for misfit and fracture load were calculated (α=0.05). The materials were analyzed according to the Weibull distribution, with 95% confidence intervals. Average MG (p<0.001) and AMD (p=0.003) values were greater in ZLS than in LDS crowns. TMA did not affect the fracture load of either material. However, fracture loads of ZLS crowns were lower than those of LDS crowns (p<0.001). Fracture load was moderately correlated with MG (r=-0.553) and AMD (r=-0.497). ZLS with TMA was the least reliable, according to Weibull probability. Within the limitations of this study, ZLS crowns had lower fracture load values and greater marginal misfit than did LDS crowns, although these values were within acceptable limits.

  14. Heterogeneity-induced large deviations in activity and (in some cases) entropy production

    NASA Astrophysics Data System (ADS)

    Gingrich, Todd R.; Vaikuntanathan, Suriyanarayanan; Geissler, Phillip L.

    2014-10-01

    We solve a simple model that supports a dynamic phase transition and show conditions for the existence of the transition. Using methods of large deviation theory we analytically compute the probability distribution for activity and entropy production rates of the trajectories on a large ring with a single heterogeneous link. The corresponding joint rate function demonstrates two dynamical phases—one localized and the other delocalized, but the marginal rate functions do not always exhibit the underlying transition. Symmetries in dynamic order parameters influence the observation of a transition, such that distributions for certain dynamic order parameters need not reveal an underlying dynamical bistability. Solution of our model system furthermore yields the form of the effective Markov transition matrices that generate dynamics in which the two dynamical phases are at coexistence. We discuss the implications of the transition for the response of bacterial cells to antibiotic treatment, arguing that even simple models of a cell cycle lacking an explicit bistability in configuration space will exhibit a bistability of dynamical phases.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mascle, J.; Blarez, E.

    The authors present a marine study of the eastern Ivory Coast-Ghana continental margin, which they consider one of the most spectacular extinct transform margins. This margin was created during Early-Lower Cretaceous time and has not been subjected to any major geodynamic reactivation since its formation. Based on this example, they propose four main successive stages in the evolution of a transform margin. Shearing contact is first active between two probably thick continental crusts and then between progressively thinning continental crusts. This leads to the creation of specific geological structures such as pull-apart grabens, elongated fault lineaments, major fault scarps, shear folds, and marginal ridges. After the final continental breakup, a hot center (the mid-oceanic ridge axis) progressively drifts along the newly created margin. The contact between two lithospheres of different nature should necessarily induce, by thermal exchange, vertical crustal readjustments. Finally, the transform margin remains directly adjacent to a hot but cooling oceanic lithosphere; its subsidence behavior should then progressively become comparable to the thermal subsidence of classic rifted margins.

  16. L1-norm locally linear representation regularization multi-source adaptation learning.

    PubMed

    Tao, Jianwen; Wen, Shiting; Hu, Wenjun

    2015-09-01

    In most supervised domain adaptation learning (DAL) tasks, one has access only to a small number of labeled examples from the target domain. Therefore the success of supervised DAL in this "small sample" regime requires effective utilization of the large amounts of unlabeled data to extract information that is useful for generalization. Toward this end, we use the geometric intuition of the manifold assumption to extend the established frameworks in existing model-based DAL methods for function learning by incorporating additional information about the geometric structure of the target marginal distribution. We would like to ensure that the solution is smooth with respect to both the ambient space and the target marginal distribution. In doing this, we propose a novel L1-norm locally linear representation regularization multi-source adaptation learning framework that exploits the geometry of the probability distribution and comprises two techniques. First, an L1-norm locally linear representation method is presented for robust graph construction by replacing the L2-norm reconstruction measure in LLE with an L1-norm one, termed L1-LLR for short. Second, for robust graph regularization, we replace the traditional graph Laplacian regularization with our new L1-LLR graph Laplacian regularization and thereby construct a new graph-based semi-supervised learning framework with a multi-source adaptation constraint, coined the L1-MSAL method. Moreover, to deal with nonlinear learning problems, we also generalize the L1-MSAL method by mapping the input data points from the input space to a high-dimensional reproducing kernel Hilbert space (RKHS) via a nonlinear mapping. Promising experimental results have been obtained on several real-world datasets involving faces, visual video and objects. Copyright © 2015 Elsevier Ltd. All rights reserved.
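
    The L1 reconstruction step can be approximated with an off-the-shelf lasso: reconstruct each sample from its nearest neighbors under an L1 penalty and use the coefficients as graph weights. This is a minimal sketch of the idea, not the paper's exact formulation; the neighborhood size, penalty weight, and non-negativity constraint are illustrative choices.

        import numpy as np
        from sklearn.linear_model import Lasso
        from sklearn.neighbors import NearestNeighbors

        def l1_llr_graph(X, k=10, alpha=0.01):
            """Sparse affinity graph: reconstruct each point from its k nearest
            neighbors under an L1 penalty (a stand-in for L1-LLR)."""
            n = X.shape[0]
            W = np.zeros((n, n))
            nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
            _, idx = nbrs.kneighbors(X)
            for i in range(n):
                neigh = idx[i, 1:]                 # exclude the point itself
                model = Lasso(alpha=alpha, positive=True, max_iter=5000)
                model.fit(X[neigh].T, X[i])        # columns = neighbors
                W[i, neigh] = model.coef_
            return 0.5 * (W + W.T)                 # symmetrize

        X = np.random.default_rng(1).normal(size=(100, 5))
        W = l1_llr_graph(X)
        L = np.diag(W.sum(1)) - W                  # graph Laplacian for regularization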

  17. Evaluating the Potential of Marginal Land for Cellulosic Feedstock Production and Carbon Sequestration in the United States

    DOE PAGES

    Emery, Isaac; Mueller, Steffen; Qin, Zhangcai; ...

    2016-12-01

    Land availability for growing feedstocks at scale is a crucial concern for the bioenergy industry. Feedstock production on land not well-suited to growing conventional crops, or marginal land, is often promoted as ideal, although there is a poor understanding of the qualities, quantity, and distribution of marginal lands in the United States. In this paper, we examine the spatial distribution of land complying with several key marginal land definitions at the United States county, agro-ecological zone, and national scales, and compare the ability of both marginal land and land cover data sets to identify regions for feedstock production. We conclude that very few land parcels comply with multiple definitions of marginal land. Furthermore, to examine possible carbon-flow implications of feedstock production on land that could be considered marginal per multiple definitions, we model soil carbon changes upon transitions from marginal cropland, grassland, and cropland–pastureland to switchgrass production for three marginal land-rich counties. Finally, our findings suggest that total soil organic carbon changes per county are small, and generally positive, and can influence life-cycle greenhouse gas emissions of switchgrass ethanol.

  18. Evaluating the Potential of Marginal Land for Cellulosic Feedstock Production and Carbon Sequestration in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Isaac; Mueller, Steffen; Qin, Zhangcai

    Land availability for growing feedstocks at scale is a crucial concern for the bioenergy industry. Feedstock production on land not well-suited to growing conventional crops, or marginal land, is often promoted as ideal, although there is a poor understanding of the qualities, quantity, and distribution of marginal lands in the United States. In this paper, we examine the spatial distribution of land complying with several key marginal land definitions at the United States county, agro-ecological zone, and national scales, and compare the ability of both marginal land and land cover data sets to identify regions for feedstock production. We conclude that very few land parcels comply with multiple definitions of marginal land. Furthermore, to examine possible carbon-flow implications of feedstock production on land that could be considered marginal per multiple definitions, we model soil carbon changes upon transitions from marginal cropland, grassland, and cropland–pastureland to switchgrass production for three marginal land-rich counties. Finally, our findings suggest that total soil organic carbon changes per county are small, and generally positive, and can influence life-cycle greenhouse gas emissions of switchgrass ethanol.

  19. Bayesian Networks for enterprise risk assessment

    NASA Astrophysics Data System (ADS)

    Bonafede, C. E.; Giudici, P.

    2007-08-01

    According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. Risk, in general, is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative) are used. Moreover, qualitative data must be converted into numerical values or bounds to be used in the model. In enterprise risk assessment the considered risks are, for instance, strategic, operational, legal and reputational, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. Bayesian networks (BNs) are a useful tool for integrating different sources of information and, in particular, for studying the joint distribution of risks using data collected from experts. In this paper we show a possible approach for building a BN in the particular case in which only the prior probabilities of node states and the marginal correlations between nodes are available, and the variables have only two states.
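
    For two binary nodes, the setting described (marginals plus a correlation) pins down the joint distribution in closed form, since for Bernoulli variables P(X=1, Y=1) = pq + ρ√(p(1-p)q(1-q)). A minimal sketch, with made-up expert inputs:

        import numpy as np

        def joint_from_marginals(p, q, rho):
            """Joint distribution of two binary risk nodes from their marginal
            probabilities p = P(X=1), q = P(Y=1) and correlation rho."""
            p11 = p * q + rho * np.sqrt(p * (1 - p) * q * (1 - q))
            joint = np.array([[1 - p - q + p11, q - p11],   # rows: X=0,1; cols: Y=0,1
                              [p - p11, p11]])
            assert (joint >= 0).all(), "inconsistent (p, q, rho)"
            return joint

        # Hypothetical expert inputs: P(operational failure) = 0.10,
        # P(legal event) = 0.05, correlation 0.3.
        J = joint_from_marginals(0.10, 0.05, 0.3)
        print(J)
        print("P(legal | op. failure) =", J[1, 1] / J[1].sum())  # a CPT entry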

  20. The influence of coordinated defects on inhomogeneous broadening in cubic lattices

    NASA Astrophysics Data System (ADS)

    Matheson, P. L.; Sullivan, Francis P.; Evenson, William E.

    2016-12-01

    The joint probability distribution function (JPDF) of electric field gradient (EFG) tensor components in cubic materials is dominated by coordinated pairings of defects in shells near probe nuclei. The contributions from these inner shell combinations and their surrounding structures contain the essential physics that determines the PAC-relevant quantities derived from them. The JPDF can be used to predict the nature of inhomogeneous broadening (IHB) in perturbed angular correlation (PAC) experiments by modeling the G2 spectrum and finding expectation values for Vzz and η. The ease with which this can be done depends upon the representation of the JPDF. Expanding on an earlier work by Czjzek et al. (Hyperfine Interact. 14, 189-194, 1983), Evenson et al. (Hyperfine Interact. 237, 119, 2016) provide a set of coordinates constructed from the EFG tensor invariants they named W1 and W2. Using this parameterization, the JPDF in cubic structures was constructed using a point charge model in which a single trapped defect (TD) is the nearest neighbor to a probe nucleus. Individual defects on nearby lattice sites pair with the TD to provide a locus of points in the W1-W2 plane around which an amorphous-like distribution of probability density grows. Interestingly, however, marginal, separable PDFs appear adequate to model IHB-relevant cases. We present cases from simulations in cubic materials illustrating the importance of these near-shell coordinations.

  1. History of gas production from Devonian shale in eastern Kentucky

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemper, J.R.; Frankie, W.T.; Smath, R.A.

    More than 10,500 wells that penetrate the Devonian shale have been compiled into a data base covering a 25-county area of eastern Kentucky. This area includes the Big Sandy gas field, the largest in the Appalachian basin, and marginal areas to the southwest, west, and northwest. The development of the Big Sandy gas field began in the 1920s in western Floyd County, Kentucky, and moved concentrically outward through 1970. Since 1971, the trend has been for infill and marginal drilling, and fewer companies have been involved. The resulting outline of the Big Sandy gas field covers most of Letcher, Knott, Floyd, Martin, and Pike Counties in Kentucky; it also extends into West Virginia. Outside the Big Sandy gas field, exploration for gas has been inconsistent, with a much higher ratio of dry holes. The results of this study, which was partially supported by the Gas Research Institute (GRI), indicate that certain geologic factors, such as fracture size and spacing, probably determine the distribution of commercial gas reserves as well as the outline of the Big Sandy gas field. Future Big Sandy infill and extension drilling will need to be based on an understanding of these factors.

  2. Vertical distribution of major, minor and trace elements in sediments from mud volcanoes of the Gulf of Cadiz: evidence of Cd, As and Ba fronts in upper layers

    NASA Astrophysics Data System (ADS)

    Carvalho, Lina; Monteiro, Rui; Figueira, Paula; Mieiro, Cláudia; Almeida, Joana; Pereira, Eduarda; Magalhães, Vítor; Pinheiro, Luís; Vale, Carlos

    2018-01-01

    Mud volcanoes are features of coastal margins where anaerobic oxidation of methane triggers geochemical signals. Elemental composition, percentage of fine particles and loss on ignition were determined in sediment layers of eleven gravity cores retrieved from four mud volcanoes (Sagres, Bonjardim, Soloviev and Porto) and three undefined structures located on the deep Portuguese margin of the Gulf of Cadiz. Calcium was positively correlated with Sr and inversely with Al, as well as with most of the trace elements. Vertical profiles of Ba, Cd and As concentrations, and of their ratios to Al, in Porto and Soloviev showed pronounced enhancements in the top 50 cm. Sub-surface enhancements were less pronounced in the other mud volcanoes and were absent in sediments from the structures. These profiles were interpreted as diagenetic enrichments related to the anaerobic oxidation of methane originating from upward methane-rich fluxes. The observed barium fronts were most likely caused by the presence of barite, which precipitated at the sulphate-methane transition zone. The Cd and As enrichments probably resulted from successive dissolution/precipitation of sulphides in response to vertical shifts of redox boundaries.

  3. Improving experimental phases for strong reflections prior to density modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uervirojnangkoorn, Monarin; University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck; Hilgenfeld, Rolf, E-mail: hilgenfeld@biochem.uni-luebeck.de

    A genetic algorithm has been developed to optimize the phases of the strongest reflections in SIR/SAD data. This is shown to facilitate density modification and model building in several test cases. Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated, using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.

  4. Hydrologic risk analysis in the Yangtze River basin through coupling Gaussian mixtures into copulas

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Li, Y. P.; Huang, K.; Li, Z.

    2016-02-01

    In this study, a bivariate hydrologic risk framework is proposed through coupling Gaussian mixtures into copulas, leading to a coupled GMM-copula method, in which the marginal distributions of flood peak, volume and duration are quantified through Gaussian mixture models and the joint probability distributions of flood peak-volume, peak-duration and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period of flood variable pairs. The proposed method is applied to the risk analysis for the Yichang station on the main stream of the Yangtze River, China. The results indicate that (i) the bivariate risk for flood peak-volume would remain constant for flood volumes less than 1.0 × 10^5 (m^3/s)·day, but presents a significant decreasing trend for flood volumes larger than 1.7 × 10^5 (m^3/s)·day; and (ii) the bivariate risk for flood peak-duration would not change significantly for flood durations less than 8 days, and then decreases significantly as the duration becomes larger. The probability density functions (pdfs) of the flood volume and duration conditional on flood peak can also be generated through the fitted copulas. The results indicate that the conditional pdfs of flood volume and duration follow bimodal distributions, with the occurrence frequency of the first mode decreasing and that of the second increasing as the flood peak increases. The conclusions obtained from the bivariate hydrologic analysis can provide decision support for flood control and mitigation.
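
    The joint-return-period step can be sketched with the copula tools in statsmodels. The marginals below are plain lognormals rather than fitted Gaussian mixtures, and the dependence parameter and design levels are made up, so this is an illustration of the mechanics only:

        import numpy as np
        from scipy import stats
        from statsmodels.distributions.copula.api import GumbelCopula

        # Stand-in marginals for annual flood peak (m^3/s) and volume; the
        # paper fits Gaussian mixtures, plain lognormals keep the sketch short.
        peak = stats.lognorm(s=0.4, scale=3.0e4)
        volume = stats.lognorm(s=0.5, scale=8.0e4)
        cop = GumbelCopula(theta=2.0)  # dependence parameter (assumed)

        def joint_return_periods(x_peak, x_vol):
            u, v = peak.cdf(x_peak), volume.cdf(x_vol)
            c = float(cop.cdf([[u, v]]))
            t_or = 1.0 / (1.0 - c)           # either variable exceeds its level
            t_and = 1.0 / (1.0 - u - v + c)  # both variables exceed their levels
            return t_or, t_and

        print(joint_return_periods(6.0e4, 2.0e5))  # (T_or, T_and) in years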

  5. IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Khee-Gan; Hennawi, Joseph F.; Spergel, David N.

    2015-02-01

    The Lyα forest transmission probability distribution function (PDF) is an established probe of intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillation Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T0, where T(Δ) = T0 Δ^(γ-1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ~ -2 is required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean-transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T0 are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.

  6. Improved cosmological constraints on the curvature and equation of state of dark energy

    NASA Astrophysics Data System (ADS)

    Pan, Nana; Gong, Yungui; Chen, Yun; Zhu, Zong-Hong

    2010-08-01

    We apply the Constitution compilation of 397 Type Ia supernovae, the baryon acoustic oscillation measurements including the A parameter, the distance ratio and the radial data, the five-year Wilkinson Microwave Anisotropy Probe data and the Hubble parameter data to study the geometry of the Universe and the properties of dark energy by using the popular Chevallier-Polarski-Linder (CPL) and Jassal-Bagla-Padmanabhan (JBP) parameterizations. We compare the simple χ² method of joint contour estimation and the Markov chain Monte Carlo method, and find that it is necessary to perform a marginalized analysis for error estimation. The distributions of Ωk and wa in the CPL model are skewed, and the marginalized 1σ constraints are Ωm = 0.279 (+0.015, -0.008), Ωk = 0.005 (+0.006, -0.011), w0 = -1.05 (+0.23, -0.06) and wa = 0.5 (+0.3, -1.5). For the JBP model, the marginalized 1σ constraints are Ωm = 0.281 (+0.015, -0.01), Ωk = 0.000 (+0.007, -0.006), w0 = -0.96 (+0.25, -0.18) and wa = -0.6 (+1.9, -1.6). The equation-of-state parameter w(z) of dark energy is negative in the redshift range 0 ≤ z ≤ 2 at more than the 3σ level. The flat ΛCDM model is consistent with the current observational data at the 1σ level.
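
    The difference between slicing a joint χ² surface at its minimum and marginalizing it matters precisely when distributions are skewed, as reported here for Ωk and wa. A generic grid-marginalization sketch on a toy 2D posterior (not the paper's likelihood):

        import numpy as np

        # Toy correlated chi-squared over (w0, wa); all numbers are illustrative.
        w0 = np.linspace(-2.0, 0.0, 401)
        wa = np.linspace(-4.0, 3.0, 701)
        W0, WA = np.meshgrid(w0, wa, indexing="ij")
        chi2 = ((W0 + 1.05) / 0.15) ** 2 + (WA - 0.5 + 3.0 * (W0 + 1.05)) ** 2

        post = np.exp(-0.5 * (chi2 - chi2.min()))  # unnormalized posterior
        post_w0 = post.sum(axis=1)                 # marginalize over wa (grid sum)
        post_w0 /= post_w0.sum()

        cdf = np.cumsum(post_w0)                   # percentiles of the marginal
        lo, med, hi = np.interp([0.16, 0.50, 0.84], cdf, w0)
        print(f"w0 = {med:.3f} (+{hi - med:.3f}, {lo - med:.3f})")  # asymmetric errors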

  7. Early Paleozoic paleogeography of the northern Gondwana margin: new evidence for Ordovician-Silurian glaciation

    NASA Astrophysics Data System (ADS)

    Semtner, A.-K.; Klitzsch, E.

    1994-12-01

    During the Early Paleozoic, transgressions and the distribution of sedimentary facies on the northern Gondwana margin were controlled by a regional NNW-SSE to almost north-south striking structural relief. In Early Silurian times, a eustatic highstand enabled the sea to reach its maximum southward extent. The counterclockwise rotation of Gondwana during the Cambrian and Early Ordovician caused the northern Gondwana margin to shift from intertropical to southern polar latitudes in Ordovician times. Glacial and periglacial deposits are reported from many localities in Morocco, Algeria, Niger, Libya, Chad, Sudan, Jordan and Saudi Arabia. The Late Ordovician glaciation phase was followed by a period of a major glacioeustatic sea-level rise in the Early Silurian due to the retreat of the ice-cap. As a consequence of the decreasing water circulation in the basin centers (Central Arabia, Murzuk- and Ghadames basins), highly bituminous euxinic shales were deposited. These shales are considered to be the main source rock of Paleozoic oil and gas deposits in parts of Saudi Arabia, Libya and Algeria. The following regression in the southern parts of the Early Silurian sea was probably caused by a second glacial advance, which was mainly restricted to areas in Chad, Sudan and Niger. Evidence for glacial activity and fluvioglacial sedimentation is available from rocks overlying the basal Silurian shale in north-east Chad and north-west Sudan. The Early Silurian ice advance is considered to be responsible for the termination of euxinic shale deposition in the basin centers.

  8. An integrated geophysical study on the Mesozoic strata distribution and hydrocarbon potential in the South China Sea

    NASA Astrophysics Data System (ADS)

    Hu, Weijian; Hao, Tianyao; Jiang, Weiwei; Xu, Ya; Zhao, Baimin; Jiang, Didi

    2015-11-01

    A series of drilling, dredge, and seismic investigations indicates that Mesozoic sediments exist in the South China Sea (SCS), which shows a bright prospect for oil and gas exploration. In order to study the distribution of Mesozoic strata and their residual thicknesses in the SCS, we carried out an integrated geophysical study based mainly on gravity data; the gravity basement depth and the distribution of residual Mesozoic thickness in the SCS were obtained using gravity inversion constrained by high-precision drilling and seismic data. In addition, the fine deep crustal structures and the distribution characteristics of Mesozoic thicknesses along three typical profiles were obtained by gravity fitting inversion. Mesozoic strata in the SCS are mainly distributed in the southern and northern continental margins, and have been reworked by later tectonic activity. They extend in NE-trending stripes, are macro-controlled by deep, large NE-trending faults, and are cut by NW-trending faults that were active in later times. The NW-directed offset of Mesozoic strata in the Nansha area of the southern margin is more obvious than in the northern margin. In the Pearl River Mouth Basin and Southwest Taiwan Basin of the northern continental margin the Mesozoic sediments are continuously distributed with a relatively large thickness. In the Nansha area of the southern margin the Mesozoic strata are discontinuous and their thicknesses vary considerably. According to the characteristics of the Mesozoic thickness distribution and hydrocarbon potential analyses from drilling and other data, the Dongsha Uplift-Chaoshan Depression, Southwest Taiwan Basin-Peikang Uplift and Liyue Bank have large thicknesses of Mesozoic residual strata, good hydrocarbon generation capability and complete source-reservoir-cap combinations, and show a bright prospect for Mesozoic oil/gas resources.

  9. On the existence of black holes in distorted Schwarzschild spacetime using marginally trapped surfaces

    NASA Astrophysics Data System (ADS)

    Pilkington, Terry

    The classical definition of a black hole in terms of an event horizon relies on global properties of the spacetime. Realistic black holes have matter distributions surrounding them, which negates the asymptotic flatness needed for an event horizon. Using the (quasi-)local concept of marginally trapped surfaces, we investigate the Schwarzschild spacetime distorted by an axisymmetric matter distribution. We determine that it is possible to locate a future outer trapping horizon for a given foliation within certain value ranges of multipole moments. Furthermore, we show that there are no marginally trapped surfaces for arbitrary values of the multipole moment magnitudes. KEYWORDS: SCHWARZSCHILD; BLACK HOLE; DISTORTED SPACETIME; MARGINALLY TRAPPED SURFACE; FUTURE OUTER TRAPPING HORIZON

  10. Stochastic investigation of temperature process for climatic variability identification

    NASA Astrophysics Data System (ADS)

    Lerias, Eleutherios; Kalamioti, Anna; Dimitriadis, Panayiotis; Markonis, Yannis; Iliopoulou, Theano; Koutsoyiannis, Demetris

    2016-04-01

    The temperature process is considered as the most characteristic hydrometeorological process and has been thoroughly examined in the climate-change framework. We use a dataset comprising hourly temperature and dew point records to identify statistical variability with emphasis on the last period. Specifically, we investigate the occurrence of mean, maximum and minimum values and we estimate statistical properties such as marginal probability distribution function and the type of decay of the climacogram (i.e., mean process variance vs. scale) for various time periods. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
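
    The climacogram itself, the variance of the time-averaged process as a function of averaging scale, is simple to compute. A minimal sketch using simulated data in place of the hourly records:

        import numpy as np

        def climacogram(x, scales):
            """Sample variance of the k-scale time-averaged process for each
            aggregation scale k; a slow power-law decay with k indicates
            long-range dependence."""
            out = []
            for k in scales:
                n = len(x) // k
                means = x[:n * k].reshape(n, k).mean(axis=1)  # non-overlapping averages
                out.append(means.var(ddof=1))
            return np.array(out)

        # Simulated hourly temperature anomalies (white-noise placeholder).
        x = np.random.default_rng(2).normal(size=24 * 365 * 10)
        print(climacogram(x, [1, 2, 4, 8, 16, 32, 64]))  # ~1/k decay for white noise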

  11. Estimation of synthetic flood design hydrographs using a distributed rainfall-runoff model coupled with a copula-based single storm rainfall generator

    NASA Astrophysics Data System (ADS)

    Candela, A.; Brigandì, G.; Aronica, G. T.

    2014-07-01

    In this paper a procedure to derive synthetic flood design hydrographs (SFDH) using a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describes and models the correlation between two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model, is presented. Rainfall-runoff modelling (R-R modelling) for estimating the hydrological response at the outlet of a catchment was performed by using a conceptual fully distributed procedure based on the Soil Conservation Service - Curve Number method as an excess rainfall model and on a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the distributed unit hydrograph definition, was performed by implementing a procedure based on flow paths, determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the primary return period of the SFDH, which provides the probability of occurrence of a hydrograph flood, peaks and flow volumes obtained through R-R modelling were treated statistically using copulas. Finally, the shapes of hydrographs have been generated on the basis of historically significant flood events, via cluster analysis. An application of the procedure described above has been carried out and results presented for the case study of the Imera catchment in Sicily, Italy.
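
    The excess-rainfall component rests on the standard SCS Curve Number relation, which is worth writing out. A minimal sketch, using the conventional initial abstraction Ia = 0.2S; the storm depth and CN value are hypothetical:

        def scs_cn_runoff(p_mm, cn):
            """SCS Curve Number excess rainfall (mm) for a storm of depth p_mm,
            with initial abstraction Ia = 0.2*S."""
            s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
            ia = 0.2 * s
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        # Example: an 80 mm storm on a catchment with CN = 75.
        print(scs_cn_runoff(80.0, 75))  # about 27 mm of excess rainfall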

  12. Temperature drives abundance fluctuations, but spatial dynamics is constrained by landscape configuration: Implications for climate-driven range shift in a butterfly.

    PubMed

    Fourcade, Yoan; Ranius, Thomas; Öckinger, Erik

    2017-10-01

    Prediction of species distributions in an altered climate requires knowledge of how global- and local-scale factors interact to limit their current distributions. Such knowledge can be gained through studies of spatial population dynamics at climatic range margins. Here, using a butterfly (Pyrgus armoricanus) as model species, we first predicted based on species distribution modelling that its climatically suitable habitats currently extend north of its realized range. Projecting the model into scenarios of future climate, we showed that the distribution of climatically suitable habitats may shift northward by an additional 400 km in the future. Second, we used a 13-year monitoring dataset including the majority of all habitat patches at the species' northern range margin to assess the synergistic impact of temperature fluctuations and the spatial distribution of habitat, microclimatic conditions and habitat quality on abundance and colonization-extinction dynamics. The fluctuation in abundance between years was almost entirely determined by the variation in temperature during the species' larval development. In contrast, colonization and extinction dynamics were better explained by patch area, between-patch connectivity and host plant density. This suggests that the response of the species to future climate change may be limited by future land use and how its host plants respond to climate change. It is thus probable that dispersal limitation will prevent P. armoricanus from reaching its potential future distribution. We argue that models of range dynamics should consider the factors influencing metapopulation dynamics, especially at the range edges, and not only broad-scale climate. These include factors acting at the scale of habitat patches, such as habitat quality and microclimate, and landscape-scale factors such as the spatial configuration of potentially suitable patches. Knowledge of population dynamics under various environmental conditions, and the incorporation of realistic scenarios of future land use, appears essential to provide predictions useful for actions mitigating the negative effects of climate change. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.

  13. Geologic map of the Strawberry Butte 7.5’ quadrangle, Meagher County, Montana

    USGS Publications Warehouse

    Reynolds, Mitchell W.; Brandt, Theodore R.

    2017-06-19

    The 7.5′ Strawberry Butte quadrangle in Meagher County, Montana near the southwest margin of the Little Belt Mountains, encompasses two sharply different geologic terranes.  The northern three-quarters of the quadrangle are underlain mainly by Paleoproterozoic granite gneiss, across which Middle Cambrian sedimentary rocks rest unconformably.  An ancestral valley of probable late Eocene age, eroded northwest across the granite gneiss terrane, is filled with Oligocene basalt and overlying Miocene and Oligocene sandstone, siltstone, tuffaceous siltstone, and conglomerate.  The southern quarter of the quadrangle is underlain principally by deformed Mesoproterozoic sedimentary rocks of the Newland Formation, which are intruded by Eocene biotite hornblende dacite dikes.  In this southern terrane, Tertiary strata are exposed only in a limited area near the southeast margin of the quadrangle.  The distinct terranes are juxtaposed along the Volcano Valley fault zone—a zone of recurrent crustal movement beginning possibly in Mesoproterozoic time and certainly established from Neoproterozoic–Early Cambrian to late Tertiary time.  Movement along the fault zone has included normal faulting, the southern terrane faulted down relative to the northern terrane, some reverse faulting as the southern terrane later moved up against the northern terrane, and lateral movement during which the southern terrane likely moved west relative to the northern terrane.  Near the eastern margin of the quadrangle, the Newland Formation is locally the host of stratabound sulfide mineralization adjacent to the fault zone; west along the fault zone across the remainder of the quadrangle are significant areas and bands of hematite and iron-silicate mineral concentrations related to apparent alteration of iron sulfides.  The map defines the distribution of a variety of surficial deposits, including the distribution of hematite-rich colluvium and iron-silicate boulders.  The southeast corner of the quadrangle is the site of active exploration and potential development for copper from the sulfide-bearing strata of the Newland Formation.

  14. Positive margins prediction in breast cancer conservative surgery: Assessment of a preoperative web-based nomogram.

    PubMed

    Alves-Ribeiro, Lídia; Osório, Fernando; Amendoeira, Isabel; Fougo, José Luís

    2016-08-01

    Margin status of the surgical specimen has been shown to be a prognostic factor and a risk factor for local recurrence in breast cancer surgery. It has been studied as a target of intervention to diminish reoperation rates and reduce the probability of local recurrence in breast conserving surgery (BCS). This study aims to validate the Dutch BreastConservation! nomogram, created by Pleijhus et al., which predicts the preoperative probability of positive margins in BCS. Patients with a diagnosis of breast cancer stages cT1-2, who underwent BCS at the Breast Center of São João University Hospital (BC-CHSJ) in 2013-2014, were included. Association and correlation were evaluated for clinical, radiological, pathological and surgical variables. Multivariable logistic regression and ROC curves were used to assess nomogram parameters and discrimination. In our series of 253 patients, no associations were found between margin status and the other studied variables (such as age or family history of breast cancer), except for the weight (p-value = 0.045) and volume (p-value = 0.012) of the surgical specimen. Regarding the nomogram, a statistically significant association was shown between cN1 status and positive margins (p-value = 0.014). No differences were registered between the scores of patients with positive versus negative margins. Discrimination analysis showed an AUC of 0.474 for the basic model and 0.508 for the expanded model. We cannot assume its external validation or its applicability to our cohort. Further studies are needed to determine the validity of this nomogram and achieve a broader view of currently available tools. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Estimation of the Nonlinear Random Coefficient Model when Some Random Effects Are Separable

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Cudeck, Robert

    2009-01-01

    A method is presented for marginal maximum likelihood estimation of the nonlinear random coefficient model when the response function has some linear parameters. This is done by writing the marginal distribution of the repeated measures as a conditional distribution of the response given the nonlinear random effects. The resulting distribution…

  16. US refining margin trend: austerity continues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Should crude oil prices hold near current levels in 1988, US refining margins might improve little, if at all. If crude oil prices rise, margins could blush pink or worse. If they drop, US refiners would still probably not see much margin improvement. In fact, if crude prices fall, they could set off another free fall in products markets and threaten refiner survival. Volatility in refined products markets and low product demand growth are the underlying reasons for caution or pessimism as the new year approaches. Recent directional patterns in refining margins are scrutinized in this issue. This issue also contains the following: (1) the ED refining netback data for the US Gulf and West Coasts, Rotterdam, and Singapore for late November, 1987; and (2) the ED fuel price/tax series for countries of the Eastern Hemisphere, November, 1987 edition. 4 figures, 6 tables.

  17. A Regional View of the Margin: Salmonid Abundance and Distribution in the Southern Appalachian Mountains of North Carolina and Virginia

    Treesearch

    Patricia A. Flebbe

    1994-01-01

    In the southern Appalachian Mountains, native brook trout Salvelinus fontinalis and introduced rainbow trout Oncorhynchus mykiss and brown trout Salmo trutta are at the southern extremes of their distributions, an often overlooked kind of marginal habitat. At a regional scale composed of the states of Virginia...

  18. Bayesian models for cost-effectiveness analysis in the presence of structural zero costs

    PubMed Central

    Baio, Gianluca

    2014-01-01

    Bayesian modelling for cost-effectiveness data has received much attention in both the health economics and the statistical literature, in recent years. Cost-effectiveness data are characterised by a relatively complex structure of relationships linking a suitable measure of clinical benefit (e.g. quality-adjusted life years) and the associated costs. Simplifying assumptions, such as (bivariate) normality of the underlying distributions, are usually not granted, particularly for the cost variable, which is characterised by markedly skewed distributions. In addition, individual-level data sets are often characterised by the presence of structural zeros in the cost variable. Hurdle models can be used to account for the presence of excess zeros in a distribution and have been applied in the context of cost data. We extend their application to cost-effectiveness data, defining a full Bayesian specification, which consists of a model for the individual probability of null costs, a marginal model for the costs and a conditional model for the measure of effectiveness (given the observed costs). We present the model using a working example to describe its main features. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:24343868
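
    The hurdle structure, a model for the probability of a null cost combined with a conditional model for the positive costs, can be sketched in a simple frequentist two-part form (the paper develops a full Bayesian specification). All data and parameter values below are simulated and illustrative:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 500
        age = rng.normal(60, 10, n)
        X = sm.add_constant((age - 60) / 10)

        # Simulated data: structural zeros plus lognormal positive costs.
        p_zero = 1 / (1 + np.exp(-(-1.0 + 0.3 * X[:, 1])))
        is_zero = rng.random(n) < p_zero
        cost = np.where(is_zero, 0.0, rng.lognormal(7 + 0.2 * X[:, 1], 0.8))

        # Part 1: probability of a null cost (the "hurdle").
        hurdle = sm.Logit(is_zero.astype(float), X).fit(disp=0)

        # Part 2: model for log-costs, fitted on the positive observations only.
        pos = cost > 0
        positive = sm.OLS(np.log(cost[pos]), X[pos]).fit()

        # Marginal mean cost combines both parts (lognormal mean correction,
        # using the residual variance estimate positive.scale).
        mean_cost = (1 - hurdle.predict(X)) * np.exp(positive.predict(X) + positive.scale / 2)
        print(mean_cost[:5])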

  19. Bayesian models for cost-effectiveness analysis in the presence of structural zero costs.

    PubMed

    Baio, Gianluca

    2014-05-20

    Bayesian modelling for cost-effectiveness data has received much attention in both the health economics and the statistical literature, in recent years. Cost-effectiveness data are characterised by a relatively complex structure of relationships linking a suitable measure of clinical benefit (e.g. quality-adjusted life years) and the associated costs. Simplifying assumptions, such as (bivariate) normality of the underlying distributions, are usually not granted, particularly for the cost variable, which is characterised by markedly skewed distributions. In addition, individual-level data sets are often characterised by the presence of structural zeros in the cost variable. Hurdle models can be used to account for the presence of excess zeros in a distribution and have been applied in the context of cost data. We extend their application to cost-effectiveness data, defining a full Bayesian specification, which consists of a model for the individual probability of null costs, a marginal model for the costs and a conditional model for the measure of effectiveness (given the observed costs). We present the model using a working example to describe its main features. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  20. Management of Water Quantity and Quality Based on Copula for a Tributary to Miyun Reservoir, Beijing

    NASA Astrophysics Data System (ADS)

    Zang, N.; Wang, X.; Liang, P.

    2017-12-01

    Due to the complex mutual influence between the water quantity and water quality of a river, it is difficult to capture the actual character of the tributaries feeding a reservoir. In this study, acceptable marginal probability distributions for the water quantity and quality of reservoir inflow were calculated. A bivariate Archimedean copula was further applied to establish their joint distribution function. Multiple combination scenarios of water quantity and water quality were then designed to analyze their coexistence relationship and reservoir management strategies. The Bai river, an important tributary of the Miyun Reservoir, was taken as a case study. The results showed that it is feasible to apply the Frank copula function to describe the joint distribution of water quality and water quantity for the Bai river. Furthermore, the monitoring of TP concentration needs to be strengthened in the Bai river. This methodology can be extended to larger dimensions and is transferable to other reservoirs via establishment of models with relevant data for a particular area. Our findings help in better analyzing the coexistence relationship and degree of influence between the water quantity and quality of a tributary to a reservoir for the purpose of water resources protection.

  1. Seismic evidence of Messinian salt in opposite margins of West Mediterranean

    NASA Astrophysics Data System (ADS)

    Mocnik, Arianna; Camerlenghi, Angelo; Del Ben, Anna; Geletti, Riccardo; Wardell, Nigel; Zgur, Fabrizio

    2015-04-01

    The post-drift Messinian Salinity Crisis (MSC) affected the whole Mediterranean basin, with deposition of evaporitic sequences in the deep basins, on the lower continental slopes, and in several shallower marginal basins; on the continental margins, the MSC typically produced noticeable erosional truncations that locally cause important hiatuses in the pre-Messinian sequences, covered by the Plio-Quaternary sediments. In this work we focus on the MSC seismic signature in two new seismic datasets acquired in 2010 (West Sardinia offshore) and in 2012 (within the Eurofleet project SALTFLU in the South Balearic continental margin and the northern Algero abyssal plain). The "Messinian trilogy" recognized in the West Mediterranean abyssal plain is characterized by different seismic facies: the Lower evaporite Unit (LU), the salt Mobile Unit (MU) and the Upper evaporite, mainly gypsiferous, Unit (UU). Both seismic datasets show the presence of the Messinian trilogy, although the LU is not always clearly interpretable owing to strong seismic signal absorption by the halite layers; the salt thickness of the MU is similar in both basins, as are the thickness and stratigraphy of the UU. The Upper Unit (UU) is made up of a well-reflecting package of about 10 reflectors, partially deformed by salt tectonics and characterized by a thin transparent layer that we interpret as a salt sequence within the shallower part of the UU. Below the stratified UU, the MU exhibits a transparent layer in the deep basin and also at the foot of the slope, where a negative reflector, related to the high interval velocity of salt, marks its base. The halokinetic processes are not homogeneously distributed in the region, forming a great number of diapirs at the foot of the slope (due to the pressure of sediments slid downslope) and giant domes toward the deep basin (due to the greater thickness of the Plio-Quaternary sediments). This distribution seems to be related to the amount of salt and of the sedimentary cover. During the MSC the margins of the West Mediterranean Sea appear to have been involved in tectonic events probably connected to reactivation of normal faults and to the fast variation of the water load related to sea-level fluctuations. The absence of calibrating boreholes in the deep Mediterranean basins and the poor penetration of seismic energy below the evaporitic layers limit our knowledge of the geological evolution of the basins; the interpretation of the presented datasets can contribute to the understanding of evaporite deposition and early-stage salt deformation during the MSC in the Mediterranean Sea.

  2. Corneal inflammatory events with daily silicone hydrogel lens wear.

    PubMed

    Szczotka-Flynn, Loretta; Jiang, Ying; Raghupathy, Sangeetha; Bielefeld, Roger A; Garvey, Matthew T; Jacobs, Michael R; Kern, Jami; Debanne, Sara M

    2014-01-01

    This study aimed to determine the probability and risk factors for developing a corneal inflammatory event (CIE) during daily wear of lotrafilcon A silicone hydrogel contact lenses. Eligible participants (n = 218) were fit with lotrafilcon A lenses for daily wear and followed up for 12 months. Participants were randomized to either a polyhexamethylene biguanide-preserved multipurpose solution or a one-step peroxide disinfection system. The main exposures of interest were bacterial contamination of lenses, cases, lid margins, and ocular surface. Kaplan-Meier (KM) plots were used to estimate the cumulative unadjusted probability of remaining free from a CIE, and multivariate Cox proportional hazards regression was used to model the hazard of experiencing a CIE. The KM unadjusted cumulative probability of remaining free from a CIE for both lens care groups combined was 92.3% (95% confidence interval [CI], 88.1 to 96.5%). There was one participant with microbial keratitis, five participants with asymptomatic infiltrates, and seven participants with contact lens peripheral ulcers, providing KM survival estimates of 92.8% (95% CI, 88.6 to 96.9%) and 98.1% (95% CI, 95.8 to 100.0%) for remaining free from noninfectious and symptomatic CIEs, respectively. The presence of substantial (>100 colony-forming units) coagulase-negative staphylococci bioburden on lid margins was associated with about a five-fold increased risk for the development of a CIE (p = 0.04). The probability of experiencing a CIE during daily wear of lotrafilcon A contact lenses is low, and symptomatic CIEs are rare. Patient factors, such as high levels of bacterial bioburden on lid margins, contribute to the development of noninfectious CIEs during daily wear of silicone hydrogel lenses.
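
    The quoted survival estimates use the Kaplan-Meier product-limit estimator, which is short enough to write out directly. A minimal sketch with made-up follow-up data, not the study's:

        import numpy as np

        def kaplan_meier(time, event):
            """Kaplan-Meier survival curve: time = follow-up in days,
            event = 1 if a CIE occurred, 0 if censored."""
            order = np.argsort(time)
            time, event = np.asarray(time)[order], np.asarray(event)[order]
            surv, s = [], 1.0
            for t in np.unique(time[event == 1]):
                at_risk = np.sum(time >= t)               # still in follow-up at t
                d = np.sum((time == t) & (event == 1))    # events at t
                s *= 1.0 - d / at_risk
                surv.append((t, s))
            return surv

        # Toy data: 20 wearers, 3 CIEs, the rest censored at 365 days.
        time = [90, 180, 300] + [365] * 17
        event = [1, 1, 1] + [0] * 17
        print(kaplan_meier(time, event))  # probability of remaining CIE-free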

  3. A Bayesian Surrogate for Regional Skew in Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Kuczera, George

    1983-06-01

    The problem of how to best utilize site and regional flood data to infer the shape parameter of a flood distribution is considered. One approach to this problem is given in Bulletin 17B of the U.S. Water Resources Council (1981) for the log-Pearson distribution. Here a lesser known distribution is considered, namely, the power normal which fits flood data as well as the log-Pearson and has a shape parameter denoted by λ derived from a Box-Cox power transformation. The problem of regionalizing λ is considered from an empirical Bayes perspective where site and regional flood data are used to infer λ. The distortive effects of spatial correlation and heterogeneity of site sampling variance of λ are explicitly studied with spatial correlation being found to be of secondary importance. The end product of this analysis is the posterior distribution of the power normal parameters expressing, in probabilistic terms, what is known about the parameters given site flood data and regional information on λ. This distribution can be used to provide the designer with several types of information. The posterior distribution of the T-year flood is derived. The effect of nonlinearity in λ on inference is illustrated. Because uncertainty in λ is explicitly allowed for, the understatement in confidence limits due to fixing λ (analogous to fixing log skew) is avoided. Finally, it is shown how to obtain the marginal flood distribution which can be used to select a design flood with specified exceedance probability.

  4. Insights into the dynamics of planetary interiors obtained through the study of global distribution of volcanoes I: Empirical calibration on Earth

    NASA Astrophysics Data System (ADS)

    Cañon-Tapia, Edgardo; Mendoza-Borunda, Ramón

    2014-06-01

    The distribution of volcanic features is ultimately controlled by processes taking place beneath the surface of a planet. For this reason, characterization of volcano distribution at a global scale can be used to obtain insights concerning dynamic aspects of planetary interiors. Until present, studies of this type have focused on volcanic features of a specific type, or have concentrated on relatively small regions. In this paper, (the first of a series of three papers) we describe the distribution of volcanic features observed over the entire surface of the Earth, combining an extensive database of submarine and subaerial volcanoes. The analysis is based on spatial density contours obtained with the Fisher kernel. Based on an empirical approach that makes no a priori assumptions concerning the number of modes that should characterize the density distribution of volcanism we identified the most significant modes. Using those modes as a base, the relevant distance for the formation of clusters of volcanoes is constrained to be on the order of 100 to 200 km. In addition, it is noted that the most significant modes lead to the identification of clusters that outline the most important tectonic margins on Earth without the need of making any ad hoc assumptions. Consequently, we suggest that this method has the potential of yielding insights about the probable occurrence of tectonic features within other planets.

  5. Locational Marginal Pricing in the Campus Power System at the Power Distribution Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, Jun; Gu, Yi; Zhang, Yingchen

    2016-11-14

    In the development of smart grid at distribution level, the realization of real-time nodal pricing is one of the key challenges. The research work in this paper implements and studies the methodology of locational marginal pricing at distribution level based on a real-world distribution power system. The pricing mechanism utilizes optimal power flow to calculate the corresponding distributional nodal prices. Both Direct Current Optimal Power Flow and Alternate Current Optimal Power Flow are utilized to calculate and analyze the nodal prices. The University of Denver campus power grid is used as the power distribution system test bed to demonstrate themore » pricing methodology.« less

  6. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete distributions of probabilities of independent random values were received. Their one-dimensional distribution is widely used in probability theory. Producing functions of those multidimensional distributions were also received.

  7. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.

  8. Estimating Model Probabilities using Thermodynamic Markov Chain Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Ye, M.; Liu, P.; Beerli, P.; Lu, D.; Hill, M. C.

    2014-12-01

    Markov chain Monte Carlo (MCMC) methods are widely used to evaluate model probability for quantifying model uncertainty. In a general procedure, MCMC simulations are first conducted for each individual model, and MCMC parameter samples are then used to approximate marginal likelihood of the model by calculating the geometric mean of the joint likelihood of the model and its parameters. It has been found the method of evaluating geometric mean suffers from the numerical problem of low convergence rate. A simple test case shows that even millions of MCMC samples are insufficient to yield accurate estimation of the marginal likelihood. To resolve this problem, a thermodynamic method is used to have multiple MCMC runs with different values of a heating coefficient between zero and one. When the heating coefficient is zero, the MCMC run is equivalent to a random walk MC in the prior parameter space; when the heating coefficient is one, the MCMC run is the conventional one. For a simple case with analytical form of the marginal likelihood, the thermodynamic method yields more accurate estimate than the method of using geometric mean. This is also demonstrated for a case of groundwater modeling with consideration of four alternative models postulated based on different conceptualization of a confining layer. This groundwater example shows that model probabilities estimated using the thermodynamic method are more reasonable than those obtained using the geometric method. The thermodynamic method is general, and can be used for a wide range of environmental problem for model uncertainty quantification.

  9. LES/PDF studies of joint statistics of mixture fraction and progress variable in piloted methane jet flames with inhomogeneous inlet flows

    NASA Astrophysics Data System (ADS)

    Zhang, Pei; Barlow, Robert; Masri, Assaad; Wang, Haifeng

    2016-11-01

    The mixture fraction and progress variable are often used as independent variables for describing turbulent premixed and non-premixed flames. There is a growing interest in using these two variables for describing partially premixed flames. The joint statistical distribution of the mixture fraction and progress variable is of great interest in developing models for partially premixed flames. In this work, we conduct predictive studies of the joint statistics of mixture fraction and progress variable in a series of piloted methane jet flames with inhomogeneous inlet flows. The employed models combine large eddy simulations with the Monte Carlo probability density function (PDF) method. The joint PDFs and marginal PDFs are examined in detail by comparing the model predictions and the measurements. Different presumed shapes of the joint PDFs are also evaluated.

  10. System Guidelines for EMC Safety-Critical Circuits: Design, Selection, and Margin Demonstration

    NASA Technical Reports Server (NTRS)

    Lawton, R. M.

    1996-01-01

    Demonstration of required safety margins on critical electrical/electronic circuits in large complex systems has become an implementation and cost problem. These margins are the difference between the activation level of the circuit and the electrical noise on the circuit in the actual operating environment. This document discusses the origin of the requirement and gives a detailed process flow for the identification of the system electromagnetic compatibility (EMC) critical circuit list. The process flow discusses the roles of engineering disciplines such as systems engineering, safety, and EMC. Design and analysis guidelines are provided to assist the designer in assuring the system design has a high probability of meeting the margin requirements. Examples of approaches used on actual programs (Skylab and Space Shuttle Solid Rocket Booster) are provided to show how variations of the approach can be used successfully.

  11. Determinants of Pseudogymnoascus destructans within bat hibernacula: implications for surveillance and management of white-nose syndrome.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Richgels, Katherine L D; Olival, Kevin J; Epstein, Jonathan H; Blehert, David S

    2018-01-01

    1. Fungal diseases are an emerging global problem affecting human health, food security and biodiversity. Ability of many fungal pathogens to persist within environmental reservoirs can increase extinction risks for host species and presents challenges for disease control. Understanding factors that regulate pathogen spread and persistence in these reservoirs is critical for effective disease management. 2. White-nose syndrome (WNS) is a disease of hibernating bats caused by Pseudogymnoascus destructans ( Pd ), a fungus that establishes persistent environmental reservoirs within bat hibernacula, which contribute to seasonal disease transmission dynamics in bats. However, host and environmental factors influencing distribution of Pd within these reservoirs are unknown. 3. We used model selection on longitudinally collected field data to test multiple hypotheses describing presence-absence and abundance of Pd in environmental substrates and on bats within hibernacula at different stages of WNS. 4. First detection of Pd in the environment lagged up to one year after first detection on bats within that hibernaculum. Once detected, the probability of detecting Pd within environmental samples from a hibernaculum increased over time and was higher in sediment compared to wall surfaces. Temperature had marginal effects on the distribution of Pd . For bats, prevalence and abundance of Pd were highest on Myotis lucifugus and on bats with visible signs of WNS. 5. Synthesis and applications . Our results indicate that distribution of Pseudogymnoascus destructans ( Pd ) within a hibernaculum is driven primarily by bats with delayed establishment of environmental reservoirs. Thus, collection of samples from Myotis lucifugus , or from sediment if bats cannot be sampled, should be prioritized to improve detection probabilities for Pd surveillance. Long-term persistence of Pd in sediment suggests that disease management for white-nose syndrome should address risks of sustained transmission from environmental reservoirs.

  12. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible to exist, since both variables to some extent are driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques has the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, which is a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extract extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed sample. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, first an investigation regarding the asymptotic properties of extremal dependence was carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable to model bivariate extreme values, which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.

  13. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl{sub C}, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expressionmore » that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations on cosmological parameters in the transition region is negligible in terms of cosmological parameters for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.« less

  14. From Weakly Chaotic Dynamics to Deterministic Subdiffusion via Copula Modeling

    NASA Astrophysics Data System (ADS)

    Nazé, Pierre

    2018-03-01

    Copula modeling consists in finding a probabilistic distribution, called copula, whereby its coupling with the marginal distributions of a set of random variables produces their joint distribution. The present work aims to use this technique to connect the statistical distributions of weakly chaotic dynamics and deterministic subdiffusion. More precisely, we decompose the jumps distribution of Geisel-Thomae map into a bivariate one and determine the marginal and copula distributions respectively by infinite ergodic theory and statistical inference techniques. We verify therefore that the characteristic tail distribution of subdiffusion is an extreme value copula coupling Mittag-Leffler distributions. We also present a method to calculate the exact copula and joint distributions in the case where weakly chaotic dynamics and deterministic subdiffusion statistical distributions are already known. Numerical simulations and consistency with the dynamical aspects of the map support our results.

  15. Paleomagnetic constraints on the interpretation of early Cenozoic Pacific Northwest paleogeography

    USGS Publications Warehouse

    Wells, Ray E.

    1984-01-01

    Widespread Cenozoic clockwise tectonic rotation in the Pacific Northwest is an established fact; however, the geologic reconstructions based on these rotations are the subject of continuing debate. Three basic mechanisms have been proposed to explain the rotations: (1) simple shear rotation of marginal terranes caught in the dextral shear couple between oceanic plates and North America; (2) rotation during oblique microplate collision and accretion to the continental margin; and (3) rotation of continental margin areas during episodes of intracontinental extension. In areas where detailed structure and stratigraphy are available, distributed shear rotations are amplv demonstrated paleomagnetically. However, rotation due to asymmetric interarc extension must be significant, especially for the Oregon Coast Range, in light of recent estimates of large Tertiary extension across the northern Basin and Range. The relative importance of shear versus extension is difficult to determine, but shear could account for nearly onehalf of the observed rotations. Oblique microplate collision has not contributed significantly to the observed Cenozoic rotations because most of the rotation post-dates collision-related deformation in the Oregon and Washington. Coast Range. The resultant continental reconstructions suggest that about 300 km of extension has occurred at 42°N. latitude (southern Oregon border) since early Eocene time. This reconstruction suggests that Cretaceous sedimentary basins east of the Klamath Mountains have undergone significant Tertiary extension (about f<0%) , but little rotation. Upper Cretaceous sedimentary rocks in the Blue Mountains of Oregon near Mitchell are probably rotated at least 15° and perhaps as much as 60°, which allows considerable latitude in the restoration of that part of the basin.

  16. Late Quaternary stratigraphy and sedimentation patterns in the western Arctic Ocean

    USGS Publications Warehouse

    Polyak, L.; Bischof, J.; Ortiz, J.D.; Darby, D.A.; Channell, J.E.T.; Xuan, C.; Kaufman, D.S.; Lovlie, R.; Schneider, D.A.; Eberl, D.D.; Adler, R.E.; Council, E.A.

    2009-01-01

    Sediment cores from the western Arctic Ocean obtained on the 2005 HOTRAX and some earlier expeditions have been analyzed to develop a stratigraphic correlation from the Alaskan Chukchi margin to the Northwind and Mendeleev-Alpha ridges. The correlation was primarily based on terrigenous sediment composition that is not affected by diagenetic processes as strongly as the biogenic component, and paleomagnetic inclination records. Chronostratigraphic control was provided by 14C dating and amino-acid racemization ages, as well as correlation to earlier established Arctic Ocean stratigraphies. Distribution of sedimentary units across the western Arctic indicates that sedimentation rates decrease from tens of centimeters per kyr on the Alaskan margin to a few centimeters on the southern ends of Northwind and Mendeleev ridges and just a few millimeters on the ridges in the interior of the Amerasia basin. This sedimentation pattern suggests that Late Quaternary sediment transport and deposition, except for turbidites at the basin bottom, were generally controlled by ice concentration (and thus melt-out rate) and transportation distance from sources, with local variances related to subsurface currents. In the long term, most sediment was probably delivered to the core sites by icebergs during glacial periods, with a significant contribution from sea ice. During glacial maxima very fine-grained sediment was deposited with sedimentation rates greatly reduced away from the margins to a hiatus of several kyr duration as shown for the Last Glacial Maximum. This sedimentary environment was possibly related to a very solid ice cover and reduced melt-out over a large part of the western Arctic Ocean.

  17. Oligocene to Holocene sediment drifts and bottom currents on the slope of Gabon continental margin (west Africa). Consequences for sedimentation and southeast Atlantic upwelling

    NASA Astrophysics Data System (ADS)

    Séranne, Michel; Nzé Abeigne, César-Rostand

    1999-10-01

    Seismic reflection profiles on the slope of the south Gabon continental margin display furrows 2 km wide and some 200 m deep, that develop normal to the margin in 500-1500 m water depth. Furrows are characterised by an aggradation/progradation pattern which leads to margin-parallel, northwestward migration of their axes through time. These structures, previously interpreted as turbidity current channels, display the distinctive seismic image and internal organisation of sediment drifts, constructed by the activity of bottom currents. Sediment drifts were initiated above a major Oligocene unconformity, and they developed within a Oligocene to Present megasequence of general progradation of the margin, whilst they are markedly absent from the underlying Late Cretaceous-Eocene aggradation megasequence. The presence of upslope migrating sediment waves, and the northwest migration of the sediment drifts indicate deposition by bottom current flowing upslope, under the influence of the Coriolis force. Such landwards-directed bottom currents on the slope probably represent coastal upwelling, which has been active along the west Africa margin throughout the Neogene.

  18. Marginal integrity of resin composite restorations restored with PPD initiatorcontaining resin composite cured by QTH, monowave and polywave LED units.

    PubMed

    Bortolotto, Tissiana; Betancourt, Francisco; Krejci, Ivo

    2016-12-01

    This study evaluated the influence of curing devices on marginal adaptation of cavities restored with self-etching adhesive containing CQ and PPD initiators and hybrid composite. Twenty-four class V (3 groups, n=8) with margins located on enamel and dentin were restored with Clearfil S3 Bond and Clearfil APX PLT, light-cured with a monowave LED, multiwave LED and halogen light-curing unit (LCU). Marginal adaptation was evaluated with SEM before/after thermo-mechanical loading (TML). On enamel, significantly lower % continuous margins (74.5±12.6) were found in group cured by multiwave LED when compared to monowave LED (87.6±9.5) and halogen LCU (94.4±9.1). The presence of enamel and composite fractures was significantly higher in the group light-cured with multiwave LED, probably due to an increased materials' friability resulted from an improved degree of cure. The clinician should aware that due to a distinct activation of both initiators, marginal quality may be influenced on the long-term.

  19. Transformation from Paleo-Asian Ocean closure to Paleo-Pacific subduction: New constraints from granitoids in the eastern Jilin-Heilongjiang Belt, NE China

    NASA Astrophysics Data System (ADS)

    Ma, Xing-Hua; Zhu, Wen-Ping; Zhou, Zhen-Hua; Qiao, Shi-Lei

    2017-08-01

    The eastern Jilin-Heilongjiang Belt (EJHB) of NE China is a unique orogen that underwent two stages of evolution within the tectonic regimes of the Paleo-Asian and Paleo-Pacific oceans. 158 available zircon U-Pb ages, including 26 ages obtained during the present study and 132 ages from the literature, were compiled and analyzed for the Mesozoic and Cenozoic granitoids from the EJHB and the adjacent Russian Sikhote-Alin Orogenic Belt (SAOB), to examine the temporal-spatial distribution of the granitoids and to constrain the tectonic evolution of the East Asian continental margin. Five stages of granitic magmatism can be identified: Early Triassic (251-240 Ma), Late Triassic (228-215 Ma), latest Triassic to Middle Jurassic (213-158 Ma), Early Cretaceous (131-105 Ma), and Late Cretaceous to Paleocene (95-56 Ma). The Early Triassic granitoids are restricted to the Yanbian region along the Changchun-Yanji Suture, and show geochemical characteristics of magmas from a thickened lower crust source, probably due to the final collision of the combined NE China blocks with the North China Craton. The Late Triassic granitoids, with features of A-type granites, represent post-collisional magmatic activities that were related to post-orogenic extension, marking the end of the tectonic evolution of the Paleo-Asian Ocean. The latest Triassic to Paleocene granitoids with calc-alkaline characteristics were NE-trending emplaced along the EJHB and SAOB and young towards the coastal region, and represent continental marginal arc magmas that were associated with the northwestwards subduction of the Paleo-Pacific Plate. Two periods of magmatic quiescence (158-131 and 105-95 Ma) correspond to changes in the subduction direction of the Paleo-Pacific Plate from oblique relative to the continental margin to subparallel. Taking all this into account, we conclude that: (1) the final closure of the Paleo-Asian Ocean occurred along the Changchun-Yanji Suture during the Early Triassic; (2) the onset of the subduction of the Paleo-Pacific Plate beneath the East Asian continental margin probably took place during the latest Triassic (ca. 215 Ma); (3) changes in the drifting direction of the Paleo-Pacific Plate were responsible for the intermittent magmatic activities; and (4) roll-back of the subducted plate resulted in the oceanwards migration of the magmatic arc and large-scale back-arc extension throughout NE China during the Early Cretaceous.

  20. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).

  1. Discretized kinetic theory on scale-free networks

    NASA Astrophysics Data System (ADS)

    Bertotti, Maria Letizia; Modanese, Giovanni

    2016-10-01

    The network of interpersonal connections is one of the possible heterogeneous factors which affect the income distribution emerging from micro-to-macro economic models. In this paper we equip our model discussed in [1, 2] with a network structure. The model is based on a system of n differential equations of the kinetic discretized-Boltzmann kind. The network structure is incorporated in a probabilistic way, through the introduction of a link density P(α) and of correlation coefficients P(β|α), which give the conditioned probability that an individual with α links is connected to one with β links. We study the properties of the equations and give analytical results concerning the existence, normalization and positivity of the solutions. For a fixed network with P(α) = c/α q , we investigate numerically the dependence of the detailed and marginal equilibrium distributions on the initial conditions and on the exponent q. Our results are compatible with those obtained from the Bouchaud-Mezard model and from agent-based simulations, and provide additional information about the dependence of the individual income on the level of connectivity.

  2. Distribution of total and fecal coliform organisms from septic effluent in selected coastal plain soils.

    PubMed Central

    Reneau, R B; Pettry, D E; Shanholtz, M I; Graham, S A; Weston, C W

    1977-01-01

    Distribution of total and fecal coliform bacteria in three Atlantic coastal plain soils in Virginia were monitored in situ over a 3-year period. The soils studied were Varina, Goldsboro, and Beltsville sandy loams. These and similar soils are found extensively along the populous Atlantic seaboard of the United States. They are considered only marginally suitable for septic tank installation because the restricting soil layers result in the subsequent development of seasonal perched water tables. To determine both horizontal and vertical movement of indicator organisms, samples were collected from piezometers placed at selected distances and depths from the drainfields in the direction of the ground water flow. Large reductions in total and fecal coliform bacteria were noted in the perched ground waters above the restricting layers as distance from the drainfield increased. These restricting soil layers appear to be effective barriers to the vertical movement of indicator organisms. The reduction in the density of the coliform bacteria above the restricting soil layers can probably be attributed to dilution, filtration, and dieoff as the bacteria move through the natural soil systems. PMID:325589

  3. Climate change and the decline of a once common bird.

    PubMed

    McClure, Christopher J W; Rolek, Brian W; McDonald, Kenneth; Hill, Geoffrey E

    2012-02-01

    Climate change is predicted to negatively impact wildlife through a variety of mechanisms including retraction of range. We used data from the North American Breeding Bird Survey and regional and global climate indices to examine the effects of climate change on the breeding distribution of the Rusty Blackbird (Euphagus carolinus), a formerly common species that is rapidly declining. We found that the range of the Rusty Blackbird retracted northward by 143 km since the 1960s and that the probability of local extinction was highest at the southern range margin. Furthermore, we found that the mean breeding latitude of the Rusty Blackbird was significant and positively correlated with the Pacific Decadal Oscillation with a lag of six years. Because the annual distribution of the Rusty Blackbird is affected by annual weather patterns produced by the Pacific Decadal Oscillation, our results support the hypothesis that directional climate change over the past 40 years is contributing to the decline of the Rusty Blackbird. Our study is the first to implicate climate change, acting through range retraction, in a major decline of a formerly common bird species.

  4. Uncertainties in Parameters Estimated with Neural Networks: Application to Strong Gravitational Lensing

    DOE PAGES

    Perreault Levasseur, Laurence; Hezaveh, Yashar D.; Wechsler, Risa H.

    2017-11-15

    In Hezaveh et al. (2017) we showed that deep learning can be used for model parameter estimation and trained convolutional neural networks to determine the parameters of strong gravitational lensing systems. Here we demonstrate a method for obtaining the uncertainties of these parameters. We review the framework of variational inference to obtain approximate posteriors of Bayesian neural networks and apply it to a network trained to estimate the parameters of the Singular Isothermal Ellipsoid plus external shear and total flux magnification. We show that the method can capture the uncertainties due to different levels of noise in the input data,more » as well as training and architecture-related errors made by the network. To evaluate the accuracy of the resulting uncertainties, we calculate the coverage probabilities of marginalized distributions for each lensing parameter. By tuning a single hyperparameter, the dropout rate, we obtain coverage probabilities approximately equal to the confidence levels for which they were calculated, resulting in accurate and precise uncertainty estimates. Our results suggest that neural networks can be a fast alternative to Monte Carlo Markov Chains for parameter uncertainty estimation in many practical applications, allowing more than seven orders of magnitude improvement in speed.« less

  5. Uncertainties in Parameters Estimated with Neural Networks: Application to Strong Gravitational Lensing

    NASA Astrophysics Data System (ADS)

    Perreault Levasseur, Laurence; Hezaveh, Yashar D.; Wechsler, Risa H.

    2017-11-01

    In Hezaveh et al. we showed that deep learning can be used for model parameter estimation and trained convolutional neural networks to determine the parameters of strong gravitational-lensing systems. Here we demonstrate a method for obtaining the uncertainties of these parameters. We review the framework of variational inference to obtain approximate posteriors of Bayesian neural networks and apply it to a network trained to estimate the parameters of the Singular Isothermal Ellipsoid plus external shear and total flux magnification. We show that the method can capture the uncertainties due to different levels of noise in the input data, as well as training and architecture-related errors made by the network. To evaluate the accuracy of the resulting uncertainties, we calculate the coverage probabilities of marginalized distributions for each lensing parameter. By tuning a single variational parameter, the dropout rate, we obtain coverage probabilities approximately equal to the confidence levels for which they were calculated, resulting in accurate and precise uncertainty estimates. Our results suggest that the application of approximate Bayesian neural networks to astrophysical modeling problems can be a fast alternative to Monte Carlo Markov Chains, allowing orders of magnitude improvement in speed.

  6. Uncertainties in Parameters Estimated with Neural Networks: Application to Strong Gravitational Lensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perreault Levasseur, Laurence; Hezaveh, Yashar D.; Wechsler, Risa H.

    In Hezaveh et al. (2017) we showed that deep learning can be used for model parameter estimation and trained convolutional neural networks to determine the parameters of strong gravitational lensing systems. Here we demonstrate a method for obtaining the uncertainties of these parameters. We review the framework of variational inference to obtain approximate posteriors of Bayesian neural networks and apply it to a network trained to estimate the parameters of the Singular Isothermal Ellipsoid plus external shear and total flux magnification. We show that the method can capture the uncertainties due to different levels of noise in the input data,more » as well as training and architecture-related errors made by the network. To evaluate the accuracy of the resulting uncertainties, we calculate the coverage probabilities of marginalized distributions for each lensing parameter. By tuning a single hyperparameter, the dropout rate, we obtain coverage probabilities approximately equal to the confidence levels for which they were calculated, resulting in accurate and precise uncertainty estimates. Our results suggest that neural networks can be a fast alternative to Monte Carlo Markov Chains for parameter uncertainty estimation in many practical applications, allowing more than seven orders of magnitude improvement in speed.« less

  7. Probabilistic modelling of drought events in China via 2-dimensional joint copula

    NASA Astrophysics Data System (ADS)

    Ayantobo, Olusola O.; Li, Yi; Song, Songbai; Javed, Tehseen; Yao, Ning

    2018-04-01

    Probabilistic modelling of drought events is a significant aspect of water resources management and planning. In this study, popularly applied and several relatively new bivariate Archimedean copulas were employed to derive regional and spatial based copula models to appraise drought risk in mainland China over 1961-2013. Drought duration (Dd), severity (Ds), and peak (Dp), as indicated by Standardized Precipitation Evapotranspiration Index (SPEI), were extracted according to the run theory and fitted with suitable marginal distributions. The maximum likelihood estimation (MLE) and curve fitting method (CFM) were used to estimate the copula parameters of nineteen bivariate Archimedean copulas. Drought probabilities and return periods were analysed based on appropriate bivariate copula in sub-region I-VII and entire mainland China. The goodness-of-fit tests as indicated by the CFM showed that copula NN19 in sub-regions III, IV, V, VI and mainland China, NN20 in sub-region I and NN13 in sub-region VII are the best for modeling drought variables. Bivariate drought probability across mainland China is relatively high, and the highest drought probabilities are found mainly in the Northwestern and Southwestern China. Besides, the result also showed that different sub-regions might suffer varying drought risks. The drought risks as observed in Sub-region III, VI and VII, are significantly greater than other sub-regions. Higher probability of droughts of longer durations in the sub-regions also corresponds to shorter return periods with greater drought severity. These results may imply tremendous challenges for the water resources management in different sub-regions, particularly the Northwestern and Southwestern China.

  8. Revealing nonclassicality beyond Gaussian states via a single marginal distribution

    PubMed Central

    Park, Jiyong; Lu, Yao; Lee, Jaehak; Shen, Yangchao; Zhang, Kuan; Zhang, Shuaining; Zubairy, Muhammad Suhail; Kim, Kihwan; Nha, Hyunchul

    2017-01-01

    A standard method to obtain information on a quantum state is to measure marginal distributions along many different axes in phase space, which forms a basis of quantum-state tomography. We theoretically propose and experimentally demonstrate a general framework to manifest nonclassicality by observing a single marginal distribution only, which provides a unique insight into nonclassicality and a practical applicability to various quantum systems. Our approach maps the 1D marginal distribution into a factorized 2D distribution by multiplying the measured distribution or the vacuum-state distribution along an orthogonal axis. The resulting fictitious Wigner function becomes unphysical only for a nonclassical state; thus the negativity of the corresponding density operator provides evidence of nonclassicality. Furthermore, the negativity measured this way yields a lower bound for entanglement potential—a measure of entanglement generated using a nonclassical state with a beam-splitter setting that is a prototypical model to produce continuous-variable (CV) entangled states. Our approach detects both Gaussian and non-Gaussian nonclassical states in a reliable and efficient manner. Remarkably, it works regardless of measurement axis for all non-Gaussian states in finite-dimensional Fock space of any size, also extending to infinite-dimensional states of experimental relevance for CV quantum informatics. We experimentally illustrate the power of our criterion for motional states of a trapped ion, confirming their nonclassicality in a measurement-axis–independent manner. We also address an extension of our approach combined with phase-shift operations, which leads to a stronger test of nonclassicality, that is, detection of genuine non-Gaussianity under a CV measurement. PMID:28077456

  9. Revealing nonclassicality beyond Gaussian states via a single marginal distribution.

    PubMed

    Park, Jiyong; Lu, Yao; Lee, Jaehak; Shen, Yangchao; Zhang, Kuan; Zhang, Shuaining; Zubairy, Muhammad Suhail; Kim, Kihwan; Nha, Hyunchul

    2017-01-31

    A standard method to obtain information on a quantum state is to measure marginal distributions along many different axes in phase space, which forms a basis of quantum-state tomography. We theoretically propose and experimentally demonstrate a general framework to manifest nonclassicality by observing a single marginal distribution only, which provides a unique insight into nonclassicality and a practical applicability to various quantum systems. Our approach maps the 1D marginal distribution into a factorized 2D distribution by multiplying the measured distribution or the vacuum-state distribution along an orthogonal axis. The resulting fictitious Wigner function becomes unphysical only for a nonclassical state; thus the negativity of the corresponding density operator provides evidence of nonclassicality. Furthermore, the negativity measured this way yields a lower bound for entanglement potential-a measure of entanglement generated using a nonclassical state with a beam-splitter setting that is a prototypical model to produce continuous-variable (CV) entangled states. Our approach detects both Gaussian and non-Gaussian nonclassical states in a reliable and efficient manner. Remarkably, it works regardless of measurement axis for all non-Gaussian states in finite-dimensional Fock space of any size, also extending to infinite-dimensional states of experimental relevance for CV quantum informatics. We experimentally illustrate the power of our criterion for motional states of a trapped ion, confirming their nonclassicality in a measurement-axis-independent manner. We also address an extension of our approach combined with phase-shift operations, which leads to a stronger test of nonclassicality, that is, detection of genuine non-Gaussianity under a CV measurement.

  10. On the distribution of species occurrence

    USGS Publications Warehouse

    Buzas, Martin A.; Koch, Carl F.; Culver, Stephen J.; Sohl, Norman F.

    1982-01-01

    The distribution of species abundance (number of individuals per species) is well documented. The distribution of species occurrence (number of localities per species), however, has received little attention. This study investigates the distribution of species occurrence for five large data sets. For modern benthic foraminifera, species occurrence is examined from the Atlantic continental margin of North America, where 875 species were recorded 10,017 times at 542 localities, the Gulf of Mexico, where 848 species were recorded 18,007 times at 426 localities, and the Caribbean, where 1,149 species were recorded 6,684 times at 268 localities. For Late Cretaceous molluscs, species occurrence is examined from the Gulf Coast where 716 species were recorded 6,236 times at 166 localities and a subset of this data consisting of 643 species recorded 3,851 times at 86 localities.Logseries and lognormal distributions were fitted to these data sets. In most instances the logseries best predicts the distribution of species occurrence. The lognormal, however, also fits the data fairly well, and, in one instance, better. The use of these distributions allows the prediction of the number of species occurring once, twice, ..., n times.Species abundance data are also available for the molluscan data sets. They indicate that the most abundant species (greatest number of individuals) usually occur most frequently. In all data sets approximately half the species occur four or less times. The probability of noting the presence of rarely occurring species is small, and, consequently, such species must be used with extreme caution in studies requiring knowledge of the distribution of species in space and time.

  11. Maximum entropy approach to H -theory: Statistical mechanics of hierarchical systems

    NASA Astrophysics Data System (ADS)

    Vasconcelos, Giovani L.; Salazar, Domingos S. P.; Macêdo, A. M. S.

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem—representing the region where the measurements are made—in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017), 10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.

  12. Maximum entropy approach to H-theory: Statistical mechanics of hierarchical systems.

    PubMed

    Vasconcelos, Giovani L; Salazar, Domingos S P; Macêdo, A M S

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem-representing the region where the measurements are made-in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017)10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.

  13. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    NASA Technical Reports Server (NTRS)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  14. Distributions of Pu, Am and Cs in margin sediments from the western Mediterranean (Spanish coast).

    PubMed

    Gascó, C; Antón, M P; Pozuelo, M; Meral, J; González, A M; Papucci, C; Delfanti, R

    2002-01-01

    Continental margins are important areas to be considered when studying the distributions and depositions of pollutants, both conventional and radioactive. Coastal sediments accumulate most of those contaminants which can be introduced following atmospheric and/or fluvial pathways. Moreover, their residence times within the water column are usually shortened due to their affinity to associate with the downward falling particulate matter, more abundant at shallower depths. In this paper the distribution profiles and inventories of plutonium, americium and cesium are detailed, providing useful information about recent sedimentation phenomena such as sediment mixing, slumping processes and bioturbation. Unsupported 210Pb data are used as reliable indicators of enhanced/reduced deposition events. Also, the calculated inventories have enabled the estimation of the radiological contribution of the Spanish Mediterranean margin to the total radioactivity deposited onto the Mediterranean sea floor.

  15. Estimation of value at risk in currency exchange rate portfolio using asymmetric GJR-GARCH Copula

    NASA Astrophysics Data System (ADS)

    Nurrahmat, Mohamad Husein; Noviyanti, Lienda; Bachrudin, Achmad

    2017-03-01

    In this study, we discuss the problem in measuring the risk in a portfolio based on value at risk (VaR) using asymmetric GJR-GARCH Copula. The approach based on the consideration that the assumption of normality over time for the return can not be fulfilled, and there is non-linear correlation for dependent model structure among the variables that lead to the estimated VaR be inaccurate. Moreover, the leverage effect also causes the asymmetric effect of dynamic variance and shows the weakness of the GARCH models due to its symmetrical effect on conditional variance. Asymmetric GJR-GARCH models are used to filter the margins while the Copulas are used to link them together into a multivariate distribution. Then, we use copulas to construct flexible multivariate distributions with different marginal and dependence structure, which is led to portfolio joint distribution does not depend on the assumptions of normality and linear correlation. VaR obtained by the analysis with confidence level 95% is 0.005586. This VaR derived from the best Copula model, t-student Copula with marginal distribution of t distribution.

  16. A general method for the definition of margin recipes depending on the treatment technique applied in helical tomotherapy prostate plans.

    PubMed

    Sevillano, David; Mínguez, Cristina; Sánchez, Alicia; Sánchez-Reyes, Alberto

    2016-01-01

    To obtain specific margin recipes that take into account the dosimetric characteristics of the treatment plans used in a single institution. We obtained dose-population histograms (DPHs) of 20 helical tomotherapy treatment plans for prostate cancer by simulating the effects of different systematic errors (Σ) and random errors (σ) on these plans. We obtained dosimetric margins and margin reductions due to random errors (random margins) by fitting the theoretical results of coverages for Gaussian distributions with coverages of the planned D99% obtained from the DPHs. The dosimetric margins obtained for helical tomotherapy prostate treatments were 3.3 mm, 3 mm, and 1 mm in the lateral (Lat), anterior-posterior (AP), and superior-inferior (SI) directions. Random margins showed parabolic dependencies, yielding expressions of 0.16σ(2), 0.13σ(2), and 0.15σ(2) for the Lat, AP, and SI directions, respectively. When focusing on values up to σ = 5 mm, random margins could be fitted considering Gaussian penumbras with standard deviations (σp) equal to 4.5 mm Lat, 6 mm AP, and 5.5 mm SI. Despite complex dose distributions in helical tomotherapy treatment plans, we were able to simplify the behaviour of our plans against treatment errors to single values of dosimetric and random margins for each direction. These margins allowed us to develop specific margin recipes for the respective treatment technique. The method is general and could be used for any treatment technique provided that DPHs can be obtained. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. A note about Gaussian statistics on a sphere

    NASA Astrophysics Data System (ADS)

    Chave, Alan D.

    2015-11-01

    The statistics of directional data on a sphere can be modelled either using the Fisher distribution that is conditioned on the magnitude being unity, in which case the sample space is confined to the unit sphere, or using the latitude-longitude marginal distribution derived from a trivariate Gaussian model that places no constraint on the magnitude. These two distributions are derived from first principles and compared. The Fisher distribution more closely approximates the uniform distribution on a sphere for a given small value of the concentration parameter, while the latitude-longitude marginal distribution is always slightly larger than the Fisher distribution at small off-axis angles for large values of the concentration parameter. Asymptotic analysis shows that the two distributions only become equivalent in the limit of large concentration parameter and very small off-axis angle.

  18. Misfit and fracture load of implant-supported monolithic crowns in zirconia-reinforced lithium silicate

    PubMed Central

    GOMES, Rafael Soares; de SOUZA, Caroline Mathias Carvalho; BERGAMO, Edmara Tatiely Pedroso; BORDIN, Dimorvan; DEL BEL CURY, Altair Antoninha

    2017-01-01

    Zirconia-reinforced lithium silicate (ZLS) is a ceramic that promises to have better mechanical properties than other materials with the same indications as well as improved adaptation and fracture strength. Objective: In this study, marginal and internal misfit and fracture load with and without thermal-mechanical aging (TMA) of monolithic ZLS and lithium disilicate (LDS) crowns were evaluated. Material and methods: Crowns were milled using a computer-aided design/computer-aided manufacturing system. Marginal gaps (MGs), absolute marginal discrepancy (AMD), axial gaps, and occlusal gaps were measured by X-ray microtomography (n=8). For fracture load testing, crowns were cemented on a universal abutment and divided into four groups: ZLS without TMA, ZLS with TMA, LDS without TMA, and LDS with TMA (n=10). TMA groups were subjected to 10,000 thermal cycles (5-55°C) and 1,000,000 mechanical cycles (200 N, 3.8 Hz). All groups were subjected to compressive strength testing in a universal testing machine at a crosshead speed of 1 mm/min until failure. Student’s t-test was used to examine misfit, two-way analysis of variance was used to analyze fracture load, and Pearson’s correlation coefficients for misfit and fracture load were calculated (α=0.05). The materials were analyzed according to the Weibull distribution, with 95% confidence intervals. Results: Average MG (p<0.001) and AMD (p=0.003) values were greater in ZLS than in LDS crowns. TMA did not affect the fracture load of either material. However, fracture loads of ZLS crowns were lower than those of LDS crowns (p<0.001). Fracture load was moderately correlated with MG (r=-0.553) and AMD (r=-0.497). ZLS with TMA was least reliable, according to Weibull probability. Conclusion: Within the limitations of this study, ZLS crowns had lower fracture load values and greater marginal misfit than did LDS crowns, although these values were within acceptable limits. PMID:28678947

  19. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast at the Regional Warning Center (RWC) Japan. The RWC Japan has long issued four-category deterministic solar flare forecasts. In this verification study, we used solar flare forecast data accumulated over 16 years (2000 to 2015), compiled together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated conventional scalar verification measures with 95% confidence intervals, as well as a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence and recurrence methods. As solar activity varied over the 16 years, we also applied the verification analyses to four subsets of forecast-observation pairs with different solar activity levels. We cannot conclude definitively that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose using a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures is frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose the marginal distributions of forecast and observation for bias; proportion correct for accuracy; correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
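
    The dichotomous measures proposed above are all simple functions of a 2x2 contingency table (hits a, false alarms b, misses c, correct negatives d). A sketch with made-up counts; the SEDI formula follows Ferro and Stephenson (2011):

        import math

        def dichotomous_scores(a, b, c, d):
            n = a + b + c + d
            H = a / (a + c)                    # probability of detection (hit rate)
            F = b / (b + d)                    # probability of false detection
            return {
                "frequency bias":     (a + b) / (a + c),
                "proportion correct": (a + d) / n,
                "critical success":   a / (a + b + c),
                "POD":                H,
                "false alarm ratio":  b / (a + b),
                "Peirce skill score": H - F,
                # symmetric extremal dependence index (Ferro & Stephenson, 2011)
                "SEDI": (math.log(F) - math.log(H) - math.log1p(-F) + math.log1p(-H))
                      / (math.log(F) + math.log(H) + math.log1p(-F) + math.log1p(-H)),
            }

        for name, val in dichotomous_scores(a=42, b=18, c=11, d=329).items():
            print(f"{name:20s} {val:6.3f}")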

  20. The origin of Karaj dam basement sill marginal reversal by Soret fractionation

    NASA Astrophysics Data System (ADS)

    Maghdour-Mashhour, Reza; Esmaeily, Dariush

    2010-05-01

    The Karaj dam basement sill (KDBS), located northwest of Tehran in northern Iran, is one of several E-W-trending plutons in the Alborz Mountains. The KDBS consists of a layered series between upper and lower chilled margins. The rocks of the chilled margins are gabbroic in composition and porphyritic, with euhedral to subhedral plagioclase and clinopyroxene megacrysts up to 5 mm long. The rocks become coarse-grained toward the center of the sill and show a gradual transition from porphyritic to equigranular texture. Field and petrographic observations reveal a reverse crystallization trend in the marginal units, from the eutectic point toward the main magma composition; i.e., the olivine-bearing gabbro (the porphyritic chilled margin), which has a eutectic composition, crystallized before the marginal gabbros, which have a cotectic or near-cotectic composition: plagioclase laths in the gabbroic unit are embedded in large clinopyroxene crystals, a texture attributed to cotectic crystallization of plagioclase and clinopyroxene. Four mechanisms are considered in order to identify the one responsible for the formation of the marginal reversal. 1) Crystal settling is gravity dependent: settled phenocrysts would form a layer at the bottom of the sill with a sharp upper boundary, which is not observed in the KDBS. Moreover, the reverse fractionation of the inwardly dipping sequence of the sill occurs in layers with primary dips of up to 55°, so the development of marginal reversals along steeply inclined chamber margins by this mechanism is implausible. 2) Multiple injections of successive magma pulses fail to explain the origin of the marginal reversal, since the transition along its entire length is gradual and there is no compositional break or chilled contact between the two marginal units (olivine gabbro and marginal gabbro). 3) Supercooling cannot account for the marginal series either, because a high degree of supercooling would produce fine-grained chilled margins throughout the marginal series, which is not what is observed in the KDBS. 4) Soret fractionation is the most probable mechanism, and it has recently been reconsidered by researchers (e.g., Latypov, 2003). As shown by Worster et al. (1990), vigorous convection can reasonably be assumed to be a major process responsible for generating smooth trends in the distribution of cumulate minerals in the accumulation zone, and the gradual transitions between different layers that characterize the KDBS confirm the occurrence of vigorous convection in the main magma reservoir. Vigorous magma convection in the KDBS chamber leads to the formation of a thin thermal boundary layer along the chamber margins, preventing the temperature contrast from fading and ensuring the continuous exchange, by Soret diffusion, of HMPCs (e.g., MgO) across the liquid boundary layer from the margins toward the main magma chamber, with LMPCs moving in the opposite direction. A further decrease in the magnitude of the thermal gradient within the boundary layer would then have caused a gradual change in the composition of the boundary-layer liquids, which became progressively depleted in LMPCs, shifting away from the eutectic point Ol + Pl + Cpx + L along the cotectic line Cpx + Pl + L. In this way, crystallization of the advancing cumulus front in the liquid boundary layer produced the compositional sequence in reverse order, from the eutectic point to the initial parental magma, resulting in olivine-bearing gabbro at the chilled margins and gabbros toward the center of the chamber.

  1. Three-Phase AC Optimal Power Flow Based Distribution Locational Marginal Price: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Rui; Zhang, Yingchen

    2017-05-17

    Designing market mechanisms for electricity distribution systems has been a hot topic due to the increased presence of smart loads and distributed energy resources (DERs) in distribution systems. The distribution locational marginal pricing (DLMP) methodology is one of the real-time pricing methods to enable such market mechanisms and provide economic incentives to active market participants. Determining the DLMP is challenging due to high power losses, voltage volatility, and phase imbalance in distribution systems. Existing DC optimal power flow (OPF) approaches are unable to model power losses and reactive power, while single-phase AC OPF methods cannot capture the phase imbalance. To address these challenges, in this paper a three-phase AC OPF based approach is developed to define and calculate DLMPs accurately. The DLMP is modeled as the marginal cost to serve an incremental unit of demand at a specific phase at a certain bus, and is calculated using the Lagrange multipliers in the three-phase AC OPF formulation. Extensive case studies have been conducted to understand the impact of system losses and phase imbalance on DLMPs, as well as the potential benefits of flexible resources.
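
    The definition of the DLMP as the marginal cost of serving one more unit of demand can be illustrated with a deliberately oversimplified single-bus economic dispatch, solved as a linear program instead of the paper's three-phase AC OPF; all costs and limits below are made up:

        import numpy as np
        from scipy.optimize import linprog

        cost = np.array([20.0, 50.0])            # generator offer prices, $/MWh (hypothetical)
        gen_max = [(0.0, 100.0), (0.0, 100.0)]   # capacity bounds, MW

        def dispatch(demand):
            # min cost @ g  subject to  g1 + g2 = demand, capacity bounds
            return linprog(c=cost, A_eq=np.ones((1, 2)), b_eq=[demand],
                           bounds=gen_max, method="highs")

        base = dispatch(130.0)
        plus = dispatch(131.0)                   # one incremental MW of demand
        print("dispatch:", base.x)                              # [100, 30] MW
        print("marginal price:", plus.fun - base.fun, "$/MWh")  # 50.0 here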

  2. A statistical approach to nuclear fuel design and performance

    NASA Astrophysics Data System (ADS)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences for the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences, with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss-of-coolant accident. Using a Monte Carlo technique for input generation, 10⁵ independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimension-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance, with an average sensitivity index of 48.93% on key output quantities. Pellet grain size and dish depth are also significant contributors, at 31.53% and 13.46%, respectively. A traditional limit-of-operating-envelope case is also evaluated. This case produces output values that exceed the maximum values observed during the 10⁵ Monte Carlo trials for all output quantities of interest. In many cases the difference between the predictions of the two methods is very prominent, demonstrating the highly conservative nature of the deterministic approach. A reliability analysis of CANDU fuel manufacturing parametric data, specifically pertaining to the quantification of fuel performance margins, has not been conducted previously. Key Words: CANDU, nuclear fuel, Cameco, fuel manufacturing, fuel modelling, fuel performance, fuel reliability, ELESTRES, ELOCA, dimensional reduction methods, global sensitivity analysis, deterministic safety analysis, probabilistic safety analysis.
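
    A schematic of the Monte Carlo input-generation idea described above, with a made-up surrogate standing in for the ELESTRES/ELOCA codes, and hypothetical fitted input distributions and envelope limits:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10**5

        # Hypothetical fitted manufacturing-parameter distributions
        density    = rng.normal(10.6, 0.05, N)    # pellet density, g/cm^3
        grain_size = rng.normal(12.0, 1.5,  N)    # pellet grain size, um
        dish_depth = rng.normal(0.30, 0.02, N)    # dish depth, mm
        loe = (10.75, 16.5, 0.36)                 # all inputs at their envelope limits

        def performance_model(rho, g, d):
            # made-up surrogate for a fuel-performance output quantity (arbitrary units)
            return 0.8 * rho + 0.05 * g + 2.0 * d

        out = performance_model(density, grain_size, dish_depth)
        print("MC 99.9th percentile:", np.percentile(out, 99.9))
        # the deterministic limit-of-operating-envelope point typically exceeds even the MC max
        print("LOE (deterministic): ", performance_model(*loe))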

  3. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

    To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify the breakdown probability distributions of vacuum gaps. A double-break vacuum circuit breaker was also investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution with a location parameter, which gives the voltage at which the breakdown probability is zero. The location parameter obtained from the Weibull plot depends on the electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14 and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution with a location parameter. The shape parameter after no-load switching was 6∼8.5 and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage is increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than that of a single break. Although the potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupters' voltage sharing is taken into account.
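
    A minimal sketch of fitting the three-parameter Weibull model used above, in which the location parameter is the voltage below which the breakdown probability is zero; the synthetic breakdown voltages are made up:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        # synthetic breakdown voltages (kV) from a known three-parameter Weibull law
        v_bd = stats.weibull_min.rvs(c=12.0, loc=80.0, scale=40.0, size=200, random_state=rng)

        shape, loc, scale = stats.weibull_min.fit(v_bd)   # ML fit of (shape, location, scale)
        print(f"shape={shape:.1f}  location={loc:.1f} kV  scale={scale:.1f} kV")
        # breakdown probability at a given voltage; zero for voltages below the location
        print("P(breakdown at 85 kV):", stats.weibull_min.cdf(85.0, shape, loc, scale))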

  4. Genealogical Working Distributions for Bayesian Model Testing with Phylogenetic Uncertainty

    PubMed Central

    Baele, Guy; Lemey, Philippe; Suchard, Marc A.

    2016-01-01

    Marginal likelihood estimates to compare models using Bayes factors frequently accompany Bayesian phylogenetic inference. Approaches to estimate marginal likelihoods have garnered increased attention over the past decade. In particular, the introduction of path sampling (PS) and stepping-stone sampling (SS) into Bayesian phylogenetics has tremendously improved the accuracy of model selection. These sampling techniques are now used to evaluate complex evolutionary and population genetic models on empirical data sets, but considerable computational demands hamper their widespread adoption. Further, when very diffuse, but proper priors are specified for model parameters, numerical issues complicate the exploration of the priors, a necessary step in marginal likelihood estimation using PS or SS. To avoid such instabilities, generalized SS (GSS) has recently been proposed, introducing the concept of “working distributions” to facilitate—or shorten—the integration process that underlies marginal likelihood estimation. However, the need to fix the tree topology currently limits GSS in a coalescent-based framework. Here, we extend GSS by relaxing the fixed underlying tree topology assumption. To this purpose, we introduce a “working” distribution on the space of genealogies, which enables estimating marginal likelihoods while accommodating phylogenetic uncertainty. We propose two different “working” distributions that help GSS to outperform PS and SS in terms of accuracy when comparing demographic and evolutionary models applied to synthetic data and real-world examples. Further, we show that the use of very diffuse priors can lead to a considerable overestimation in marginal likelihood when using PS and SS, while still retrieving the correct marginal likelihood using both GSS approaches. The methods used in this article are available in BEAST, a powerful user-friendly software package to perform Bayesian evolutionary analyses. PMID:26526428

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGinnis, J.P.; Karner, G.D.; Driscoll, N.W.

    The tectonic and stratigraphic development of the Congo continental margin reflects the timing, magnitude, and distribution of the lithospheric extension responsible for its formation. Details of the lithospheric extension process are recorded in the stratigraphic successions preserved along and across the margin. By using the stratal relationships (e.g., onlap, downlap, and truncation) and lithofacies determined from seismic reflection and exploratory well data as input into our basin-modeling strategy, we have developed an integrated approach to determine the relationship between the timing, magnitude, and distribution of lithospheric extension across the margin. Two hinge zones, an eastern hinge and an Atlantic hinge, formed along the Congo margin in response to discrete extensional events occurring from the Berriasian to the Aptian. The eastern hinge zone demarcates the eastern limit of the broadly distributed Berriasian extension. This extension resulted in the formation of deep anoxic, lacustrine systems. In contrast, the Atlantic hinge, located approximately 90 km west of the eastern hinge, marks the eastern limit of a second phase of extension, which began in the Hauterivian. Consequent footwall uplift and rotation exposed the earlier synrift and prerift stratigraphy to at least wave base, causing varying amounts of erosional truncation across the Atlantic hinge zone along much of the Gabon, Congo, and Angola margins. The absence of the Melania Formation across the Congo margin implies that uplift of the Atlantic hinge was relatively minor compared to that across the Angola and Gabon margins. In addition, material eroded from the adjacent and topographically higher hinge zones may in part account for the thick wedge of sediment deposited seaward of the Congo Atlantic hinge. A third phase of extension reactivated both the eastern and Atlantic hinge zones and was responsible for creating the accommodation space for Marnes Noires source rock deposition.

  6. Application of Archimedean copulas to the analysis of drought decadal variation in China

    NASA Astrophysics Data System (ADS)

    Zuo, Dongdong; Feng, Guolin; Zhang, Zengping; Hou, Wei

    2017-12-01

    Based on daily precipitation data collected from 1171 stations in China during 1961-2015, the monthly standardized precipitation index was derived and used to extract two major drought characteristics: drought duration and severity. A bivariate joint model was then established based on the marginal distributions of the two variables and Archimedean copula functions, and the joint probability and return period were calculated to analyze the drought characteristics and their decadal variation. According to the fit analysis, the Gumbel-Hougaard copula provided the best fit to the observed data. Based on four drought duration classifications and four severity classifications, the drought events were divided into 16 drought types according to the different combinations of duration and severity, and the probability and return period were analyzed for each type. The results showed that the probability of occurrence of six common drought types (0 < D ≤ 1 and 0.5 < S ≤ 1, 1 < D ≤ 3 and 0.5 < S ≤ 1, 1 < D ≤ 3 and 1 < S ≤ 1.5, 1 < D ≤ 3 and 1.5 < S ≤ 2, 1 < D ≤ 3 and 2 < S, and 3 < D ≤ 6 and 2 < S) accounted for 76% of the total probability of all types. Moreover, owing to their greater variation, two drought types were particularly notable, namely those with D ≥ 6 and S ≥ 2. Analysis of the joint probability in different decades indicated that the location of the drought center had a distinctive stage feature, cycling from north to northeast to southwest during 1961-2015; southwest, north, and northeast China had a higher drought risk. In addition, the drought situation in southwest China deserves attention, because the joint probability values, return periods, and trends in drought duration and severity all indicate a considerable risk in recent years.
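
    The joint probability and "AND" return period used above follow directly from the fitted Gumbel-Hougaard copula over the marginal CDFs of duration D and severity S. In the sketch below, the copula parameter, the marginal non-exceedance probabilities, and the mean interarrival time are all hypothetical:

        import math

        def gumbel_copula(u, v, theta):
            """C(u,v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
            return math.exp(-(((-math.log(u))**theta + (-math.log(v))**theta))**(1.0 / theta))

        theta = 2.5   # fitted dependence parameter (hypothetical)
        mu = 0.5      # mean interarrival time of drought events, years (hypothetical)
        u = 0.90      # F_D(d): marginal non-exceedance probability of duration d
        v = 0.85      # F_S(s): marginal non-exceedance probability of severity s

        # P(D > d and S > s) by inclusion-exclusion, and the "AND" joint return period
        p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
        print(f"P(D>d, S>s) = {p_and:.4f},  T_and = {mu / p_and:.1f} years")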

  7. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally obtained by averaging each individual's probability of adherence, based on the number of days the person meets the guidelines and the number of days assessed. Given the number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines used in the past is a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability, given the number of active and inactive days, distributed as Betabinomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β), where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution: it is a discrete distribution and it maximizes the richness of accelerometer data.
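
    A small sketch of the Betabinomial calculation described above, for one child: the prior Beta(α, β) parameters would be fitted by maximum likelihood in the study, so the values here are made up.

        from scipy import stats

        alpha, beta = 3.0, 2.0     # hypothetical fitted Beta(alpha, beta) prior
        active, inactive = 4, 2    # observed active/inactive days over 6 assessed days

        # conditional distribution of active days out of 7, given the observations
        post = stats.betabinom(n=7, a=alpha + active, b=beta + inactive)
        print("P(active on 7 of 7 days):", post.pmf(7))
        print([round(post.pmf(k), 3) for k in range(8)])   # full discrete distribution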

  8. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution, by avoiding post sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied on a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
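
    The slip-sampling step can be sketched with a truncated Karhunen-Loeve expansion: eigendecompose a covariance over the fault grid and combine the leading modes with standard normal coefficients. The grid, correlation length, and mean slip below are hypothetical, and the translation step to non-Gaussian marginals described in the abstract is omitted:

        import numpy as np

        rng = np.random.default_rng(7)
        x = np.linspace(0.0, 200.0, 60)               # along-strike positions, km (hypothetical)
        corr_len, sigma, mean_slip = 40.0, 1.5, 5.0   # km, m, m (hypothetical)

        # exponential covariance over the grid and its eigendecomposition
        C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
        lam, phi = np.linalg.eigh(C)
        lam, phi = lam[::-1], phi[:, ::-1]            # sort eigenpairs, largest first

        m = 12                                        # number of K-L modes retained
        xi = rng.standard_normal((m, 100))            # coefficients for 100 slip samples
        slip = mean_slip + phi[:, :m] @ (np.sqrt(lam[:m])[:, None] * xi)
        print("sample shape:", slip.shape)            # (60 grid points, 100 samples)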

  9. Modelling the economic impact of three lameness causing diseases using herd and cow level evidence.

    PubMed

    Ettema, Jehan; Østergaard, Søren; Kristensen, Anders Ringgaard

    2010-06-01

    Diseases of the cow's hooves, interdigital skin, and legs are highly prevalent and of large economic impact in modern dairy farming. To support farmers' decisions on preventing and treating lameness and its underlying causes, decision support models can be used to predict the economic profitability of such actions. An existing approach of modelling lameness as a single health disorder in a dynamic, stochastic, and mechanistic simulation model has been improved in two ways. First, three underlying diseases causing lameness were modelled: digital dermatitis, interdigital hyperplasia, and claw horn diseases. Second, the existing simulation model was set up to use hyper-distributions describing the disease risk of the three lameness-causing diseases. By combining information on herd-level risk factors with the prevalence of lameness or of the underlying diseases among cows, marginal posterior probability distributions for disease prevalence in the specific herd are created in a Bayesian network. Random draws from these distributions are used by the simulation model to describe disease risk. In this way, field data on prevalence are used systematically and the uncertainty around herd-specific risk is represented. Besides the fact that the estimated profitability of halving disease risk depended on the hyper-distributions used, the estimates differed for herds with different levels of disease risk and reproductive efficiency. (c) 2010 Elsevier B.V. All rights reserved.

  10. Marginal analysis in assessing factors contributing time to physician in the Emergency Department using operations data.

    PubMed

    Pathan, Sameer A; Bhutta, Zain A; Moinudheen, Jibin; Jenkins, Dominic; Silva, Ashwin D; Sharma, Yogdutt; Saleh, Warda A; Khudabakhsh, Zeenat; Irfan, Furqan B; Thomas, Stephen H

    2016-01-01

    Background: Standard Emergency Department (ED) operations goals include minimization of the time interval (tMD) between a patient's initial ED presentation and initial physician evaluation. This study assessed factors known (or suspected) to influence tMD, with a two-step goal. The first step was generation of a multivariate model identifying parameters associated with prolongation of tMD at a single study center. The second step was the use of this study center-specific multivariate tMD model as a basis for predictive marginal probability analysis; the marginal model allowed prediction of the degree of ED operations benefit that would be effected by specific ED operations improvements. Methods: The study was conducted using one month (May 2015) of data obtained from an ED administrative database (EDAD) in an urban academic tertiary ED with an annual census of approximately 500,000; during the study month, the ED saw 39,593 cases. The EDAD data were used to generate a multivariate linear regression model assessing the effects of various demographic and operational covariates on the dependent variable tMD. Predictive marginal probability analysis was used to calculate the relative contributions of key covariates and to demonstrate the likely tMD impact of modifying those covariates with operational improvements. Analyses were conducted with Stata 14MP, with significance defined at p < 0.05 and confidence intervals (CIs) reported at the 95% level. Results: In an acceptable linear regression model that accounted for just over half of the overall variance in tMD (adjusted r² = 0.51), important contributors to tMD included shift census (p = 0.008), shift time of day (p = 0.002), and the number of physicians on duty (p = 0.004). These strong associations remained even after adjusting for each other and for other covariates. Marginal predictive probability analysis was used to predict the overall tMD impact (improvement from 50 to 43 minutes, p < 0.001) of consistent staffing with 22 physicians. Conclusions: The analysis identified expected variables contributing to tMD, with regression demonstrating the significance and effect magnitude of alterations in covariates including patient census, shift time of day, and number of physicians. Marginal analysis provided an operationally useful demonstration of the need to adjust physician coverage numbers, prompting changes at the study ED. The methods used in this analysis may prove useful in other EDs wishing to analyze operations information with the goal of predicting which interventions may have the most benefit.
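
    The predictive-margin idea can be sketched with simulated data: fit a linear model of tMD on operational covariates, then average the predictions with physician coverage held at a counterfactual level. This is a schematic stand-in for the Stata analysis, with made-up coefficients:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 2000
        census = rng.poisson(60, n)               # shift census (hypothetical)
        n_md = rng.integers(16, 25, n)            # physicians on shift (hypothetical)
        tmd = 20 + 0.6 * census - 1.2 * n_md + rng.normal(0, 8, n)   # minutes

        X = sm.add_constant(np.column_stack([census, n_md]))
        fit = sm.OLS(tmd, X).fit()

        # predictive margin: average predicted tMD with coverage fixed at 22 physicians
        X22 = X.copy()
        X22[:, 2] = 22
        print("observed mean tMD:", tmd.mean().round(1))
        print("margin at 22 MDs: ", fit.predict(X22).mean().round(1))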

  11. Quantification of the Water-Energy Nexus in Beijing City Based on Copula Analysis

    NASA Astrophysics Data System (ADS)

    Cai, J.; Cai, Y.

    2017-12-01

    Water resources and energy resources are intimately interwoven, in what is called the "water-energy nexus", which poses challenges for the sustainable management of both. In this research, the copula analysis method, a favorable tool for exploring the dependence among random variables, is applied to the water-energy nexus to clarify the internal relationship between water resources and energy resources. Beijing City, the capital of China, is chosen as a case study. The marginal distribution functions of water resources and energy resources are analyzed first. Then a bivariate copula function is employed to construct the joint distribution function of the water-energy nexus and thereby quantify the inherent relationship between water resources and energy resources. The results show that the lognormal distribution is the more appropriate choice for the marginal distribution function of water resources, while the Weibull distribution better describes the marginal distribution function of energy resources. Furthermore, the bivariate normal copula function is the most suitable for constructing the joint distribution function of the water-energy nexus in Beijing City. These findings help to identify and quantify the water-energy nexus and can inform policy recommendations for the sustainable management of water and energy resources to promote regionally coordinated development.

  12. Marginal and joint distributions of S100, HMB-45, and Melan-A across a large series of cutaneous melanomas.

    PubMed

    Viray, Hollis; Bradley, William R; Schalper, Kurt A; Rimm, David L; Gould Rothberg, Bonnie E

    2013-08-01

    The distribution of the standard melanoma antibodies S100, HMB-45, and Melan-A has been extensively studied. Yet, the overlap in their expression is less well characterized. To determine the joint distributions of the classic melanoma markers and to determine if classification according to joint antigen expression has prognostic relevance. S100, HMB-45, and Melan-A were assayed by immunofluorescence-based immunohistochemistry on a large tissue microarray of 212 cutaneous melanoma primary tumors and 341 metastases. Positive expression for each antigen required display of immunoreactivity for at least 25% of melanoma cells. Marginal and joint distributions were determined across all markers. Bivariate associations with established clinicopathologic covariates and melanoma-specific survival analyses were conducted. Of 322 assayable melanomas, 295 (91.6%), 203 (63.0%), and 236 (73.3%) stained with S100, HMB-45, and Melan-A, respectively. Twenty-seven melanomas, representing a diverse set of histopathologic profiles, were S100 negative. Coexpression of all 3 antibodies was observed in 160 melanomas (49.7%). Intensity of endogenous melanin pigment did not confound immunolabeling. Among primary tumors, associations with clinicopathologic parameters revealed a significant relationship only between HMB-45 and microsatellitosis (P = .02). No significant differences among clinicopathologic criteria were observed across the HMB-45/Melan-A joint distribution categories. Neither marginal HMB-45 (P = .56) nor Melan-A (P = .81), or their joint distributions (P = .88), was associated with melanoma-specific survival. Comprehensive characterization of the marginal and joint distributions for S100, HMB-45, and Melan-A across a large series of cutaneous melanomas revealed diversity of expression across this group of antigens. However, these immunohistochemically defined subclasses of melanomas do not significantly differ according to clinicopathologic correlates or outcome.

  13. Natural Gas Venting on the Northern Cascadia Margin

    NASA Astrophysics Data System (ADS)

    Scherwath, M.; Riedel, M.; Roemer, M.; Paull, C. K.; Spence, G.; Veloso, M.

    2016-12-01

    Over the past decades, hundreds of natural gas vents have been observed along the Northern Cascadia Margin in the Northeast Pacific, and we present a summary of these observations from offshore Vancouver Island, BC, Canada. We have gathered observed locations and analyzed original data from the published literature as well as research cruises and fishing sonar from various archives. By far the highest concentration of gas vent locations is both shallow (100-200 m) and clustered towards the mouth of the Juan de Fuca Strait; however, these observations are naturally biased toward the distribution of the observation footprints. Normalized observations confirm the shallow high concentrations of gas vents but also establish some deeper sections of focused venting activity. We will speculate about the reasons behind the distribution, focus on specific examples, extrapolate rough margin-wide flux rate ranges, and comment on shortcomings and future directions for margin-wide gas vent studies.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smallwood, D.O.

    In a previous paper, Smallwood and Paez (1991) showed how to generate realizations of partially coherent stationary normal time histories with a specified cross-spectral density matrix. This procedure is generalized to the case of multiple inputs with a specified cross-spectral density function and a specified marginal probability density function (pdf) for each of the inputs. The specified pdfs are not required to be Gaussian. A zero-memory nonlinear (ZMNL) function is developed for each input to transform a Gaussian or normal time history into a time history with a specified non-Gaussian distribution. The transformation functions have the property that a transformed time history will have nearly the same auto spectral density as the original time history. A vector of Gaussian time histories is then generated with the specified cross-spectral density matrix. These waveforms are then transformed into the required time history realizations using the ZMNL function.
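
    The ZMNL marginal transformation itself is essentially one line: map the Gaussian series through Φ, then through the inverse CDF of the target marginal. A sketch with an arbitrary gamma target distribution:

        import numpy as np
        from scipy import stats, signal

        rng = np.random.default_rng(0)
        # colored Gaussian time history from a simple shaping filter
        b, a = signal.butter(4, 0.1)
        x = signal.lfilter(b, a, rng.standard_normal(20000))
        x /= x.std()                               # approximately standard normal marginal

        u = stats.norm.cdf(x)                      # probability integral transform
        y = stats.gamma.ppf(u, a=2.0, scale=1.0)   # ZMNL: impose a gamma marginal

        # the marginal changes (gamma is skewed) while the spectral shape is largely kept
        print("skew before/after:", round(stats.skew(x), 2), round(stats.skew(y), 2))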

  15. Trend Detection and Bivariate Frequency Analysis for Nonstationary Rainfall Data

    NASA Astrophysics Data System (ADS)

    Joo, K.; Kim, H.; Shin, J. Y.; Heo, J. H.

    2017-12-01

    Multivariate frequency analysis has been developed for hydro-meteorological data such as rainfall, flood, and drought. In particular, the copula has been a useful tool for multivariate probability modelling because it places no restriction on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events using the inter-event time definition (IETD), with each event characterized by a rainfall depth and a rainfall duration. In addition, nonstationarity in rainfall events has been studied recently in the context of climate change, and trend detection is important for determining whether the data are nonstationary. In this study, trend detection and nonstationary bivariate frequency analysis were performed on the depth and duration of rainfall events. Hourly data recorded over more than 30 years at 62 Korea Meteorological Administration (KMA) stations were used, and the suitability of the nonstationary copula for rainfall events was examined by goodness-of-fit tests.

  16. Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE

    NASA Astrophysics Data System (ADS)

    Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas

    2016-04-01

    Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of rare occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but a common element is the request for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled with a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.

  17. Numerical detection of the Gardner transition in a mean-field glass former.

    PubMed

    Charbonneau, Patrick; Jin, Yuliang; Parisi, Giorgio; Rainone, Corrado; Seoane, Beatriz; Zamponi, Francesco

    2015-07-01

    Recent theoretical advances predict the existence, deep into the glass phase, of a novel phase transition, the so-called Gardner transition. This transition is associated with the emergence of a complex free energy landscape composed of many marginally stable sub-basins within a glass metabasin. In this study, we explore several methods to detect numerically the Gardner transition in a simple structural glass former, the infinite-range Mari-Kurchan model. The transition point is robustly located from three independent approaches: (i) the divergence of the characteristic relaxation time, (ii) the divergence of the caging susceptibility, and (iii) the abnormal tail in the probability distribution function of cage order parameters. We show that the numerical results are fully consistent with the theoretical expectation. The methods we propose may also be generalized to more realistic numerical models as well as to experimental systems.

  18. Does climate have heavy tails?

    NASA Astrophysics Data System (ADS)

    Bermejo, Miguel; Mudelsee, Manfred

    2013-04-01

    When we speak about a distribution with heavy tails, we mean that the probability of extreme values is relatively large. Several heavy-tail models are constructed from Poisson processes, which are the most tractable models. Among such processes, some of the most important are the Lévy processes, which are those with independent, stationary increments and stochastic continuity. If the random component of the climate process that generates the data exhibits a heavy-tailed distribution, and if that fact is ignored by assuming a finite-variance distribution, then there are serious consequences (in the form, e.g., of bias) for the analysis of extreme values. Yet it appears to be an open question to what extent and degree climate data exhibit heavy-tail phenomena. We present a study of statistical inference in the presence of heavy-tailed distributions. In particular, we explore (1) the estimation of the tail index of the marginal distribution using several estimation techniques (e.g., the Hill estimator and the Pickands estimator) and (2) the power of hypothesis tests. The performance of the different methods is compared on artificial time series by means of Monte Carlo experiments. We then systematically apply heavy-tail inference to observed climate data, focusing on time series of several proxy and directly observed climate variables from the instrumental period, the Holocene, and the Pleistocene. This work receives financial support from the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme).
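
    As a concrete example of tail-index estimation, here is a minimal Hill estimator, checked on data simulated from a Pareto law with known index:

        import numpy as np

        rng = np.random.default_rng(1)
        alpha_true = 2.0
        x = rng.pareto(alpha_true, 10_000) + 1.0   # Pareto(alpha) supported on [1, inf)

        def hill(data, k):
            """Hill estimate of the tail index from the k largest observations."""
            xs = np.sort(data)[-(k + 1):]          # k+1 largest values, ascending
            logs = np.log(xs)
            # reciprocal of the mean log-excess over the (k+1)-th largest value
            return 1.0 / np.mean(logs[1:] - logs[0])

        for k in (50, 200, 1000):                  # estimate depends on the cutoff k
            print(f"k={k:5d}  alpha_hat={hill(x, k):.2f}")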

  19. The Effects of Rapid Sedimentation upon Continental Breakup: Kinematic and Thermal Modeling of the Salton Trough, Southern California, Based upon Recent Seismic Images

    NASA Astrophysics Data System (ADS)

    Han, L.; Hole, J. A.; Lowell, R. P.; Stock, J. M.; Fuis, G. S.

    2016-12-01

    The Salton Seismic Imaging Project (SSIP) illuminated crustal and upper mantle structure of the Salton Trough, the northern-most rift segment of the Gulf of California plate boundary. The crust is 17-18 km thick and homogeneous for 100 km in the plate motion direction. New crust is being created by distributed rift magmatism, Colorado River sedimentation, and metamorphism of the sediment. A 5 km thick pre-existing crustal layer may still exist. The crust has not broken apart to enable initiation of seafloor spreading. A one-dimensional time-dependent kinematic and thermal model was developed to simulate these observations. We assume that all crustal layers are stretched uniformly during extension. Distributed mafic magmatism and sedimentation are added simultaneously to compensate for the crustal thinning. The ratio of magmatism to sedimentation is constrained by the seismic observations. Heat is transported by thermal conduction and by advection due to stretching of the crust. A constant temperature boundary at the Moho is used to represent partial melting in the upper mantle. Assuming a constant plate motion rate, the zone of active rifting extends linearly with time. The crustal thickness and internal structure also evolve with time. The model constraints are the observed seismic structure and heat flow. The model rapidly reaches quasi-steady state, and could continue for many millions of years. The observed seismic structure and heat flow are reproduced after 3 Myr. The yield strength profile calculated from lithology and model temperature indicates that ductile deformation in the middle and lower crust dominates the crustal rheology. Rapid sedimentation delays crustal breakup and the initiation of seafloor spreading by maintaining the thickness of the crust and keeping it predominantly ductile. This process probably occurs wherever a large river flows into an active rift driven by far-field extension. It may have built passive margins in many locations globally, such as the Gulf of Mexico. This type of passive margin consists of mostly new crust created by magmatism and metamorphism of sediment. Along such margins, metamorphosed sediment could be misinterpreted as stretched pre-existing continental crust.

  20. Influence of margin segmentation and anomalous volcanism upon the break-up of the Hatton Bank rifted margin, west of the UK

    NASA Astrophysics Data System (ADS)

    Elliott, G. M.; Parson, L. M.

    2007-12-01

    The Hatton Bank margin, flanking the Iceland Basin, is a widely cited example of a volcanic rifted margin. Prior to this study, insights into the break-up history of the margin had been limited to profiles in the north and south; whilst valuable, these did not reveal the along-margin tectono-magmatic variability. Over 5660 line km of high-quality reflection seismic profiles, with supplementary multibeam bathymetry, were collected to support the UK's claim to the Hatton region under the United Nations Convention on the Law of the Sea (UNCLOS). Integration of these new data with existing profiles allowed the margin to be divided into three segments, each flanked by oceanic crust with a smooth upper surface and internal dipping reflectors. The southernmost segment is characterised by a series of inner and outer seaward-dipping reflector (SDR) packages, separated by an outer high. The outer SDR are truncated by Endymion Spur, a chain of steep-sided, late-stage volcanic cones linked by necks. The central sector has no inner SDR package and is characterised by the presence of a highly intruded continental block, the Hatton Bank Block (HBB). The northern sector is adjacent to Lousy Bank, with a wider region of SDR recognised than to the south and a large number of volcanic cones imaged. The variations in the distribution of the SDRs along the margin, the presence of the HBB, and the Endymion Spur all suggest that the break-up process was not uniform along-strike. The division of the margin into three sectors reveals that structural segmentation played an important role in producing the variations along the margin. Break-up initiated in the south and progressed north, producing the SDR packages witnessed; when the HBB was encountered, the focus of break-up moved seaward of the block. The northern sector was closer to the Iceland hotspot, and hence a greater amount of volcanism is encountered there. The smooth oceanic basement also indicates a high thermal flux, leading to the high melt production and subsidence rates that formed the dipping reflectors. Shortly after break-up, the eruption of Endymion Spur occurred. The nature of the magma erupted is unknown, but from the steepness of the cones it is inferred to be viscous and, considering the setting, most likely a tholeiitic cumulate. A possible trigger for the Endymion Spur is the passage of a pulse of hotter-than-normal asthenospheric material along the margin, which interacted with lower crustal material to produce melt that fed the volcanic centres. Enhanced asthenospheric heat flow has been invoked to explain the V-shaped ridges along the present-day Reykjanes Ridge, and it is probable that the Endymion Spur represents earlier such pulses along the margin/spreading axis. The location of the enhanced volcanism is itself controlled by crustal segmentation, with the Endymion Spur limited to the southern sector; the crust in this sector is approximately 2 to 3 km thinner than in the central segment, in which Endymion Spur is absent. The segmentation along the margin has thus influenced both the break-up style (presence or absence of SDR) and the location and nature of post-break-up volcanism.

  1. Aggregation of carbon dioxide sequestration storage assessment units

    USGS Publications Warehouse

    Blondes, Madalyn S.; Schuenemeyer, John H.; Olea, Ricardo A.; Drew, Lawrence J.

    2013-01-01

    The U.S. Geological Survey is currently conducting a national assessment of carbon dioxide (CO2) storage resources, mandated by the Energy Independence and Security Act of 2007. Pre-emission capture and storage of CO2 in subsurface saline formations is one potential method to reduce greenhouse gas emissions and the negative impact of global climate change. Like many large-scale resource assessments, the area under investigation is split into smaller, more manageable storage assessment units (SAUs), which must be aggregated with correctly propagated uncertainty to the basin, regional, and national scales. The aggregation methodology requires two types of data: marginal probability distributions of storage resource for each SAU, and a correlation matrix obtained by expert elicitation describing interdependencies between pairs of SAUs. Dependencies arise because geologic analogs, assessment methods, and assessors often overlap. The correlation matrix is used to induce rank correlation, using a Cholesky decomposition, among the empirical marginal distributions representing individually assessed SAUs. This manuscript presents a probabilistic aggregation method tailored to the correlations and dependencies inherent to a CO2 storage assessment. Aggregation results must be presented at the basin, regional, and national scales. A single stage approach, in which one large correlation matrix is defined and subsets are used for different scales, is compared to a multiple stage approach, in which new correlation matrices are created to aggregate intermediate results. Although the single-stage approach requires determination of significantly more correlation coefficients, it captures geologic dependencies among similar units in different basins and it is less sensitive to fluctuations in low correlation coefficients than the multiple stage approach. Thus, subsets of one single-stage correlation matrix are used to aggregate to basin, regional, and national scales.
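
    The rank-correlation induction step can be sketched as follows: draw normal scores correlated via a Cholesky factor of the elicited matrix, then reorder each SAU's marginal sample to follow the ranks of its score column (the Iman-Conover idea), before summing to the aggregate. The marginals and correlation matrix below are hypothetical stand-ins for elicited values:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000
        # hypothetical per-SAU storage-resource samples (lognormal marginals)
        sau = np.column_stack([rng.lognormal(m, s, n) for m, s in
                               [(3.0, 0.5), (2.5, 0.7), (3.4, 0.4)]])

        R = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.4],
                      [0.3, 0.4, 1.0]])   # elicited inter-SAU correlation matrix

        # Cholesky-correlated normal scores define the rank pattern to impose
        scores = rng.standard_normal((n, 3)) @ np.linalg.cholesky(R).T
        ranks = scores.argsort(axis=0).argsort(axis=0)
        correlated = np.take_along_axis(np.sort(sau, axis=0), ranks, axis=0)

        total = correlated.sum(axis=1)    # basin-scale aggregate with propagated dependence
        print("P5/P50/P95:", np.percentile(total, [5, 50, 95]).round(1))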

  2. Dose Distribution in Bladder and Surrounding Normal Tissues in Relation to Bladder Volume in Conformal Radiotherapy for Bladder Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majewski, Wojciech, E-mail: wmajewski1@poczta.onet.p; Wesolowska, Iwona; Urbanczyk, Hubert

    2009-12-01

    Purpose: To estimate bladder movements and changes in dose distribution in the bladder and surrounding tissues associated with changes in bladder filling, and to estimate the internal treatment margins. Methods and Materials: A total of 16 patients with bladder cancer underwent planning computed tomography scans with 80- and 150-mL bladder volumes. The bladder displacements associated with the change in volume were measured. Each patient had treatment plans constructed for a 'partially empty' (80 mL) and a 'partially full' (150 mL) bladder. An additional plan was constructed for tumor irradiation alone. A subsequent 9 patients underwent sequential weekly computed tomography scanning during radiotherapy to verify the bladder movements and estimate the internal margins. Results: Bladder movements were mainly observed cranially, and the estimated internal margins were nonuniform and largest (>2 cm) anteriorly and cranially. The dose distribution in the bladder worsened if the bladder increased in volume: 70% of patients (11 of 16) would have had the bladder underdosed to <95% of the prescribed dose. The dose distribution in the rectum and intestines was better with a 'partially empty' bladder (the volumes receiving >70%, 80%, and 90% of the prescribed dose were 23%, 20%, and 15% for the rectum and 162, 144, and 123 cm³ for the intestines, respectively) than with a 'partially full' bladder (28%, 24%, and 18% for the rectum and 180, 158, and 136 cm³ for the intestines, respectively). The change in bladder filling during RT was significant for the dose distribution in the intestines. Tumor irradiation alone was significantly better than whole bladder irradiation in terms of organ sparing. Conclusion: The displacements of the bladder due to volume changes were mainly related to the upper wall. The internal margins should be nonuniform, with the largest margins cranially and anteriorly. The changes in bladder filling during RT could influence the dose distribution in the bladder and intestines. The dose distribution in the rectum and bowel was slightly better with a 'partially empty' than with a 'full' bladder.

  3. Frequency-Magnitude relationships for Underwater Landslides of the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Urgeles, R.; Gràcia, E.; Lo Iacono, C.; Sànchez-Serra, C.; Løvholt, F.

    2017-12-01

    An updated version of the submarine landslide database of the Mediterranean Sea contains 955 mass-transport deposits (MTDs) and 2608 failure scars, showing that submarine landslides are ubiquitous features along Mediterranean continental margins. Their distribution reveals that the major deltaic wedges host the largest submarine landslides, while seismically active margins are characterized by relatively small failures. In all regions, landslide size distributions display power-law scaling for landslides > 1 km³. We find consistent differences in the exponent of the power law depending on the geodynamic setting: active margins present steep slopes of the frequency-magnitude relationship, whereas passive margins tend to display gentler slopes. This pattern likely reflects the common view that tectonically active margins have numerous but small failures, while passive margins have larger but fewer failures. Available age information suggests that failures exceeding 1000 km³ are infrequent and may recur every 40 kyr. Smaller failures that can still cause significant damage might be relatively frequent, with failures > 1 km³ likely recurring every 40 years. The database highlights that our knowledge of submarine landslide activity through time is limited to the last few tens of thousands of years. Available data suggest that submarine landslides may preferentially occur during lowstand periods, but no firm conclusion can be drawn in this respect, as only 149 of the 955 landslides included in the database have relatively accurate age determinations. The timing and regional changes in the frequency-magnitude distribution suggest that sedimentation patterns and pore pressure development have had a major role in triggering slope failures and control the sediment flux from mass wasting to the deep basin.

  4. Triton Southern Hemisphere

    NASA Image and Video Library

    1998-06-08

    This polar projection from NASA's Voyager 2 of Triton's southern hemisphere provides a view of the southern polar cap and the bright equatorial fringe. The margin of the cap is scalloped and ranges in latitude from +10 degrees to -30 degrees. The bright fringe is closely associated with the cap's margin; from it, diffuse bright rays extend north-northeast for hundreds of kilometers. The bright fringe probably consists of very fresh nitrogen frost or snow, and the rays consist of bright-fringe materials that were redistributed by north-moving, Coriolis-deflected winds. http://photojournal.jpl.nasa.gov/catalog/PIA00423

  5. Bayesian modeling and inference for diagnostic accuracy and probability of disease based on multiple diagnostic biomarkers with and without a perfect reference standard.

    PubMed

    Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A

    2016-03-15

    The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle. Copyright © 2015 John Wiley & Sons, Ltd.
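
    The combination rule above can be mimicked in a simple frequentist sketch: a predictive probability of disease from a logistic model (standing in for the paper's Bayesian multivariate random-effects model) defines a single score whose AUC, the cAUC, can be compared with each marginal AUC. Data are simulated:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 1000
        disease = rng.random(n) < 0.3
        # two correlated biomarkers, shifted upward in diseased subjects (hypothetical)
        base = rng.normal(size=n)
        b1 = base + rng.normal(scale=1.0, size=n) + 1.0 * disease
        b2 = 0.5 * base + rng.normal(scale=1.0, size=n) + 0.8 * disease
        X = np.column_stack([b1, b2])

        # predictive probability of disease given both biomarkers
        p = LogisticRegression().fit(X, disease).predict_proba(X)[:, 1]
        print("AUC biomarker 1:", round(roc_auc_score(disease, b1), 3))
        print("AUC biomarker 2:", round(roc_auc_score(disease, b2), 3))
        print("combined cAUC:  ", round(roc_auc_score(disease, p), 3))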

  6. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  7. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  8. Herbivory at marginal populations: Consequences for maternal fitness and vegetative differentiation

    NASA Astrophysics Data System (ADS)

    Castilla, Antonio R.; Alonso, Conchita; Herrera, Carlos M.

    2013-05-01

Margins of distribution of plant species constitute natural areas where the impact of antagonistic interactions is expected to be higher and where changes in the dynamics of plant-herbivore coevolution could promote intraspecific differentiation in (co)evolving plant traits. In the present study, we investigated how differences in the average herbivory level affect maternal fitness in core continuous and marginal disjunct populations of Daphne laureola, in an effort to assess the role of herbivores in limiting plant distribution. Furthermore, we investigated intraspecific differentiation in vegetative traits and its potential connection to divergent selection by herbivores in both groups of populations. Our results did not support increased herbivory at the species margin, but did support a difference in the effect of herbivory on maternal fitness between core continuous and marginal disjunct populations of D. laureola. In addition, herbivores did not exert phenotypic selection consistent with the geographic variation in the studied plant traits. Therefore, the geographic variation of vegetative traits of D. laureola seems to be a consequence of environmental heterogeneity rather than a result of geographically divergent selection by herbivores.

  9. Spatial-temporal variation of marginal land suitable for energy plants from 1990 to 2010 in China

    NASA Astrophysics Data System (ADS)

    Jiang, Dong; Hao, Mengmeng; Fu, Jingying; Zhuang, Dafang; Huang, Yaohuan

    2014-07-01

Energy plants are the main source of bioenergy, which will play an increasingly important role in future energy supplies. With limited cultivated land resources in China, the development of energy plants may primarily rely on marginal land. In this study, based on land use data from 1990 to 2010 (in five-year periods) and other auxiliary data, the distribution of marginal land suitable for energy plants was determined using a multi-factor integrated assessment method. The variation in land use type and the spatial distribution of marginal land suitable for energy plants across the different periods were analyzed. The results indicate that the total amount of marginal land suitable for energy plants decreased from 136.501 million ha in 1990 to 114.225 million ha in 2010. The reduced land use types are primarily shrub land, sparse forest land, moderately dense grassland and sparse grassland, and the areas of largest variation are located in Guangxi, Tibet, Heilongjiang, Xinjiang and Inner Mongolia. The results of this study will provide data and decision-making support for the long-term planning of bioenergy resources.

  10. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  11. Ice Water Classification Using Statistical Distribution Based Conditional Random Fields in RADARSAT-2 Dual Polarization Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.

    2017-09-01

In this paper, a Statistical Distribution based Conditional Random Fields (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented for the comparison of CRF-based methods. Furthermore, in order to identify an effective statistical distribution model to integrate into STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method resolves the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.

  12. Anthropogenic activities have contributed moderately to increased inputs of organic materials in marginal seas off China.

    PubMed

    Liu, Liang-Ying; Wei, Gao-Ling; Wang, Ji-Zhong; Guan, Yu-Feng; Wong, Charles S; Wu, Feng-Chang; Zeng, Eddy Y

    2013-10-15

    Sediment has been recognized as a gigantic sink of organic materials and therefore can record temporal input trends. To examine the impact of anthropogenic activities on the marginal seas off China, sediment cores were collected from the Yellow Sea, the inner shelf of the East China Sea (ECS), and the South China Sea (SCS) to investigate the sources and spatial and temporal variations of organic materials, i.e., total organic carbon (TOC) and aliphatic hydrocarbons. The concentration ranges of TOC were 0.5-1.29, 0.63-0.83, and 0.33-0.85%, while those of Σn-C14-35 (sum of n-alkanes with carbon numbers of 14-35) were 0.08-1.5, 0.13-1.97, and 0.35-0.96 μg/g dry weight in sediment cores from the Yellow Sea, ECS inner shelf, and the SCS, respectively. Terrestrial higher plants were an important source of aliphatic hydrocarbons in marine sediments off China. The spatial distribution of Σn-C14-35 concentrations and source diagnostic ratios suggested a greater load of terrestrial organic materials in the Yellow Sea than in the ECS and SCS. Temporally, TOC and Σn-C14-35 concentrations increased with time and peaked at either the surface or immediate subsurface layers. This increase was probably reflective of elevated inputs of organic materials to marginal seas off China in recent years, and attributed partly to the impacts of intensified anthropogenic activities in mainland China. Source diagnostics also suggested that aliphatic hydrocarbons were mainly derived from biogenic sources, with a minority in surface sediment layers from petroleum sources, consistent with the above-mentioned postulation.

  13. Methane Metabolizing Microbial Communities in the Cold Seep Areas in the Northern Continental Shelf of South China Sea

    NASA Astrophysics Data System (ADS)

    Wang, F.; Liang, Q.

    2016-12-01

Marine sediments contain a large amount of methane, with an estimated 500-2500 gigatonnes of dissolved and hydrated methane carbon stored therein, mainly in continental margins. In localized areas named cold seeps, hydrocarbon (mainly methane) containing fluids rise to the seafloor and support oases of ecosystems composed of various microorganisms and faunal assemblages. The South China Sea (SCS) is surrounded by passive continental margins in the west and north and convergent margins in the south and east. Thick organic-rich sediments have accumulated in the SCS since the late Mesozoic, which are continuing sources for gas hydrate formation in SCS sediments. Here, microbial ecosystems, particularly those involved in methane transformations, were investigated in the cold seep areas (Qiongdongnan, Shenhu, and Dongsha) in the northern continental shelf of the SCS. Multiple interdisciplinary analytic tools, such as stable isotope probing, geochemical analysis, and molecular ecology, were applied for a comprehensive understanding of microbe-mediated methane transformation in this project. A variety of sediment cores were collected, and the geochemical profiles and the associated microbial distributions along the cores were recorded. The major microbial groups involved in methane transformation in these sediment cores were revealed; known methane-producing and methane-oxidizing archaea, including Methanosarcinales and the anaerobic methane-oxidizing groups ANME-1 and ANME-2, were found, along with their niche preferences in SCS sediments. In-depth comparative analysis revealed the presence of SCS-specific archaeal subtypes, which probably reflect the evolution and adaptation of these methane-metabolizing microbes to SCS environmental conditions. Our work represents the first comprehensive analysis of the methane-metabolizing microbial communities in the cold seep areas along the northern continental shelf of the South China Sea and will provide new insight into the mechanisms of methane biotransformation.

  14. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution," a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied to statistical analysis.

  15. Bayesian analysis of the flutter margin method in aeroelasticity

    DOE PAGES

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-08-27

A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-squares based estimation technique, which relies on a Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis–Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. In conclusion, it is shown that the probabilistic (Bayesian) approach reduces the number of test points required to provide a flutter speed estimate with a given accuracy and precision.
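    The sketch below (not from the paper) shows a random-walk Metropolis–Hastings sampler of the kind the abstract describes, applied to a stand-in Gaussian posterior over four modal parameters (two frequencies and two damping ratios); the observation values, noise levels, and the `log_post` function are illustrative assumptions, not the authors' aeroelastic model.

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_samples=20000, step=0.05, rng=None):
    """Random-walk Metropolis-Hastings; returns posterior samples of theta."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    samples = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# Stand-in posterior: Gaussian likelihood of four modal parameters
# (omega1, zeta1, omega2, zeta2); values and noise levels are invented.
obs = np.array([5.1, 0.021, 9.8, 0.034])
sigma = np.array([0.10, 0.005, 0.20, 0.005])

def log_post(theta):
    return -0.5 * np.sum(((theta - obs) / sigma) ** 2)

samples = metropolis_hastings(log_post, obs)
# Each posterior draw would then be pushed through the Zimmerman-
# Weissenburger formula to build the flutter-margin pdf sample by sample.
```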

  16. Demonstration of a Probabilistic Technique for the Determination of Economic Viability of Very Large Transport Configurations

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    1998-01-01

Over the past few years, modern aircraft design has experienced a paradigm shift from designing for performance to designing for affordability. This report contains a probabilistic approach that allows traditional deterministic design methods to be extended to account for disciplinary, economic, and technological uncertainty. The probabilistic approach was facilitated by the Fast Probability Integration (FPI) technique, which allows the designer to gather valuable information about the vehicle's behavior in the design space. This technique is efficient for assessing multi-attribute, multi-constraint problems in a more realistic fashion. For implementation purposes, this technique is applied to illustrate how both economic and technological uncertainty associated with a Very Large Transport aircraft concept may be assessed. The assessment is evaluated with the FPI technique to determine the cumulative probability distributions of the design space, as bounded by economic objectives and performance constraints. These distributions were compared to established targets for a comparable large-capacity aircraft, similar in size to the Boeing 747-400. The conventional baseline configuration design space was determined to be infeasible and marginally viable, motivating the infusion of advanced technologies, including reductions in drag, specific fuel consumption, wing weight, and Research, Development, Testing, and Evaluation costs. The resulting system design space was qualitatively assessed with technology metric "k" factors. The infusion of technologies shifted the VLT design into regions of feasibility and greater viability. The study also demonstrated a method and relationship by which the impact of new technologies may be assessed in a more system-focused approach.

  17. Analysis of the trajectory of Drosophila melanogaster in a circular open field arena.

    PubMed

    Valente, Dan; Golani, Ilan; Mitra, Partha P

    2007-10-24

    Obtaining a complete phenotypic characterization of a freely moving organism is a difficult task, yet such a description is desired in many neuroethological studies. Many metrics currently used in the literature to describe locomotor and exploratory behavior are typically based on average quantities or subjectively chosen spatial and temporal thresholds. All of these measures are relatively coarse-grained in the time domain. It is advantageous, however, to employ metrics based on the entire trajectory that an organism takes while exploring its environment. To characterize the locomotor behavior of Drosophila melanogaster, we used a video tracking system to record the trajectory of a single fly walking in a circular open field arena. The fly was tracked for two hours. Here, we present techniques with which to analyze the motion of the fly in this paradigm, and we discuss the methods of calculation. The measures we introduce are based on spatial and temporal probability distributions and utilize the entire time-series trajectory of the fly, thus emphasizing the dynamic nature of locomotor behavior. Marginal and joint probability distributions of speed, position, segment duration, path curvature, and reorientation angle are examined and related to the observed behavior. The measures discussed in this paper provide a detailed profile of the behavior of a single fly and highlight the interaction of the fly with the environment. Such measures may serve as useful tools in any behavioral study in which the movement of a fly is an important variable and can be incorporated easily into many setups, facilitating high-throughput phenotypic characterization.
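    As a concrete illustration of estimating marginal and joint probability distributions from an entire trajectory, the sketch below computes speed and reorientation-angle distributions from an (N, 2) position time series. The 30 Hz sampling rate and the synthetic random-walk track are stand-ins for demonstration, not the authors' tracking data.

```python
import numpy as np

def speed_and_turn(xy, dt):
    """Speed and reorientation angle along an (N, 2) trajectory."""
    v = np.diff(xy, axis=0) / dt                    # velocity vectors
    speed = np.linalg.norm(v, axis=1)
    heading = np.arctan2(v[:, 1], v[:, 0])
    turn = np.angle(np.exp(1j * np.diff(heading)))  # wrapped to (-pi, pi]
    return speed[1:], turn                          # aligned samples

# Synthetic stand-in for a 2-hour track sampled at 30 Hz
rng = np.random.default_rng(0)
xy = np.cumsum(rng.standard_normal((30 * 7200, 2)), axis=0)
speed, turn = speed_and_turn(xy, dt=1 / 30)

# Marginal distributions as normalized histograms ...
p_speed, speed_edges = np.histogram(speed, bins=50, density=True)
p_turn, turn_edges = np.histogram(turn, bins=50, density=True)
# ... and the joint distribution of speed vs. reorientation angle
p_joint, _, _ = np.histogram2d(speed, turn, bins=50, density=True)
```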

  18. Temperate Mountain Forest Biodiversity under Climate Change: Compensating Negative Effects by Increasing Structural Complexity

    PubMed Central

    Braunisch, Veronika; Coppes, Joy; Arlettaz, Raphaël; Suchant, Rudi; Zellweger, Florian; Bollmann, Kurt

    2014-01-01

    Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions under climate change. Yet, the populations of many endothermic species may not be primarily affected by physiological constraints, but indirectly by climate-induced changes of habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aiming at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km2 grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species’ occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC-scenario A1B, and changes in species’ occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distribution. These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change. PMID:24823495

  19. Temperate mountain forest biodiversity under climate change: compensating negative effects by increasing structural complexity.

    PubMed

    Braunisch, Veronika; Coppes, Joy; Arlettaz, Raphaël; Suchant, Rudi; Zellweger, Florian; Bollmann, Kurt

    2014-01-01

    Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions under climate change. Yet, the populations of many endothermic species may not be primarily affected by physiological constraints, but indirectly by climate-induced changes of habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aiming at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km2 grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species' occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC-scenario A1B, and changes in species' occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distribution. These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change.

  20. Wave-Ice interaction in the Marginal Ice Zone: Toward a Wave-Ocean-Ice Coupled Modeling System

    DTIC Science & Technology

    2015-09-30

DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. W. E. Rogers, Naval Research Laboratory, Code 7322, Stennis Space Center, MS 39529, phone: (228) 688-4727. [Only a figure legend survives extraction: MIZ simulations using WW3 (3 frequency bins; ice retreat in August and ice advance in October); blue (solid): based on observations near Antarctica by Meylan...]

  1. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
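    A minimal sketch of the two-step recipe the abstract describes, under assumed inputs (a radially symmetric target spectrum and a scipy.stats marginal law): white Gaussian noise is spectrally colored by FFT filtering, then mapped through the Gaussian CDF and the target's inverse CDF. Note that the final memoryless transform slightly perturbs the spectrum; the corrections discussed in the paper are omitted here.

```python
import numpy as np
from scipy import stats

def random_field(n, psd, marginal, rng=None):
    """2D field with (approximately) the given spectrum and marginal law."""
    rng = np.random.default_rng() if rng is None else rng
    white = rng.standard_normal((n, n))
    fx = np.fft.fftfreq(n)
    f = np.hypot(*np.meshgrid(fx, fx))            # radial spatial frequency
    colored = np.real(np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd(f))))
    colored /= colored.std()                      # unit-variance Gaussian field
    u = stats.norm.cdf(colored)                   # Gaussian -> uniform marginal
    return marginal.ppf(u)                        # uniform -> target marginal

# Example: Lorentzian-like spectrum with exponentially distributed amplitudes
field = random_field(256, lambda f: 1.0 / (1.0 + (f / 0.05) ** 2),
                     stats.expon())
```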

  2. Probabilistic pipe fracture evaluations for leak-rate-detection applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, S.; Ghadiali, N.; Paul, D.

    1995-04-01

Regulatory Guide 1.45, "Reactor Coolant Pressure Boundary Leakage Detection Systems," was published by the U.S. Nuclear Regulatory Commission (NRC) in May 1973, and provides guidance on leak detection methods and system requirements for Light Water Reactors. Additionally, leak detection limits are specified in plant Technical Specifications and are different for Boiling Water Reactors (BWRs) and Pressurized Water Reactors (PWRs). These leak detection limits are also used in leak-before-break evaluations performed in accordance with Draft Standard Review Plan, Section 3.6.3, "Leak Before Break Evaluation Procedures," where a margin of 10 on the leak detection limit is used in determining the crack size considered in subsequent fracture analyses. This study was requested by the NRC to: (1) evaluate the conditional failure probability for BWR and PWR piping for pipes that were leaking at the allowable leak detection limit, and (2) evaluate the margin of 10 to determine if it was unnecessarily large. A probabilistic approach was undertaken to conduct fracture evaluations of circumferentially cracked pipes for leak-rate-detection applications. Sixteen nuclear piping systems in BWR and PWR plants were analyzed to evaluate conditional failure probability and effects of crack-morphology variability on the current margins used in leak rate detection for leak-before-break.

  3. The global impact distribution of Near-Earth objects

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.

    2016-02-01

Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases, objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted with the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking into account the impact probabilities introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.

  4. Marginal Contribution-Based Distributed Subchannel Allocation in Small Cell Networks.

    PubMed

    Shah, Shashi; Kittipiyakul, Somsak; Lim, Yuto; Tan, Yasuo

    2018-05-10

The paper presents a game-theoretic solution for the distributed subchannel allocation problem in small cell networks (SCNs), analyzed under the physical interference model. The objective is to find a distributed solution that maximizes the welfare of the SCN, defined as the total system capacity. Although the problem can be addressed through best-response (BR) dynamics, the existence of a steady-state solution, i.e., a pure strategy Nash equilibrium (NE), cannot be guaranteed. Potential games (PGs) ensure convergence to a pure strategy NE when players rationally play according to specified learning rules. However, such a performance guarantee comes at the expense of complete knowledge of the SCN. To overcome such requirements, properties of PGs are exploited for scalable implementations, where we utilize the concept of marginal contribution (MC) as a tool to design the learning rules for players' utilities and propose the marginal contribution-based best-response (MCBR) algorithm, of low computational complexity, for the distributed subchannel allocation problem. Finally, we validate and evaluate the proposed scheme through simulations for various performance metrics.
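    The toy sketch below illustrates the marginal-contribution idea under heavily simplified assumptions: a static gain matrix, Shannon capacity under the physical interference model, and round-robin best responses. The network size, gain model, and noise level are invented for demonstration and are not the paper's simulation setup.

```python
import numpy as np

def welfare(assign, gain, noise=1e-3):
    """Total capacity (bits/s/Hz) under the physical interference model."""
    total = 0.0
    for i, ch in enumerate(assign):
        if ch < 0:                                # cell i switched off
            continue
        interference = sum(gain[j, i] for j, c in enumerate(assign)
                           if j != i and c == ch)
        total += np.log2(1.0 + gain[i, i] / (noise + interference))
    return total

def mcbr(gain, n_channels, n_rounds=20, rng=None):
    """Marginal-contribution-based best response, played round robin."""
    rng = np.random.default_rng() if rng is None else rng
    n = gain.shape[0]
    assign = rng.integers(0, n_channels, size=n)
    for _ in range(n_rounds):
        for i in range(n):
            best_ch, best_mc = assign[i], -np.inf
            for ch in range(n_channels):
                assign[i] = ch
                with_i = welfare(assign, gain)
                assign[i] = -1                        # remove player i
                mc = with_i - welfare(assign, gain)   # marginal contribution
                if mc > best_mc:
                    best_ch, best_mc = ch, mc
            assign[i] = best_ch
    return assign

# Invented 6-cell network with stronger direct links on the diagonal
rng = np.random.default_rng(1)
gain = rng.exponential(size=(6, 6)) * (0.05 + 0.95 * np.eye(6))
print(mcbr(gain, n_channels=3, rng=rng))
```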

  5. Role of local to regional-scale collisions in the closure history of the Southern Neotethys, exemplified by tectonic development of the Kyrenia Range active margin/collisional lineament, N Cyprus

    NASA Astrophysics Data System (ADS)

    Robertson, Alastair; Kinnaird, Tim; McCay, Gillian; Palamakumbura, Romesh; Chen, Guohui

    2016-04-01

    Active margin processes including subduction, accretion, arc magmatism and back-arc extension play a key role in the diachronous, and still incomplete closure of the S Neotethys. The S Neotethys rifted along the present-day Africa-Eurasia continental margin during the Late Triassic and, after sea-floor spreading, began to close related to northward subduction during the Late Cretaceous. The northern, active continental margin of the S Neotethys was bordered by several of the originally rifted continental fragments (e.g. Taurides). The present-day convergent lineament ranges from subaqueous (e.g. Mediterranean Ridge), to subaerial (e.g. SE Turkey). The active margin development is partially obscured by microcontinent-continent collision and post-collisional strike-slip deformation (e.g. Tauride-Arabian suture). However, the Kyrenia Range, N Cyprus provides an outstanding record of convergent margin to early stage collisional processes. It owes its existence to strong localised uplift during the Pleistocene, which probably resulted from the collision of a continental promontory of N Africa (Eratosthenes Seamount) with the long-lived S Neotethyan active margin to the north. A multi-stage convergence history is revealed, mainly from a combination of field structural, sedimentological and igneous geochemical studies. Initial Late Cretaceous convergence resulted in greenschist facies burial metamorphism that is likely to have been related to the collision, then rapid exhumation, of a continental fragment (stage 1). During the latest Cretaceous-Palaeogene, the Kyrenia lineament was characterised by subduction-influenced magmatism and syn-tectonic sediment deposition. Early to Mid-Eocene, S-directed thrusting and folding (stage 2) is likely to have been influenced by the suturing of the Izmir-Ankara-Erzincan ocean to the north ('N Neotethys'). Convergence continued during the Neogene, dominated by deep-water terrigenous gravity-flow accumulation in a foredeep setting. Further S-directed compression took place during Late Miocene-earliest Pliocene (stage 3) in an oblique left-lateral stress regime, probably influenced by the collision of the Tauride and Arabian continents to the east. Strong uplift of the active margin lineament then took place during the Pleistocene, related to incipient continental collision (stage 4). The uplift is documented by a downward-younging flight of marine and continental terrace deposits on both flanks of the Kyrenia Range. The geological record of the S Neotethyan active continental margin, based on regional to global plate kinematic reconstructions, appears to have been dominated by on-going convergence (with possible temporal changes), punctuated by the effects of relatively local to regional-scale collisional events. Similar processes are likely to have affected other S Neotethyan segments and other convergent margins.

  6. Surface current patterns suggested by suspended sediment distribution over the outer continental margin, Bering Sea

    USGS Publications Warehouse

    Karl, Herman A.; Carlson, P.R.

    1987-01-01

Samples of total suspended matter (TSM) were collected at the surface over the northern outer continental margin of the Bering Sea during the summers of 1980 and 1981. Volume concentrations of surface TSM averaged 0.6 and 1.1 mg l-1 for 1980 and 1981, respectively. Organic matter, largely plankton, made up about 65% of the near-surface TSM for both years. Distributions of TSM suggested that shelf circulation patterns were characterized either by meso- and large-scale eddies or by cross-shelf components of flow superimposed on a general northwesterly net drift. These patterns may be caused by large submarine canyons which dominate the physiography of this part of the Bering Sea continental margin. © 1987.

  7. Tectonics of some Amazonian greenstone belts

    NASA Technical Reports Server (NTRS)

    Gibbs, A. K.

    1986-01-01

Greenstone belts exposed amid gneisses, granitoid rocks, and less abundant granulites along the northern and eastern margins of the Amazonian Craton yield Trans-Amazonian metamorphic ages of 2.0-2.1 Ga. Early Proterozoic belts in the northern region probably originated as ensimatic island arc complexes. The Archean Carajas belt in the southeastern craton probably formed in an extensional basin on older continental basement. That basement contains older Archean belts with pillow basalts and komatiites. Belts of ultramafic rocks warrant investigation as possible ophiolites. A discussion follows.

  8. Robust Statistics and Regularization for Feature Extraction and UXO Discrimination

    DTIC Science & Technology

    2011-07-01

July 11, 2011. [Only fragments of this report survive extraction:] ...real data we find that this technique has an improved probability of finding all ordnance in a test data set, relative to previously... many sites. Tests on larger data sets should still be carried out. In previous work we considered a bootstrapping approach to selecting the operating... Marginalizing over x, we obtain the probability that the ith order statistic in the test data belongs to the T class (Eq. 55): $P(T \mid x_{(i)}) = \int_{-\infty}^{\infty} P(T \mid x)\, p(x \ldots$ [equation truncated in extraction].

  9. PTV margin determination in conformal SRT of intracranial lesions

    PubMed Central

    Parker, Brent C.; Shiu, Almon S.; Maor, Moshe H.; Lang, Frederick F.; Liu, H. Helen; White, R. Allen; Antolak, John A.

    2002-01-01

    The planning target volume (PTV) includes the clinical target volume (CTV) to be irradiated and a margin to account for uncertainties in the treatment process. Uncertainties in miniature multileaf collimator (mMLC) leaf positioning, CT scanner spatial localization, CT‐MRI image fusion spatial localization, and Gill‐Thomas‐Cosman (GTC) relocatable head frame repositioning were quantified for the purpose of determining a minimum PTV margin that still delivers a satisfactory CTV dose. The measured uncertainties were then incorporated into a simple Monte Carlo calculation for evaluation of various margin and fraction combinations. Satisfactory CTV dosimetric criteria were selected to be a minimum CTV dose of 95% of the PTV dose and at least 95% of the CTV receiving 100% of the PTV dose. The measured uncertainties were assumed to be Gaussian distributions. Systematic errors were added linearly and random errors were added in quadrature assuming no correlation to arrive at the total combined error. The Monte Carlo simulation written for this work examined the distribution of cumulative dose volume histograms for a large patient population using various margin and fraction combinations to determine the smallest margin required to meet the established criteria. The program examined 5 and 30 fraction treatments, since those are the only fractionation schemes currently used at our institution. The fractionation schemes were evaluated using no margin, a margin of just the systematic component of the total uncertainty, and a margin of the systematic component plus one standard deviation of the total uncertainty. It was concluded that (i) a margin of the systematic error plus one standard deviation of the total uncertainty is the smallest PTV margin necessary to achieve the established CTV dose criteria, and (ii) it is necessary to determine the uncertainties introduced by the specific equipment and procedures used at each institution since the uncertainties may vary among locations. PACS number(s): 87.53.Kn, 87.53.Ly PMID:12132939
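    A small numeric sketch of the error-combination rule described above, with invented per-source uncertainties: systematic components add linearly, random components add in quadrature, and the margin is the systematic component plus one standard deviation of the combined random uncertainty. This is one reading of the rule, not the authors' Monte Carlo code.

```python
import numpy as np

# Invented 1-sigma uncertainties (mm) for the four sources in the text:
# mMLC leaf positioning, CT localization, CT-MRI fusion, GTC repositioning.
systematic = np.array([0.6, 0.8, 1.2, 1.0])
random_sd = np.array([0.4, 0.3, 0.5, 1.1])

sys_total = systematic.sum()                  # systematic errors add linearly
rand_total = np.sqrt((random_sd ** 2).sum())  # random errors add in quadrature

# Margin found adequate in the study: systematic component plus one
# standard deviation of the total (here: combined random) uncertainty.
margin = sys_total + rand_total
print(f"systematic = {sys_total:.1f} mm, random SD = {rand_total:.1f} mm, "
      f"margin = {margin:.1f} mm")
```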

  10. Positive contraction mappings for classical and quantum Schrödinger systems

    NASA Astrophysics Data System (ADS)

    Georgiou, Tryphon T.; Pavon, Michele

    2015-03-01

    The classical Schrödinger bridge seeks the most likely probability law for a diffusion process, in path space, that matches marginals at two end points in time; the likelihood is quantified by the relative entropy between the sought law and a prior. Jamison proved that the new law is obtained through a multiplicative functional transformation of the prior. This transformation is characterised by an automorphism on the space of endpoints probability measures, which has been studied by Fortet, Beurling, and others. A similar question can be raised for processes evolving in a discrete time and space as well as for processes defined over non-commutative probability spaces. The present paper builds on earlier work by Pavon and Ticozzi and begins by establishing solutions to Schrödinger systems for Markov chains. Our approach is based on the Hilbert metric and shows that the solution to the Schrödinger bridge is provided by the fixed point of a contractive map. We approach, in a similar manner, the steering of a quantum system across a quantum channel. We are able to establish existence of quantum transitions that are multiplicative functional transformations of a given Kraus map for the cases where the marginals are either uniform or pure states. As in the Markov chain case, and for uniform density matrices, the solution of the quantum bridge can be constructed from the fixed point of a certain contractive map. For arbitrary marginal densities, extensive numerical simulations indicate that iteration of a similar map leads to fixed points from which we can construct a quantum bridge. For this general case, however, a proof of convergence remains elusive.
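    For the Markov chain case, the contractive fixed-point map discussed above is closely related to the classical Fortet/Sinkhorn scaling iteration. The sketch below illustrates that iteration for a discrete prior kernel and prescribed endpoint marginals; the four-state lazy walk and the marginals are invented for illustration.

```python
import numpy as np

def schrodinger_bridge(K, mu0, mu1, n_iter=1000):
    """Fixed-point (Fortet/Sinkhorn) iteration for a discrete bridge.

    K is the prior transition kernel between the two end times; the
    bridged joint law is pi(x, y) = a(x) K(x, y) b(y), where (a, b) is
    the fixed point of the alternating scaling map below.
    """
    a, b = np.ones_like(mu0), np.ones_like(mu1)
    for _ in range(n_iter):
        a = mu0 / (K @ b)       # enforce the initial marginal
        b = mu1 / (K.T @ a)     # enforce the final marginal
    return a[:, None] * K * b[None, :]

# Prior: 5 steps of a lazy walk on 4 states; steer it between two marginals.
step = np.array([[0.50, 0.50, 0.00, 0.00],
                 [0.25, 0.50, 0.25, 0.00],
                 [0.00, 0.25, 0.50, 0.25],
                 [0.00, 0.00, 0.50, 0.50]])
K = np.linalg.matrix_power(step, 5)
pi = schrodinger_bridge(K, np.array([0.7, 0.3, 0.0, 0.0]),
                        np.array([0.0, 0.1, 0.3, 0.6]))
assert np.allclose(pi.sum(axis=1), [0.7, 0.3, 0.0, 0.0])
```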

  11. Radar Remote Sensing of Ice and Sea State and Air-Sea Interaction in the Marginal Ice Zone

    DTIC Science & Technology

    2014-09-30

DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Hans C. Graber, RSMAS – Department of Ocean Sciences, Center for Southeastern Tropical Advanced Remote Sensing. [Only fragments of the abstract survive extraction: ...scattering and attenuation process of ocean waves interacting with ice. A nautical X-band radar on a vessel dedicated to science would be used to follow the...]

  12. Utilization of cone-beam CT for offline evaluation of target volume coverage during prostate image-guided radiotherapy based on bony anatomy alignment.

    PubMed

    Paluska, Petr; Hanus, Josef; Sefrova, Jana; Rouskova, Lucie; Grepl, Jakub; Jansa, Jan; Kasaova, Linda; Hodek, Miroslav; Zouhar, Milan; Vosmik, Milan; Petera, Jiri

    2012-01-01

To assess target volume coverage during prostate image-guided radiotherapy based on bony anatomy alignment, and to assess the possibility of safety margin reduction. Implementation of IGRT should influence safety margins. Utilization of cone-beam CT provides current 3D anatomic information directly in the irradiation position. Such information enables reconstruction of the actual dose distribution. Seventeen prostate patients were treated with daily bony anatomy image-guidance. Cone-beam CT (CBCT) scans were acquired once a week immediately after bony anatomy alignment. After the prostate, seminal vesicles, rectum and bladder were contoured, the delivered dose distribution was reconstructed. Target dose coverage was evaluated by the proportion of the CTV encompassed by the 95% isodose. Original plans employed a 1 cm safety margin. Alternative plans assuming a smaller 7 mm margin between CTV and PTV were evaluated in the same way. Rectal and bladder volumes were compared with the initial ones. Rectal and bladder volumes irradiated with doses higher than 75 Gy, 70 Gy, 60 Gy, 50 Gy and 40 Gy were analyzed. In 12% of reconstructed plans the prostate coverage was not sufficient. Prostate underdosage was observed in 5 patients. Coverage of the seminal vesicles was not satisfactory in 3% of plans. Most of the target underdosage corresponded to excessive rectal or bladder filling. Evaluation of the alternative plans assuming a smaller 7 mm margin revealed 22% and 11% of plans in which prostate and seminal vesicle coverage, respectively, was compromised. These were distributed over 8 and 7 patients, respectively. Sufficient dose coverage of the target volumes was not achieved for all patients. Reduction of the safety margin is not acceptable. Initial rectal and bladder volumes cannot be considered representative of subsequent treatment.

  13. Stochastic mechanics of reciprocal diffusions

    NASA Astrophysics Data System (ADS)

    Levy, Bernard C.; Krener, Arthur J.

    1996-02-01

    The dynamics and kinematics of reciprocal diffusions were examined in a previous paper [J. Math. Phys. 34, 1846 (1993)], where it was shown that reciprocal diffusions admit a chain of conservation laws, which close after the first two laws for two disjoint subclasses of reciprocal diffusions, the Markov and quantum diffusions. For the case of quantum diffusions, the conservation laws are equivalent to Schrödinger's equation. The Markov diffusions were employed by Schrödinger [Sitzungsber. Preuss. Akad. Wiss. Phys. Math Kl. 144 (1931); Ann. Inst. H. Poincaré 2, 269 (1932)], Nelson [Dynamical Theories of Brownian Motion (Princeton University, Princeton, NJ, 1967); Quantum Fluctuations (Princeton University, Princeton, NJ, 1985)], and other researchers to develop stochastic formulations of quantum mechanics, called stochastic mechanics. We propose here an alternative version of stochastic mechanics based on quantum diffusions. A procedure is presented for constructing the quantum diffusion associated to a given wave function. It is shown that quantum diffusions satisfy the uncertainty principle, and have a locality property, whereby given two dynamically uncoupled but statistically correlated particles, the marginal statistics of each particle depend only on the local fields to which the particle is subjected. However, like Wigner's joint probability distribution for the position and momentum of a particle, the finite joint probability densities of quantum diffusions may take negative values.

  14. [Analysis of X-Ray Fluorescence Spectroscopy and Plasma Mass Spectrometry of Pangxidong Composite Granitoid Pluton and Its Implications for Magmatic Differentiation].

    PubMed

    Zeng, Chang-yu; Ding, Ru-xin; Li, Hong-zhong; Zhou, Yong-zhang; Niu, Jia; Zhang, Jie-tang

    2015-11-01

The Pangxidong composite granitoid pluton is located in the southwestern margin of the Yunkai massif. The metamorphic grade of this pluton increases from outside to inside; that is, banded-augen granitic gneisses, gneissoid granites and granites are distributed in order from edge to core. X-ray fluorescence spectroscopy and plasma mass spectrometry were conducted to study the geochemical characteristics of the three rock types. The results show that all three rock types are peraluminous and that their contents of major elements and rare earth elements change gradually. From granitic gneisses to granites, the contents of Al₂O₃, CaO, MgO, TiO₂, total rare earth elements and light rare earth elements increase, but the contents of SiO₂ and heavy rare earth elements decrease. It is suggested that a genetic relationship exists among the granitic gneisses, gneissoid granites and granites formed during the multi-stage tectonic evolution process. Furthermore, the remelting of metamorphosed supracrustal rocks in the Yunkai massif is probably an important cause of granitoid rock formation. The probable evolutionary mechanism is that SiO₂ and heavy rare earth elements were melted out of the protolith and gradually enriched upward, while Al₂O₃, CaO, MgO, TiO₂ and light rare earth elements were enriched downward.

  15. Carbonate platform, slope, and basinal deposits of Upper Oligocene, Kalimantan, Indonesia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armin, R.A.; Cutler, W.G.; Mahadi, S.

    1987-05-01

Upper Oligocene platform carbonates (Berai Formation) occur extensively on the Barito shelf in southeastern Kalimantan (Borneo) and are flanked northward by coeval slope and basinal deposits (Bongan Formation) which accumulated in the southwestern part of the Kutei basin. Isolated carbonate buildups equivalent to the Berai Formation also occur within the Kutei basin and were probably deposited on basement highs. The distribution of these facies is fairly well constrained by the study of outcrops, wells, and seismic profiles. The Berai Formation consists of diverse limestone types with a wide range of textures and with dominant skeletal components of large foraminifera, red algae, and corals. Deposition of the Berai Formation occurred in moderate- and high-energy shallow-marine conditions. Slope and basin facies occur in extensional basins adjacent to the shelfal carbonates and peripheral to isolated carbonate buildups. Slope deposits consist of hemipelagic claystone, debris-flow conglomerate, calciturbidite, and volcaniclastic intervals. Syndepositional downslope transport of slope deposits was an important process, as indicated by intervals containing redeposited debris flows, intraformational truncation surfaces, slide blocks, and associated shear planes. Recurrent movement on basin-margin faults and local volcanism probably perpetuated instability of slope deposits. Basinal deposits consist of calcareous claystone with intercalated thin, distal calciturbidite and volcaniclastic beds.

  16. Crustal architecture of the oblique-slip conjugate margins of George V Land and southeast Australia

    USGS Publications Warehouse

    Stagg, H.M.J.; Reading, A.M.

    2007-01-01

    A conceptual, lithospheric-scale cross-section of the conjugate, oblique-slip margins of George V Land, East Antarctica, and southeast Australia (Otway Basin) has been constructed based on the integration of seismic and sample data. This cross-section is characterised by asymmetry in width and thickness, and depth-dependent crustal extension at breakup in the latest Maastrichtian. The broad Antarctic margin (~360 km apparent rift width) developed on thick crust (~42 km) of the Antarctic craton, whereas the narrow Otway margin (~220 km) developed on the thinner crust (~31 km) of the Ross–Delamerian Orogen. The shallow basement (velocities ~5.5 km.s-1) and the deep continental crust (velocities >6.4 km.s-1) appear to be largely absent across the central rift, while the mid-crustal, probably granitic layer (velocities ~6 km.s-1) is preserved. Comparison with published numerical models suggests that the shallow basement and deep crust may have been removed by simple shear, whereas the mid-crust has been ductilely deformed.

  17. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
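    To make the consistency requirement concrete, here is a short worked derivation (our illustration, not reproduced from the paper) showing that maximizing the Boltzmann-Gibbs entropy subject only to marginal constraints reproduces the multiplication rule for independent events:

```latex
% Joint law p_{ij} for two events, constrained only by its marginals:
\max_{p}\; -\sum_{i,j} p_{ij}\ln p_{ij}
\quad\text{subject to}\quad
\sum_{j} p_{ij} = a_i, \qquad \sum_{i} p_{ij} = b_j.
% Stationarity of the Lagrangian with multipliers \lambda_i, \mu_j:
-\ln p_{ij} - 1 - \lambda_i - \mu_j = 0
\;\Longrightarrow\;
p_{ij} = e^{-1-\lambda_i}\, e^{-\mu_j} = a_i\, b_j,
% i.e., the multiplication rule for independent events. Replacing the
% Boltzmann-Gibbs form with a nonadditive entropy couples i and j at
% this step, introducing correlations not supported by the data.
```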

  18. Diversity and Spatiotemporal Distribution of Larval Odonate Assemblages in Temperate Neotropical Farm Ponds

    PubMed Central

    Pires, Mateus Marques; Kotzian, Carla Bender; Spies, Marcia Regina

    2014-01-01

Farm ponds help maintain diversity in altered landscapes. However, studies on the features that drive diversity in this type of habitat are still lacking in the Neotropics, especially for the insect fauna. We analyzed the spatial and temporal distribution of odonate larval assemblages in farm ponds. Odonates were sampled monthly at four farm ponds from March 2008 to February 2009 in a temperate montane region of southern Brazil. A small number of genera were frequent and accounted for most of the dominant fauna. The dominant genera composition differed among ponds. Local spatial drivers such as area, hydroperiod, and margin vegetation structure likely explain these results more than spatial predictors, due to the small size of the study area. Circular analysis detected a seasonal effect on assemblage abundance but not on richness. Seasonality in abundance was related to the life cycles of a few dominant genera. This result was explained by temperature rather than rainfall, owing to the temperate climate of the region studied. The persistence of dominant genera and the sparse occurrence of many taxa over time probably explain the lack of a seasonal pattern in assemblage richness. PMID:25527585

  19. The Impact of the Major Baltic Inflow of December 2014 on the Mercury Species Distribution in the Baltic Sea.

    PubMed

    Kuss, Joachim; Cordes, Florian; Mohrholz, Volker; Nausch, Günther; Naumann, Michael; Krüger, Siegfried; Schulz-Bull, Detlef E

    2017-10-17

The Baltic Sea is a marginal sea characterized by stagnation periods of several years. Oxygen consumption in its deep waters leads to the buildup of sulfide from sulfate reduction. Some of the microorganisms responsible for these processes also transform reactive ionic mercury to neurotoxic methylmercury. Episodic inflows of oxygenated saline water from the North Sea temporally re-establish oxic life in deep waters of the Baltic Sea. Thus, this sea is an especially important region to better understand mercury species distributions in connection with variable redox conditions. Mercury species were measured on three Baltic Sea campaigns, during the preinflow, ongoing inflow, and subsiding inflow of water, respectively, to the central basin. The inflowing water caused the removal of total mercury by 600 nmol m-2 and of methylmercury by 214 nmol m-2 in the Gotland Deep, probably via attachment of the mercury compounds to sinking particles. It appears likely that the consequences of the oxygenation of Baltic Sea deep waters, which are the coprecipitation of mercury species and the resettlement of the oxic deep waters, could lead to the enhanced transfer of accumulated mercury and methylmercury to the planktonic food chain and finally to fish.

  20. Tracking of plus-ends reveals microtubule functional diversity in different cell types

    NASA Astrophysics Data System (ADS)

    Shaebani, M. Reza; Pasula, Aravind; Ott, Albrecht; Santen, Ludger

    2016-07-01

Many cellular processes are tightly connected to the dynamics of microtubules (MTs). While in neuronal axons MTs mainly regulate intracellular trafficking, they participate in cytoskeleton reorganization in many other eukaryotic cells, enabling the cell to efficiently adapt to changes in the environment. We show that the functional differences of MTs in different cell types and regions are reflected in the dynamic properties of MT tips. Using the plus-end tracking protein EB1 to monitor growing MT plus-ends, we show that MT dynamics and life cycle in axons of human neurons significantly differ from those of fibroblast cells. The density of plus-ends, as well as the rescue and catastrophe frequencies, increase while the growth rate decreases toward the fibroblast cell margin. This results in a rather stable filamentous network structure and maintains the connection between nucleus and membrane. In contrast, plus-ends are uniformly distributed along the axons and exhibit diverse polymerization run times and spatially homogeneous rescue and catastrophe frequencies, leading to MT segments of various lengths. The probability distributions of the excursion length of polymerization and of the MT length both have nearly exponential tails, in agreement with the analytical predictions of a two-state model of MT dynamics.

  1. Semiparametric regression analysis of interval-censored competing risks data.

    PubMed

    Mao, Lu; Lin, Dan-Yu; Zeng, Donglin

    2017-09-01

    Interval-censored competing risks data arise when each study subject may experience an event or failure from one of several causes and the failure time is not observed directly but rather is known to lie in an interval between two examinations. We formulate the effects of possibly time-varying (external) covariates on the cumulative incidence or sub-distribution function of competing risks (i.e., the marginal probability of failure from a specific cause) through a broad class of semiparametric regression models that captures both proportional and non-proportional hazards structures for the sub-distribution. We allow each subject to have an arbitrary number of examinations and accommodate missing information on the cause of failure. We consider nonparametric maximum likelihood estimation and devise a fast and stable EM-type algorithm for its computation. We then establish the consistency, asymptotic normality, and semiparametric efficiency of the resulting estimators for the regression parameters by appealing to modern empirical process theory. In addition, we show through extensive simulation studies that the proposed methods perform well in realistic situations. Finally, we provide an application to a study on HIV-1 infection with different viral subtypes. © 2017, The International Biometric Society.

  2. Joint time/frequency-domain inversion of reflection data for seabed geoacoustic profiles and uncertainties.

    PubMed

    Dettmer, Jan; Dosso, Stan E; Holland, Charles W

    2008-03-01

    This paper develops a joint time/frequency-domain inversion for high-resolution single-bounce reflection data, with the potential to resolve fine-scale profiles of sediment velocity, density, and attenuation over small seafloor footprints (approximately 100 m). The approach utilizes sequential Bayesian inversion of time- and frequency-domain reflection data, employing ray-tracing inversion for reflection travel times and a layer-packet stripping method for spherical-wave reflection-coefficient inversion. Posterior credibility intervals from the travel-time inversion are passed on as prior information to the reflection-coefficient inversion. Within the reflection-coefficient inversion, parameter information is passed from one layer packet inversion to the next in terms of marginal probability distributions rotated into principal components, providing an efficient approach to (partially) account for multi-dimensional parameter correlations with one-dimensional, numerical distributions. Quantitative geoacoustic parameter uncertainties are provided by a nonlinear Gibbs sampling approach employing full data error covariance estimation (including nonstationary effects) and accounting for possible biases in travel-time picks. Posterior examination of data residuals shows the importance of including data covariance estimates in the inversion. The joint inversion is applied to data collected on the Malta Plateau during the SCARAB98 experiment.
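
    The device of passing parameter information between layer-packet inversions as one-dimensional marginals rotated into principal components can be sketched numerically. This is a hypothetical two-parameter example (made-up values, not the authors' code): decorrelate the posterior sample, store 1-D marginals, and re-impose the correlation when drawing prior samples for the next inversion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior sample from one layer-packet inversion
# (n_samples x n_params), with correlated parameters.
samples = rng.multivariate_normal(
    mean=[1500.0, 1.8], cov=[[100.0, 0.8], [0.8, 0.01]], size=5000)

# Rotate into principal components: decorrelate the sample so that
# 1-D marginals (partially) capture multi-dimensional correlations.
mean = samples.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(samples, rowvar=False))
rotated = (samples - mean) @ eigvecs          # uncorrelated coordinates

# 1-D numerical marginals in the rotated frame (histograms)
marginals = [np.histogram(rotated[:, j], bins=50, density=True)
             for j in range(rotated.shape[1])]

def draw_prior(n):
    """Draw independent samples from each 1-D marginal, then rotate
    back so the next inversion's prior carries the correlations."""
    cols = []
    for dens, edges in marginals:
        centers = 0.5 * (edges[:-1] + edges[1:])
        cols.append(rng.choice(centers, size=n, p=dens / dens.sum()))
    return np.column_stack(cols) @ eigvecs.T + mean

prior_draws = draw_prior(1000)
print(np.corrcoef(prior_draws, rowvar=False))  # correlation re-imposed
```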

  3. Analytical results for the statistical distribution related to a memoryless deterministic walk: dimensionality effect and mean-field models.

    PubMed

    Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto

    2005-08-01

    Consider a medium characterized by N points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary d-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule to go to the nearest point which has not been visited in the preceding mu steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial nonperiodic part of t steps (transient) and a final periodic part of p steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function I_d = I_{1/4}(1/2, (d+1)/2). The joint distribution S_N^{(mu,d)}(t,p) is the relevant quantity, and the marginal distributions previously studied are particular cases. We show that, for the memoryless deterministic tourist walk in Euclidean space, this distribution is S_∞^{(1,d)}(t,p) = [Γ(1 + I_d⁻¹)(t + I_d⁻¹)/Γ(t + p + I_d⁻¹)] δ_{p,2}, where t = 0, 1, 2, ..., Γ(z) is the gamma function and δ_{i,j} is the Kronecker delta. The mean-field models are the random link models, which correspond to d → ∞, and the random map model which, even for mu = 0, presents a nontrivial cycle distribution [S_N^{(0,rm)}(p) ∝ p⁻¹]: S_N^{(0,rm)}(t,p) = Γ(N)/{Γ[N+1-(t+p)] N^{t+p}}. The fundamental quantities are the number of explored points n_e = t+p and I_d. Although the obtained distributions are simple, they do not follow straightforwardly and they have been validated by numerical experiments.
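
    Because the mu = 1 walker's next move depends only on its current position, each trajectory can be generated by iterating a nearest-neighbour map until a point repeats. A minimal sketch under hypothetical settings (uniform random points, d = 2), tallying the empirical (t, p) statistics, which should concentrate on p = 2 as the δ_{p,2} factor requires:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

def tourist_walk(points):
    """Deterministic tourist walk for mu = 1: from the current point,
    always move to its nearest neighbour (revisiting the current point
    is forbidden). The next point depends only on the current one, so
    the first revisit closes the attractor. Returns (t, p)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nearest = d.argmin(axis=1)          # deterministic next-point map
    path, seen, current = [], {}, 0
    while current not in seen:
        seen[current] = len(path)
        path.append(current)
        current = nearest[current]
    t = seen[current]                   # transient length
    return t, len(path) - t            # period p is always 2 here

stats = Counter(tourist_walk(rng.random((50, 2))) for _ in range(2000))
print(stats.most_common(5))   # all mass at p = 2, decaying in t
```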

  4. ProbOnto: ontology and knowledge base of probability distributions.

    PubMed

    Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala

    2016-09-01

    Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology nor any database allowing programmatic access exists. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. http://probonto.org mjswat@ebi.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
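
    As a concrete illustration of the kind of re-parameterization formulas such a resource catalogues, here is the standard conversion between two common lognormal parameterizations (a generic example, not ProbOnto's interface):

```python
import numpy as np

def lognormal_mu_sigma_from_mean_var(mean, var):
    """Convert a lognormal's (mean, variance) to the (mu, sigma) of the
    underlying normal: one of the re-parameterizations a distribution
    knowledge base must encode."""
    sigma2 = np.log(1.0 + var / mean**2)
    mu = np.log(mean) - 0.5 * sigma2
    return mu, np.sqrt(sigma2)

def lognormal_mean_var_from_mu_sigma(mu, sigma):
    """Inverse conversion, back to the distribution's own moments."""
    mean = np.exp(mu + 0.5 * sigma**2)
    var = (np.exp(sigma**2) - 1.0) * np.exp(2 * mu + sigma**2)
    return mean, var

mu, sigma = lognormal_mu_sigma_from_mean_var(2.0, 1.5)
print(lognormal_mean_var_from_mu_sigma(mu, sigma))  # -> (2.0, 1.5)
```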

  5. Time-lagged response of carabid species richness and composition to past management practices and landscape context of semi-natural field margins.

    PubMed

    Alignier, Audrey; Aviron, Stéphanie

    2017-12-15

    Field margins are key features for the maintenance of biodiversity and associated ecosystem services in agricultural landscapes. Little is known about the effects of management practices on old semi-natural field margins, and their historical dimension regarding past management practices and landscape context is rarely considered. In this paper, the relative influence of recent and past management practices and landscape context (during the last five years) was assessed on the local biodiversity (species richness and composition) of carabid assemblages of field margins in agricultural landscapes of northwestern France. The results showed that recent patterns of carabid species richness and composition were best explained by management practices and landscape context measured four or five years earlier. This suggests the existence of a time lag in the response of carabid assemblages to past environmental conditions of field margins. The relative contribution of past management practices and past landscape context varied depending on the spatial scale at which landscape context was taken into account. Carabid species richness was higher in grazed or sprayed field margins, probably due to increased heterogeneity in habitat conditions. Field margins surrounded by grasslands and crops harbored species associated with open habitats, whilst forest species dominated field margins surrounded by woodland. The landscape effect was greater at fine spatial scale, within 50 m around field margins. The present study highlights the importance of considering time-lagged responses of biodiversity when managing the environment. It also suggests that old semi-natural field margins should not be considered as undisturbed habitats but rather as management units that are part of farming activities in agricultural landscapes, as arable fields are. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one-dimensional projections, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  7. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions: the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the distributions considered for this region.
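
    A sketch of the distribution-comparison step with scipy standing in for Easyfit/Matlab, on hypothetical inter-event times (note the caveat that K-S p-values are optimistic when the parameters are fitted from the same data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical inter-event times (years); a real analysis would use
# the homogenized earthquake catalogue instead.
intervals = rng.weibull(1.4, size=80) * 12.0

fits = {
    "Weibull (2-par)": (stats.weibull_min,
                        stats.weibull_min.fit(intervals, floc=0)),
    "Frechet": (stats.invweibull, stats.invweibull.fit(intervals)),
    "Weibull (3-par)": (stats.weibull_min, stats.weibull_min.fit(intervals)),
}
for name, (dist, params) in fits.items():
    # K-S statistic against each fitted CDF (p-values optimistic here,
    # since the parameters were estimated from these same data)
    d, p = stats.kstest(intervals, dist.cdf, args=params)
    print(f"{name:15s} K-S D = {d:.3f}  p = {p:.3f}")
```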

  8. The Demand for College Education in Postwar Japan.

    ERIC Educational Resources Information Center

    Nakata, Yoshi-fumi; Mosk, Carl

    1987-01-01

    The authors evaluate the extent to which economic factors underlie the expansion of Japanese college applications. Findings indicate that "marginal investors" respond to short-run economic factors--including direct costs, household liquidity, and probability of entering a large firm--that govern higher education. Educational quality has…

  9. Geology of the offshore Southeast Georgia Embayment, U.S. Atlantic continental margin, based on multichannel seismic reflection profiles

    USGS Publications Warehouse

    Buffler, Richard T.; Watkins, Joel S.; Dillon, William P.

    1979-01-01

    The sedimentary section is divided into three major seismic intervals. The intervals are separated by unconformities and can be mapped regionally. The oldest interval ranges in age from Early Cretaceous through middle Late Cretaceous, although it may contain Jurassic rocks where it thickens beneath the Blake Plateau. It probably consists of continental to nearshore clastic rocks where it onlaps basement and grades seaward to a restricted carbonate platform facies (dolomite-evaporite). The middle interval (Upper Cretaceous) is characterized by prograding clinoforms interpreted as open marine slope deposits. This interval represents a Late Cretaceous shift of the carbonate shelf margin from the Blake Escarpment shoreward to about its present location, probably due to a combination of continued subsidence, an overall Late Cretaceous rise in sea level, and strong currents across the Blake Plateau. The youngest (Cenozoic) interval represents a continued seaward progradation of the continental shelf and slope. Cenozoic sedimentation on the Blake Plateau was much abbreviated owing mainly to strong currents.

  10. Retrogressive hydration of calc-silicate xenoliths in the eastern Bushveld complex: evidence for late magmatic fluid movement

    NASA Astrophysics Data System (ADS)

    Wallmach, T.; Hatton, C. J.; De Waal, S. A.; Gibson, R. L.

    1995-11-01

    Two calc-silicate xenoliths in the Upper Zone of the Bushveld complex contain mineral assemblages which permit delineation of the metamorphic path followed after incorporation of the xenoliths into the magma. Peak metamorphism in these xenoliths occurred at T = 1100-1200 °C and P < 1.5 kbar. Retrograde metamorphism, probably coinciding with the late magmatic stage, is characterized by the breakdown of akermanite to monticellite and wollastonite at 700 °C and the growth of vesuvianite from melilite. The latter implies that water-rich fluids (X_CO2 < 0.2) were present and probably circulating through the cooling magmatic pile. In contrast, calc-silicate xenoliths within the lower zones of the Bushveld complex, namely in the Marginal and Critical Zones, also contain melilite, monticellite and additional periclase, with only rare development of vesuvianite. This suggests that the Upper Zone cumulate pile was much 'wetter' in the late-magmatic stage than the earlier-formed Critical and Marginal Zone cumulate piles.

  11. A comparator-hypothesis account of biased contingency detection.

    PubMed

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
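
    The outcome-density bias described here can be reproduced in a few lines of simulation. The sketch below uses a finite-trial Rescorla-Wagner learner with a context cue (one of the two models compared in the paper; all parameter values are illustrative): even though the programmed cue-outcome contingency is zero, the cue's associative strength after a short training session grows with the marginal probability of the outcome:

```python
import numpy as np

rng = np.random.default_rng(4)

def rescorla_wagner(n_trials, p_cue, p_out, alpha=0.1, lam=1.0):
    """Finite-trial Rescorla-Wagner run with a context cue that is
    always present. Cue and outcome are statistically independent,
    so the true contingency Delta-P is exactly zero."""
    v_cue, v_ctx = 0.0, 0.0
    for _ in range(n_trials):
        cue = rng.random() < p_cue
        out = rng.random() < p_out
        pred = v_ctx + (v_cue if cue else 0.0)
        err = (lam if out else 0.0) - pred
        v_ctx += alpha * err
        if cue:
            v_cue += alpha * err
    return v_cue

# Judged contingency (proxied by V_cue) grows with the marginal
# probability of the outcome despite Delta-P = 0 in every condition.
for p_out in (0.2, 0.5, 0.8):
    v = np.mean([rescorla_wagner(40, p_cue=0.5, p_out=p_out)
                 for _ in range(500)])
    print(f"P(outcome) = {p_out}: mean V_cue after 40 trials = {v:.3f}")
```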

  12. Timing and adequate attendance of antenatal care visits among women in Ethiopia.

    PubMed

    Yaya, Sanni; Bishwajit, Ghose; Ekholuenetale, Michael; Shah, Vaibhav; Kadio, Bernard; Udenigwe, Ogochukwu

    2017-01-01

    Although ANC services are increasingly available to women in low- and middle-income countries, their inadequate use persists. This suggests a misalignment between the aims of the services and maternal beliefs and circumstances. Owing to the dearth of studies examining the timing and adequacy of content of care, the current study investigates the timing and frequency of ANC visits in Ethiopia. Data were obtained from the nationally representative 2011 Ethiopian Demographic and Health Survey (EDHS), which used a two-stage cluster sampling design to provide estimates of the health and demographic variables of interest for the country. Our study focused on a sample of 10,896 women with a history of at least one childbirth event. Percentages for timing and adequacy of ANC visits were computed across the levels of selected factors. Variables associated at the 5% significance level were examined in a multivariable logistic regression model of the association between timing and frequency of ANC visits and the explanatory variables, controlling for covariates. Furthermore, we estimated marginal effects, with corresponding 95% CIs, from the covariate-adjusted logistic regressions for delayed initiation of ANC visits and inadequate ANC attendance; the method averages predicted probabilities with weights reflecting the covariate distribution in the population. Results indicate that 66.3% of women did not use ANC in the first trimester and 22.3% had fewer than 4 ANC visits. This study is distinctive in examining the association between delayed ANC visits and adequacy of ANC visits using a multivariable logistic model and in estimating marginal effects from predicted probabilities. Results revealed that older age groups had higher odds of inadequate ANC visits. Moreover, type of place of residence was associated with delayed initiation of ANC visits, with rural women having higher odds of delayed initiation (OR = 1.65; 95%CI: 1.26-2.18); however, rural women had a 44% reduction in the odds of having inadequate ANC visits. In addition, multiparous women showed higher odds of delayed initiation of ANC visits compared with primigravid women (OR = 2.20; 95%CI: 1.07-2.69), but a 36% reduction in the odds of having inadequate ANC visits. Odds of inadequate ANC visits were higher among women engaged in sales/business, agriculture, skilled manual and other jobs compared with women who currently do not work, after adjusting for covariates. From the predictive margins, if the distribution of all covariates remained as observed among respondents but everyone were aged 15-19 years, we would expect 71.8% delayed initiation of ANC visits; at ages 20-24 years, 73.4%; 25-29 years, 66.5%; 30-34 years, 64.8%; 35-39 years, 65.6%; 40-44 years, 59.6%; and 45-49 years, 70.1%. If instead the distribution of age and of all other covariates remained as observed but no respondent lived in a rural area, we would expect about 61.4% delayed initiation of ANC visits; if everyone lived in a rural area, we would expect 71.6%. Model III presents the predictive margins of all factors examined for delayed initiation of ANC visits, while Model IV presents the predictive marginal effects of the determinants of adequacy of ANC visits.
The precise mechanism by which these factors affect ANC visits remains unclear. Demand-side factors such as women's empowerment, financial support from the husband, and knowledge of the timing, frequency and expectations of ANC visits may mediate the effects of the factors found to be associated in this study. Supply-side factors such as the quality of ANC services, skilled staff, and the geographic location of health centers may likewise mediate their effects through the highlighted factors. Irrespective of knowledge about the precise mechanism of action, policy makers could focus on improving women's empowerment, improving women's education, reducing wealth inequity, and facilitating improved utilization of ANC through modifications of supply-side factors such as geographic location, with a focus on hard-to-reach women.
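
    The predictive-margins recipe used above (fit a covariate-adjusted logistic regression, set the covariate of interest to one level for everyone, keep the other covariates as observed, and average the predicted probabilities) can be sketched as follows. The data and variable names are synthetic inventions, not the EDHS variables:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic analogue of the analysis: delayed ANC initiation modelled
# on residence and age group (invented data, not the EDHS).
n = 2000
df = pd.DataFrame({
    "rural": rng.integers(0, 2, n),
    "age": rng.choice(["15-19", "20-24", "25-29"], n),
})
logit_true = -0.3 + 0.5 * df["rural"] + 0.2 * (df["age"] == "15-19")
df["delayed"] = rng.random(n) < 1 / (1 + np.exp(-logit_true))

X = pd.get_dummies(df[["rural", "age"]], drop_first=True).astype(float)
X = sm.add_constant(X)
fit = sm.Logit(df["delayed"].astype(float), X).fit(disp=0)

# Predictive margin for residence: set everyone urban (or rural),
# leave all other covariates as observed, average the predictions.
for value, label in [(0, "urban"), (1, "rural")]:
    X_cf = X.copy()
    X_cf["rural"] = value
    print(f"everyone {label}: expected % delayed = "
          f"{100 * fit.predict(X_cf).mean():.1f}")
```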

  13. Spatial-temporal variation of marginal land suitable for energy plants from 1990 to 2010 in China

    PubMed Central

    Jiang, Dong; Hao, Mengmeng; Fu, Jingying; Zhuang, Dafang; Huang, Yaohuan

    2014-01-01

    Energy plants are the main source of bioenergy, which will play an increasingly important role in future energy supplies. With limited cultivated land resources in China, the development of energy plants may primarily rely on marginal land. In this study, based on land use data from 1990 to 2010 (in five-year periods) and other auxiliary data, the distribution of marginal land suitable for energy plants was determined using a multi-factor integrated assessment method. Variations in land use type and in the spatial distribution of marginal land suitable for energy plants were analyzed for the different periods. The results indicate that the total amount of marginal land suitable for energy plants decreased from 136.501 million ha to 114.225 million ha between 1990 and 2010. The reduced land use types are primarily shrub land, sparse forest land, moderately dense grassland and sparse grassland, and the largest variation areas are located in Guangxi, Tibet, Heilongjiang, Xinjiang and Inner Mongolia. The results of this study will provide more effective data reference and decision-making support for the long-term planning of bioenergy resources. PMID:25056520

  14. Short (≤ 1 mm) positive surgical margin and risk of biochemical recurrence after radical prostatectomy.

    PubMed

    Shikanov, Sergey; Marchetti, Pablo; Desai, Vikas; Razmaria, Aria; Antic, Tatjana; Al-Ahmadie, Hikmat; Zagaja, Gregory; Eggener, Scott; Brendler, Charles; Shalhav, Arieh

    2013-04-01

    WHAT'S KNOWN ON THE SUBJECT? AND WHAT DOES THE STUDY ADD?: It has been suggested that a very short positive margin does not confer additional risk of BCR after radical prostatectomy. This study shows that even a very short PSM is associated with increased risk of BCR. To re-evaluate, in a larger cohort with longer follow-up, our previously reported finding that a positive surgical margin (PSM) ≤ 1 mm may not confer an additional risk for biochemical recurrence (BCR) compared with a negative surgical margin (NSM). Margin status and length were evaluated in 2866 men treated with radical prostatectomy (RP) for clinically localized prostate cancer at our institution from 1994 to 2009. We compared the BCR-free survival probability of men with NSMs, a PSM ≤ 1 mm, and a PSM > 1 mm using the Kaplan-Meier method and a Cox regression model adjusted for preoperative prostate-specific antigen (PSA) level, age, pathological stage and pathological Gleason score (GS). Compared with a NSM, a PSM ≤ 1 mm was associated with 17% lower 3-year BCR-free survival for men with pT3 and GS ≥ 7 tumours and a 6% lower 3-year BCR-free survival for men with pT2 and GS ≤ 6 tumours (log-rank P < 0.001 for all). In the multivariate model, a PSM ≤ 1 mm was associated with a probability of BCR twice as high as that for a NSM (hazard ratio [HR] 2.2), as were a higher PSA level (HR 1.04), higher pathological stage (HR 2.7) and higher pathological GS (HR 3.7 [all P < 0.001]). In men with non-organ-confined or high grade prostate cancer, a PSM ≤ 1 mm has a significant adverse impact on BCR rates. © 2012 The Authors. BJU International © 2012 BJU International.

  15. Adequacy of inhale/exhale breathhold CT based ITV margins and image-guided registration for free-breathing pancreas and liver SBRT.

    PubMed

    Yang, Wensha; Fraass, Benedick A; Reznik, Robert; Nissen, Nicholas; Lo, Simon; Jamil, Laith H; Gupta, Kapil; Sandler, Howard; Tuli, Richard

    2014-01-09

    To evaluate use of breath-hold CTs and implanted fiducials for definition of the internal target volume (ITV) margin for upper abdominal stereotactic body radiation therapy (SBRT), and to study the statistics of inter- and intra-fractional motion. 11 patients treated with SBRT for locally advanced pancreatic cancer (LAPC) or liver cancer were included in the study. Patients underwent fiducial implantation, free-breathing CT and breath-hold CTs at end inhalation/exhalation. All patients were planned and treated with SBRT using volumetric modulated arc therapy (VMAT). Two margin strategies were studied: Strategy I uses PTV = ITV + 3 mm; Strategy II uses PTV = GTV + 1.5 cm. Both CBCT and kV orthogonal images were taken and analyzed for setup before patient treatments. Tumor motion statistics based on skeletal registration and on fiducial registration were analyzed by fitting to Gaussian functions. All 11 patients met SBRT planning dose constraints using Strategy I. Average ITV margins for the 11 patients were 2 mm RL, 6 mm AP, and 6 mm SI. Skeletal registration resulted in a high probability (RL = 69%, AP = 4.6%, SI = 39%) that part of the tumor would be outside the ITV. With the 3 mm ITV expansion (Strategy I), the probability was reduced to RL 32%, AP 0.3%, SI 20% for skeletal registration, and RL 1.2%, AP 0%, SI 7% for fiducial registration. All 7 pancreatic patients and 2 liver patients failed to meet SBRT dose constraints using Strategy II. The liver dose was increased by 36% for the other 2 liver patients that met the SBRT dose constraints with Strategy II. Image guidance matching to skeletal anatomy is inadequate for SBRT positioning in the upper abdomen, and usage of fiducials is highly recommended. Even with fiducial implantation and definition of an ITV, a minimal 3 mm planning margin around the ITV is needed to accommodate intra-fractional uncertainties.
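
    Under the Gaussian motion model used for the fitted statistics, the probability that residual motion exceeds a margin along one axis is a simple two-sided tail computation. The sigmas below are illustrative assumptions, not the paper's fitted values:

```python
from scipy import stats

def prob_outside(margin_mm, sigma_mm, mean_mm=0.0):
    """Two-sided exceedance probability for Gaussian residual motion:
    P(|motion| > margin) under N(mean, sigma)."""
    upper = stats.norm.sf(margin_mm, loc=mean_mm, scale=sigma_mm)
    lower = stats.norm.cdf(-margin_mm, loc=mean_mm, scale=sigma_mm)
    return upper + lower

# Illustrative per-axis sigmas (mm), not the study's fits
for axis, sigma in [("RL", 3.0), ("AP", 1.5), ("SI", 4.0)]:
    print(f"{axis}: P(outside a 3 mm expansion) = "
          f"{prob_outside(3.0, sigma):.1%}")
```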

  16. Marginal instability threshold of magnetosonic waves in kappa distributed plasma

    NASA Astrophysics Data System (ADS)

    Bashir, M. F.; Manzoor, M. Z.; Ilie, R.; Yoon, P. H.; Miasli, M. S.

    2017-12-01

    The dispersion relation of the magnetosonic wave is studied for a non-extensive anisotropic counter-streaming distribution that follows Tsallis statistics. The effects of the non-extensivity parameter (q), the counter-streaming parameter (P) and wave-particle interactions on the growth rate and on the marginal instability threshold condition of the magnetosonic (MS) mode are analyzed, to provide a possible explanation of the different regions of the Bale diagram obtained from solar wind data at 1 AU, represented as a plot of temperature anisotropy (T⊥/T∥) versus parallel plasma beta (β∥). It is shown that most regions of the Bale diagram are bounded by the MS instability under different conditions and are best fitted by the non-extensive distribution. The results for the bi-kappa and bi-Maxwellian distributions are also recovered in the appropriate limits (κ = 1/(q - 1) finite and q → 1, respectively).

  17. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  18. Margin and sensitivity methods for security analysis of electric power systems

    NASA Astrophysics Data System (ADS)

    Greene, Scott L.

    Reliable operation of large scale electric power networks requires that system voltages and currents stay within design limits. Operation beyond those limits can lead to equipment failures and blackouts. Security margins measure the amount by which system loads or power transfers can change before a security violation, such as an overloaded transmission line, is encountered. This thesis shows how to efficiently compute security margins defined by limiting events and instabilities, and the sensitivity of those margins with respect to assumptions, system parameters, operating policy, and transactions. Security margins to voltage collapse blackouts, oscillatory instability, generator limits, voltage constraints and line overloads are considered. The usefulness of computing the sensitivities of these margins with respect to interarea transfers, loading parameters, generator dispatch, transmission line parameters, and VAR support is established for networks as large as 1500 buses. The sensitivity formulas presented apply to a range of power system models. Conventional sensitivity formulas such as line distribution factors, outage distribution factors, participation factors and penalty factors are shown to be special cases of the general sensitivity formulas derived in this thesis. The sensitivity formulas readily accommodate sparse matrix techniques. Margin sensitivity methods are shown to work effectively for avoiding voltage collapse blackouts caused by either saddle node bifurcation of equilibria or immediate instability due to generator reactive power limits. Extremely fast contingency analysis for voltage collapse can be implemented with margin sensitivity based rankings. Interarea transfer can be limited by voltage limits, line limits, or voltage stability. The sensitivity formulas presented in this thesis apply to security margins defined by any limit criteria. A method to compute transfer margins by directly locating intermediate events reduces the total number of loadflow iterations required by each margin computation and provides sensitivity information at minimal additional cost. Estimates of the effect of simultaneous transfers on the transfer margins agree well with the exact computations for a network model derived from a portion of the U.S. grid. The accuracy of the estimates over a useful range of conditions and the ease of obtaining the estimates suggest that the sensitivity computations will be of practical value.

  19. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data.

    PubMed

    O'Reilly, Joseph E; Donoghue, Philip C J

    2018-03-01

    Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data.

  20. The Efficacy of Consensus Tree Methods for Summarizing Phylogenetic Relationships from a Posterior Sample of Trees Estimated from Morphological Data

    PubMed Central

    O’Reilly, Joseph E; Donoghue, Philip C J

    2018-01-01

    Abstract Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data. PMID:29106675

  1. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  2. Effective elastic thickness along the conjugate passive margins of India, Madagascar and Antarctica: A re-evaluation using the Hermite multitaper Bouguer coherence application

    NASA Astrophysics Data System (ADS)

    Ratheesh-Kumar, R. T.; Xiao, Wenjiao

    2018-05-01

    Gondwana correlation studies had rationally positioned the western continental margin of India (WCMI) against the eastern continental margin of Madagascar (ECMM), and the eastern continental margin of India (ECMI) against the eastern Antarctica continental margin (EACM). This contribution computes the effective elastic thickness (Te) of the lithospheres of these once-conjugated continental margins using the multitaper Bouguer coherence method. The results reveal significantly low strength values (Te ∼ 2 km) in the central segment of the WCMI that correlate with consistently low Te values (2-3 km) obtained throughout the entire marginal length of the ECMM. This result is consistent with previous Te estimates of these margins and supports the idea that the low-Te segments in the central part of the WCMI and along the ECMM represent paleo-rift inception points of lithospheric margins that were thermally and mechanically weakened by the combined action of the Marion hotspot and lithospheric extension during rifting. The uniformly low Te value (∼2 km) along the EACM indicates a mechanically weak lithospheric margin, probably due to considerable stretching of the lithosphere, considering the fact that this margin remained almost stationary throughout its rift history. In contrast, the ECMI has comparatively high Te variations (5-11 km) that lack any correlation with the regional tectonic setting. Using gravity forward and inversion applications, we find a leading-order influence of sediment load on the flexural properties of this marginal lithosphere. The study concludes that the thick pile of Bengal Fan sediments on the ECMI has masked and erased the signal of the original load-induced topography, and its gravity effect has biased the long-wavelength part of the observed gravity signal. The resulting lack of correlation between the flat topography and the deep lithospheric flexure introduces a bias into the flexure modeling, which likely accounts for the relatively high Te estimates.

  3. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    NASA Astrophysics Data System (ADS)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution, where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution, covering all irradiance values smaller than the 90th percentile, can be described with reasonable accuracy (i.e., within 20%) by a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean, like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
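
    The depth trend described here (large skewness and excess kurtosis near the surface, near-Gaussian behaviour at about 10 m, and a lognormal fit restricted to values below the 90th percentile) can be illustrated with synthetic irradiance samples standing in for the measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic stand-ins for normalized irradiance Ed/<Ed>: strongly
# right-skewed near the surface, near-Gaussian at ~10 m depth.
samples = {
    "~0.5 m (synthetic)": rng.lognormal(0.0, 0.7, 20_000),
    "~10 m (synthetic)": rng.normal(1.0, 0.05, 20_000),
}
for name, e_d in samples.items():
    e_d = e_d / e_d.mean()                       # normalize by <Ed>
    shape, loc, scale = stats.lognorm.fit(e_d, floc=0)
    q90 = np.quantile(e_d, 0.90)                 # body of the distribution
    grid = np.linspace(e_d.min(), q90, 200)
    emp = np.searchsorted(np.sort(e_d), grid) / e_d.size
    err = np.abs(emp - stats.lognorm.cdf(grid, shape, loc, scale)).max()
    print(f"{name}: skew = {stats.skew(e_d):.2f}, "
          f"excess kurtosis = {stats.kurtosis(e_d):.2f}, "
          f"max CDF misfit below 90th pct = {err:.3f}")
```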

  4. Marginalized multilevel hurdle and zero-inflated models for overdispersed and correlated count data with excess zeros.

    PubMed

    Kassahun, Wondwosen; Neyens, Thomas; Molenberghs, Geert; Faes, Christel; Verbeke, Geert

    2014-11-10

    Count data are collected repeatedly over time in many applications, such as biology, epidemiology, and public health. Such data are often characterized by the following three features. First, correlation due to the repeated measures is usually accounted for using subject-specific random effects, which are assumed to be normally distributed. Second, the sample variance may exceed the mean, and hence, the theoretical mean-variance relationship is violated, leading to overdispersion. This is usually allowed for based on a hierarchical approach, combining a Poisson model with gamma distributed random effects. Third, an excess of zeros beyond what standard count distributions can predict is often handled by either the hurdle or the zero-inflated model. A zero-inflated model assumes two processes as sources of zeros and combines a count distribution with a discrete point mass as a mixture, while the hurdle model separately handles zero observations and positive counts, where then a truncated-at-zero count distribution is used for the non-zero state. In practice, however, all these three features can appear simultaneously. Hence, a modeling framework that incorporates all three is necessary, and this presents challenges for the data analysis. Such models, when conditionally specified, will naturally have a subject-specific interpretation. However, adopting their purposefully modified marginalized versions leads to a direct marginal or population-averaged interpretation for parameter estimates of covariate effects, which is the primary interest in many applications. In this paper, we present a marginalized hurdle model and a marginalized zero-inflated model for correlated and overdispersed count data with excess zero observations and then illustrate these further with two case studies. The first dataset focuses on the Anopheles mosquito density around a hydroelectric dam, while adolescents' involvement in work, to earn money and support their families or themselves, is studied in the second example. Sub-models, which result from omitting zero-inflation and/or overdispersion features, are also considered for comparison's purpose. Analysis of the two datasets showed that accounting for the correlation, overdispersion, and excess zeros simultaneously resulted in a better fit to the data and, more importantly, that omission of any of them leads to incorrect marginal inference and erroneous conclusions about covariate effects. Copyright © 2014 John Wiley & Sons, Ltd.
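
    For reference, the likelihood of a basic (conditional) hurdle model, the building block that the paper marginalizes and extends with multilevel random effects, can be written down and fitted directly. This sketch uses synthetic data and omits the random effects and the marginalized mean structure entirely:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(7)

# Synthetic counts with excess zeros (no multilevel structure here)
n = 1000
x = rng.normal(size=n)
p_pos = expit(-0.5 + 1.0 * x)        # hurdle part: P(y > 0)
lam = np.exp(0.3 + 0.5 * x)          # positive part: truncated Poisson
y = np.zeros(n, dtype=int)
for i in np.flatnonzero(rng.random(n) < p_pos):
    while y[i] == 0:                 # rejection draw of a zero-truncated
        y[i] = rng.poisson(lam[i])   # Poisson variate

def nll(theta):
    """Negative log-likelihood of the basic hurdle model."""
    b0, b1, g0, g1 = theta
    p = expit(b0 + b1 * x)
    mu = np.exp(g0 + g1 * x)
    zero, pos = (y == 0), (y > 0)
    # y = 0: log(1 - p); y > 0: log p + zero-truncated Poisson log-pmf
    ll = np.log1p(-p[zero]).sum()
    ll += (np.log(p[pos]) + y[pos] * np.log(mu[pos]) - mu[pos]
           - gammaln(y[pos] + 1) - np.log1p(-np.exp(-mu[pos]))).sum()
    return -ll

fit = minimize(nll, np.zeros(4), method="BFGS")
print("(b0, b1, g0, g1) =", np.round(fit.x, 2))  # ~ (-0.5, 1.0, 0.3, 0.5)
```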

  5. Probabilistic margin evaluation on accidental transients for the ASTRID reactor project

    NASA Astrophysics Data System (ADS)

    Marquès, Michel

    2014-06-01

    ASTRID is a technological demonstrator of the Sodium-cooled Fast Reactor (SFR) under development. The conceptual design studies are being conducted in accordance with the Generation IV reactor objectives, particularly in terms of improving safety. For the hypothetical events belonging to the accidental category "severe accident prevention situations", which have a very low frequency of occurrence, the safety demonstration is no longer based on a deterministic demonstration with conservative assumptions on models and parameters but on a "Best-Estimate Plus Uncertainty" (BEPU) approach. This BEPU approach is presented in this paper for an Unprotected Loss-of-Flow (ULOF) event. The Best-Estimate (BE) analysis of this ULOF transient is performed with the CATHARE2 code, which is the French reference system code for SFR applications. The objective of the BEPU analysis is twofold: first, to evaluate the safety margin to sodium boiling taking into account the uncertainties on the input parameters of the CATHARE2 code (twenty-two uncertain input parameters have been identified, which can be classified into five groups: reactor power, accident management, pump characteristics, reactivity coefficients, and thermal parameters and head losses); secondly, to quantify the contribution of each input uncertainty to the overall uncertainty of the safety margin, in order to refocus R&D efforts on the most influential factors. This paper focuses on the methodological aspects of the evaluation of the safety margin. At least for the preliminary phase of the project (conceptual design), a probabilistic criterion has been fixed in the context of this BEPU analysis; this criterion is the value of the margin to sodium boiling that has a 95% probability of being exceeded, obtained with a confidence level of 95% (i.e., the M5,95 percentile of the margin distribution). This paper presents two methods used to assess this percentile, the Wilks method and the bootstrap method; the effectiveness of the two methods is compared on the basis of 500 simulations performed with the CATHARE2 code. We conclude that, with only 100 simulations performed with the CATHARE2 code, which is a workable number of simulations in the conceptual design phase of the ASTRID project, where the models and hypotheses are often modified, it is best to evaluate the M5,95 percentile of the margin to sodium boiling using the bootstrap method, which provides a slightly conservative result. On the other hand, in order to obtain an accurate estimation of the M5,95 percentile, for the safety report for example, it will be necessary to perform at least 300 simulations with the CATHARE2 code. In this case, both methods (Wilks and bootstrap) give equivalent results.
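
    The two percentile-estimation methods compared in the abstract can be sketched as follows, with synthetic margins standing in for CATHARE2 results (coverage β and confidence γ both 0.95 for M5,95):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

beta, gamma = 0.95, 0.95     # coverage and confidence for M5,95

# First-order Wilks: number of runs needed so that the sample minimum
# bounds the 5th percentile with 95% confidence.
n_wilks = int(np.ceil(np.log(1 - gamma) / np.log(beta)))
print("first-order Wilks sample size:", n_wilks)      # -> 59

# Synthetic margins to sodium boiling (K), standing in for 100 runs.
margins = rng.normal(40.0, 8.0, 100)
n = margins.size

# Wilks with n runs: the largest order statistic that still bounds the
# 5th percentile with >= 95% confidence (here the 2nd smallest value).
k = int(stats.binom.ppf(1 - gamma, n, 1 - beta))
wilks_bound = np.sort(margins)[max(k - 1, 0)]

# Bootstrap: resample the empirical 5th percentile and take the lower
# 5% quantile of its bootstrap distribution as the M5,95 estimate.
boot = np.array([np.percentile(rng.choice(margins, n), 5)
                 for _ in range(5000)])
boot_bound = np.percentile(boot, 5)
print(f"Wilks M5,95 = {wilks_bound:.1f} K, "
      f"bootstrap M5,95 = {boot_bound:.1f} K")
```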

  6. Impact of prostate weight on probability of positive surgical margins in patients with low-risk prostate cancer after robotic-assisted laparoscopic radical prostatectomy.

    PubMed

    Marchetti, Pablo E; Shikanov, Sergey; Razmaria, Aria A; Zagaja, Gregory P; Shalhav, Arieh L

    2011-03-01

    To evaluate the impact of prostate weight (PW) on probability of positive surgical margin (PSM) in patients undergoing robotic-assisted radical prostatectomy (RARP) for low-risk prostate cancer. The cohort consisted of 690 men with low-risk prostate cancer (clinical stage T1c, prostate-specific antigen <10 ng/mL, biopsy Gleason score ≤6) who underwent RARP with bilateral nerve-sparing at our institution by 1 of 2 surgeons from 2003 to 2009. PW was obtained from the pathologic specimen. The association between probability of PSM and PW was assessed with univariate and multivariate logistic regression analysis. A PSM was identified in 105 patients (15.2%). Patients with PSM had significantly higher prostate-specific antigen (P = .04), smaller prostates (P = .0001), higher Gleason score (P = .004), and higher pathologic stage (P < .0001). After logistic regression, we found a significant inverse relation between PSM and PW (OR 0.97; 95% confidence interval [CI] 0.96, 0.99; P = .0003) in univariate analysis. This remained significant in the multivariate model (OR 0.98; 95% CI 0.96, 0.99; P = .006) adjusting for age, body mass index, surgeon experience, pathologic Gleason score, and pathologic stage. In this multivariate model, the predicted probabilities of PSM for 25-, 50-, 100-, and 150-g prostates were 22% (95% CI 16%, 30%), 13% (95% CI 11%, 16%), 5% (95% CI 1%, 8%), and 1% (95% CI 0%, 3%), respectively. Lower PW is independently associated with higher probability of PSM in low-risk patients undergoing RARP with bilateral nerve-sparing. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. The influence of maximum magnitude on seismic-hazard estimates in the Central and Eastern United States

    USGS Publications Warehouse

    Mueller, C.S.

    2010-01-01

    I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.

  8. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
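
    The single-integral form and the trapezoidal evaluation described in the abstract amount to the following computation. The distributions below are illustrative choices, picked precisely because neither is normal:

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# P(slip) = P(available < required)
#         = integral of f_req(mu) * F_avail(mu) d(mu),
# the single-integral form, evaluated with the trapezoidal rule.
f_req = stats.norm(loc=0.20, scale=0.03)       # required friction (gait)
f_avail = stats.lognorm(s=0.25, scale=0.35)    # available friction (floor)

mu = np.linspace(0.0, 1.0, 2001)               # friction-coefficient grid
p_slip = trapezoid(f_req.pdf(mu) * f_avail.cdf(mu), mu)
print(f"P(slip per step) = {p_slip:.2e}")

# Monte Carlo cross-check of the same probability
rng = np.random.default_rng(9)
n = 1_000_000
mc = (f_avail.rvs(size=n, random_state=rng)
      < f_req.rvs(size=n, random_state=rng)).mean()
print(f"Monte Carlo check = {mc:.2e}")
```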

  9. Integrated-Circuit Pseudorandom-Number Generator

    NASA Technical Reports Server (NTRS)

    Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur

    1992-01-01

    Integrated circuit produces 8-bit pseudorandom numbers from a specified probability distribution at a rate of 10 MHz. Using Boolean logic, the circuit implements a pseudorandom-number-generating algorithm. The circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying a specified nonuniform probability distribution are generated by processing the uniformly distributed outputs of the eight 12-bit pseudorandom-number generators through a "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
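
    A software analogue of the circuit's idea, uniform bits mapped through a quantized inverse CDF to a specified 8-bit distribution, shows why the comparator pipeline works. The target distribution below is arbitrary; this is an illustration, not a model of the actual hardware:

```python
import numpy as np

rng = np.random.default_rng(10)

# Any 256-bin target distribution (here a discretized Gaussian)
target_pmf = np.exp(-0.5 * ((np.arange(256) - 128) / 32.0) ** 2)
target_pmf /= target_pmf.sum()

cdf = np.cumsum(target_pmf)
uniform12 = rng.integers(0, 4096, size=100_000)    # 12-bit uniform source
# searchsorted plays the role of the comparator stages: find the first
# CDF bin exceeding the uniform draw (a quantized inverse CDF).
samples8 = np.searchsorted(cdf, (uniform12 + 0.5) / 4096.0).astype(np.uint8)

print("sample mean:", samples8.mean(),
      "target mean:", (np.arange(256) * target_pmf).sum())
```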

  10. Determinants of Pseudogymnoascus destructans within bat hibernacula: Implications for surveillance and management of white-nose syndrome

    USGS Publications Warehouse

    Verant, Michelle L.; Bohuski, Elizabeth A.; Richgels, Katherine L. D.; Olival, Kevin J.; Epstein, Jonathan H.; Blehert, David

    2018-01-01

    Fungal diseases are an emerging global problem affecting human health, food security and biodiversity. Ability of many fungal pathogens to persist within environmental reservoirs can increase extinction risks for host species and presents challenges for disease control. Understanding factors that regulate pathogen spread and persistence in these reservoirs is critical for effective disease management. White-nose syndrome (WNS) is a disease of hibernating bats caused by Pseudogymnoascus destructans (Pd), a fungus that establishes persistent environmental reservoirs within bat hibernacula, which contribute to seasonal disease transmission dynamics in bats. However, host and environmental factors influencing distribution of Pd within these reservoirs are unknown. We used model selection on longitudinally collected field data to test multiple hypotheses describing presence–absence and abundance of Pd in environmental substrates and on bats within hibernacula at different stages of WNS. First detection of Pd in the environment lagged up to 1 year after first detection on bats within that hibernaculum. Once detected, the probability of detecting Pd within environmental samples from a hibernaculum increased over time and was higher in sediment compared to wall surfaces. Temperature had marginal effects on the distribution of Pd. For bats, prevalence and abundance of Pd were highest on Myotis lucifugus and on bats with visible signs of WNS. Synthesis and applications: Our results indicate that distribution of Pseudogymnoascus destructans (Pd) within a hibernaculum is driven primarily by bats with delayed establishment of environmental reservoirs. Thus, collection of samples from Myotis lucifugus, or from sediment if bats cannot be sampled, should be prioritized to improve detection probabilities for Pd surveillance. Long-term persistence of Pd in sediment suggests that disease management for white-nose syndrome should address risks of sustained transmission from environmental reservoirs.

  11. Use of Arthropod Rarity for Area Prioritisation: Insights from the Azorean Islands

    PubMed Central

    Fattorini, Simone; Cardoso, Pedro; Rigal, François; Borges, Paulo A. V.

    2012-01-01

    We investigated the conservation concern of Azorean forest fragments and the entire Terceira Island surface using arthropod species vulnerability as defined by the Kattan index, which is based on species rarity. Species rarity was evaluated according to geographical distribution (endemic vs. non endemic species), habitat specialization (distribution across biotopes) and population size (individuals collected in standardized samples). Geographical rarity was considered at 'global' scale (species endemic to the Azorean islands) and 'regional' scale (single island endemics). Measures of species vulnerability were combined into two indices of conservation concern for each forest fragment: (1) the Biodiversity Conservation Concern index, BCC, which reflects the average rarity score of the species present in a site, and (2) one proposed here and termed Biodiversity Conservation Weight, BCW, which reflects the sum of rarity scores of the same species assemblage. BCW was preferable for prioritising the areas with the highest numbers of vulnerable species, whereas BCC helped identify areas with few, but highly threatened, species due to a combination of different types of rarity. A novel approach is introduced in which the BCC and BCW indices were also adapted to deal with probabilities of occurrence instead of presence/absence data. The new probabilistic indices, termed pBCC and pBCW, were applied to Terceira Island, for which we modelled species distributions to reconstruct species occurrence with different degrees of probability, including in areas for which data were not available. The application of the probabilistic indices revealed that some island sectors occupied by secondary vegetation, and hence not included in the current set of protected areas, may in fact host some rare species. This result suggests that protecting marginal non-natural areas that nevertheless act as reservoirs of vulnerable species may also be important, especially when areas with well preserved primary habitats are scarce. PMID:22479498
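
    A toy numerical sketch of the two indices and their probabilistic variants, assuming each species i has a rarity/vulnerability score w_i and, for the probabilistic versions, a modelled occurrence probability p_i at the site (the paper's exact definitions may differ in detail):

```python
import numpy as np

w = np.array([3.0, 2.0, 1.0, 1.0, 0.5])      # hypothetical rarity scores
present = np.array([1, 0, 1, 1, 0])          # presence/absence at a site
p_occ = np.array([0.9, 0.4, 0.8, 0.7, 0.1])  # modelled P(occurrence)

# BCC: average rarity of the species present; BCW: summed rarity
bcc = (w * present).sum() / present.sum()
bcw = (w * present).sum()

# Probabilistic analogues: weight each species by occurrence probability
pbcc = (w * p_occ).sum() / p_occ.sum()
pbcw = (w * p_occ).sum()

print(f"BCC = {bcc:.2f}, BCW = {bcw:.2f}, "
      f"pBCC = {pbcc:.2f}, pBCW = {pbcw:.2f}")
```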

  12. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    NASA Astrophysics Data System (ADS)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin, particularly for annual maxima of the FWI distribution and spatiotemporal autocorrelation of precipitation fields.
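
    The core of the N-dimensional pdf transform can be sketched in a few lines: alternately rotate the model and observed samples by a random orthogonal matrix, quantile-map each rotated marginal, and rotate back. This is a minimal sketch (equal sample sizes assumed, and without the step of the full MBCn algorithm that preserves projected model changes between historical and projection periods):

```python
import numpy as np

rng = np.random.default_rng(11)

def quantile_map(model, obs):
    """Univariate quantile mapping: replace each model value by the
    observed value of the same rank (equal sample sizes assumed)."""
    ranks = np.argsort(np.argsort(model))
    return np.sort(obs)[ranks]

def npdf_transform(model, obs, n_iter=30):
    """Random-rotation / marginal quantile-mapping iteration at the
    heart of MBCn (a sketch, not the full published algorithm)."""
    x = model.copy()
    k = x.shape[1]
    for _ in range(n_iter):
        q, _ = np.linalg.qr(rng.normal(size=(k, k)))  # random rotation
        xr, obs_r = x @ q, obs @ q
        for j in range(k):
            xr[:, j] = quantile_map(xr[:, j], obs_r[:, j])
        x = xr @ q.T                                  # rotate back
    return x

obs = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], 2000)
model = rng.multivariate_normal([2, -1], [[1.0, -0.5], [-0.5, 2.0]], 2000)
corrected = npdf_transform(model, obs)
print("obs corr:", np.corrcoef(obs.T)[0, 1].round(2),
      "corrected corr:", np.corrcoef(corrected.T)[0, 1].round(2))
```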

  13. Parameter estimation of multivariate multiple regression model using Bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys’ prior is a kind of non-informative prior, used when information about the parameters is not available. The non-informative Jeffreys’ prior is combined with the sample information, resulting in the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of the multivariate regression model using the Bayesian method with the non-informative Jeffreys’ prior. The parameter estimates of β and Σ were obtained as the expected values of their marginal posterior distributions, which are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals whose values are difficult to determine analytically. Therefore, random samples are generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
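
    A minimal sketch of such a Gibbs sampler, assuming the standard conditionals implied by the abstract (B given Σ is matrix normal around the least-squares estimate; Σ given B is inverse Wishart on the residual scatter); the data, dimensions and iteration counts are illustrative:

```python
import numpy as np
from scipy.stats import invwishart

def gibbs_mvreg(Y, X, n_iter=500, seed=0):
    """Gibbs sampler for Y = XB + E under the Jeffreys prior (sketch)."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    p = Y.shape[1]
    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Y                  # least-squares center
    B = B_hat.copy()
    draws_B, draws_S = [], []
    for _ in range(n_iter):
        resid = Y - X @ B
        Sigma = invwishart.rvs(df=n, scale=resid.T @ resid, random_state=rng)
        Lr, Lc = np.linalg.cholesky(XtX_inv), np.linalg.cholesky(Sigma)
        B = B_hat + Lr @ rng.standard_normal((k, p)) @ Lc.T   # matrix-normal draw
        draws_B.append(B)
        draws_S.append(Sigma)
    return np.array(draws_B), np.array(draws_S)

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
Y = X @ np.array([[1.0, 2.0], [0.5, -0.3]]) + rng.normal(size=(100, 2))
B_draws, S_draws = gibbs_mvreg(Y, X)
B_est = B_draws.mean(axis=0)                   # posterior-mean estimate of B
```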

  14. Early Oral Tongue Squamous Cell Carcinoma: Sampling of Margins From Tumor Bed and Worse Local Control.

    PubMed

    Maxwell, Jessica H; Thompson, Lester D R; Brandwein-Gensler, Margaret S; Weiss, Bernhard G; Canis, Martin; Purgina, Bibianna; Prabhu, Arpan V; Lai, Chi; Shuai, Yongli; Carroll, William R; Morlandt, Anthony; Duvvuri, Umamaheswar; Kim, Seungwon; Johnson, Jonas T; Ferris, Robert L; Seethala, Raja; Chiosea, Simion I

    2015-12-01

    Positive margins are associated with poor prognosis among patients with oral tongue squamous cell carcinoma (SCC). However, wide variation exists in the margin sampling technique. To determine the effect of the margin sampling technique on local recurrence (LR) in patients with stage I or II oral tongue SCC. A retrospective study was conducted from January 1, 1986, to December 31, 2012, in 5 tertiary care centers following tumor resection and elective neck dissection in 280 patients with pathologic (p)T1-2 pN0 oral tongue SCC. Analysis was conducted from June 1, 2013, to January 20, 2015. In group 1 (n = 119), tumor bed margins were not sampled. In group 2 (n = 61), margins were examined from the glossectomy specimen, found to be positive or suboptimal, and revised with additional tumor bed margins. In group 3 (n = 100), margins were primarily sampled from the tumor bed without preceding examination of the glossectomy specimen. The margin status (both as a binary [positive vs negative] and continuous [distance to the margin in millimeters] variable) and other clinicopathologic parameters were compared across the 3 groups and correlated with LR. Local recurrence. Age, sex, pT stage, lymphovascular or perineural invasion, and adjuvant radiation treatment were similar across the 3 groups. The probability of LR-free survival at 3 years was 0.9 and 0.8 in groups 1 and 3, respectively (P = .03). The frequency of positive glossectomy margins was lowest in group 1 (9 of 117 [7.7%]) compared with groups 2 and 3 (28 of 61 [45.9%] and 23 of 95 [24.2%], respectively) (P < .001). Even after excluding cases with positive margins, the median distance to the closest margin was significantly narrower in group 3 (2 mm) compared with group 1 (3 mm) (P = .008). The status (positive vs negative) of margins obtained from the glossectomy specimen correlated with LR (P = .007), while the status of tumor bed margins did not. The status of the tumor bed margin was 24% sensitive (95% CI, 16%-34%) and 92% specific (95% CI, 85%-97%) for detecting a positive glossectomy margin. The margin sampling technique affects local control in patients with oral tongue SCC. Reliance on margin sampling from the tumor bed is associated with worse local control, most likely owing to narrower margin clearance and greater incidence of positive margins. A resection specimen-based margin assessment is recommended.

  15. Early Oral Tongue Squamous Cell Carcinoma Sampling of Margins From Tumor Bed and Worse Local Control

    PubMed Central

    Maxwell, Jessica H.; Thompson, Lester D. R.; Brandwein-Gensler, Margaret S.; Weiss, Bernhard G.; Canis, Martin; Purgina, Bibianna; Prabhu, Arpan V.; Lai, Chi; Shuai, Yongli; Carroll, William R.; Morlandt, Anthony; Duvvuri, Umamaheswar; Kim, Seungwon; Johnson, Jonas T.; Ferris, Robert L.; Seethala, Raja; Chiosea, Simion I.

    2017-01-01

    IMPORTANCE Positive margins are associated with poor prognosis among patients with oral tongue squamous cell carcinoma (SCC). However, wide variation exists in the margin sampling technique. OBJECTIVE To determine the effect of the margin sampling technique on local recurrence (LR) in patients with stage I or II oral tongue SCC. DESIGN, SETTING, AND PARTICIPANTS A retrospective study was conducted from January 1, 1986, to December 31, 2012, in 5 tertiary care centers following tumor resection and elective neck dissection in 280 patients with pathologic (p)T1-2 pN0 oral tongue SCC. Analysis was conducted from June 1, 2013, to January 20, 2015. INTERVENTIONS In group 1 (n = 119), tumor bed margins were not sampled. In group 2 (n = 61), margins were examined from the glossectomy specimen, found to be positive or suboptimal, and revised with additional tumor bed margins. In group 3 (n = 100), margins were primarily sampled from the tumor bed without preceding examination of the glossectomy specimen. The margin status (both as a binary [positive vs negative] and continuous [distance to the margin in millimeters] variable) and other clinicopathologic parameters were compared across the 3 groups and correlated with LR. MAIN OUTCOMES AND MEASURES Local recurrence. RESULTS Age, sex, pT stage, lymphovascular or perineural invasion, and adjuvant radiation treatment were similar across the 3 groups. The probability of LR-free survival at 3 years was 0.9 and 0.8 in groups 1 and 3, respectively (P = .03). The frequency of positive glossectomy margins was lowest in group 1 (9 of 117 [7.7%]) compared with groups 2 and 3 (28 of 61 [45.9%] and 23 of 95 [24.2%], respectively) (P < .001). Even after excluding cases with positive margins, the median distance to the closest margin was significantly narrower in group 3 (2 mm) compared with group 1 (3 mm) (P = .008). The status (positive vs negative) of margins obtained from the glossectomy specimen correlated with LR (P = .007), while the status of tumor bed margins did not. The status of the tumor bed margin was 24% sensitive (95% CI, 16%-34%) and 92% specific (95% CI, 85%-97%) for detecting a positive glossectomy margin. CONCLUSIONS AND RELEVANCE The margin sampling technique affects local control in patients with oral tongue SCC. Reliance on margin sampling from the tumor bed is associated with worse local control, most likely owing to narrower margin clearance and greater incidence of positive margins. A resection specimen–based margin assessment is recommended. PMID:26225798

  16. Salt stress aggravates boron toxicity symptoms in banana leaves by impairing guttation.

    PubMed

    Shapira, O R; Israeli, Yair; Shani, Uri; Schwartz, Amnon

    2013-02-01

    Boron (B) is known to accumulate in the leaf margins of different plant species, arguably a passive consequence of enhanced transpiration at the ends of the vascular system. However, transpiration rate is not the only factor affecting ion distribution. We examine an alternative hypothesis, suggesting the participation of the leaf bundle sheath in controlling radial water and solute transport from the xylem to the mesophyll in analogy to the root endodermis. In banana, excess B that remains confined to the vascular system is effectively disposed of via dissolution in the guttation fluid; therefore, impairing guttation should aggravate B damage to the leaf margins. Banana plants were subjected to increasing B concentrations. Guttation rates were manipulated by imposing a moderate osmotic stress. Guttation fluid was collected and analysed continuously. The distribution of ions across the lamina was determined. Impairing guttation indeed led to increased B damage to the leaf margins. The kinetics of ion concentration in guttation samples revealed major differences between ion species, corresponding to their distribution in the lamina dry matter. We provide evidence that the distribution pattern of B and other ions across banana leaves depends on active filtration of the transpiration stream and on guttation. © 2012 Blackwell Publishing Ltd.

  17. IRT Item Parameter Recovery with Marginal Maximum Likelihood Estimation Using Loglinear Smoothing Models

    ERIC Educational Resources Information Center

    Casabianca, Jodi M.; Lewis, Charles

    2015-01-01

    Loglinear smoothing (LLS) estimates the latent trait distribution while making fewer assumptions about its form and maintaining parsimony, thus leading to more precise item response theory (IRT) item parameter estimates than standard marginal maximum likelihood (MML). This article provides the expectation-maximization algorithm for MML estimation…

  18. Analysis of induction and establishment of dwarf bunt of wheat under marginal climatic conditions.

    USDA-ARS?s Scientific Manuscript database

    Dwarf bunt caused by Tilletia contraversa has limited distribution due to essential climatic requirements; primarily persistent snow cover. The pathogen is a quarantine organism in several countries outside of the USA, some of which may have marginal climate for the disease, including the People’s ...

  19. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
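
    As a worked illustration of the Poisson aggregation step mentioned here, the probability of at least one tsunami-generating event within an exposure window follows directly from the summed source rates; the rates below are invented for the example:

```python
import numpy as np

rates = np.array([1/1000, 1/2500, 1/400])   # hypothetical per-source annual rates
T = 50.0                                    # exposure window (years)
p_any = 1.0 - np.exp(-rates.sum() * T)      # P(at least one event in T years)
print(f"P(>=1 event in {T:.0f} yr) = {p_any:.3f}")
```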

  20. Topological transitions and freezing in XY models and Coulomb gases with quenched disorder: renormalization via traveling waves

    NASA Astrophysics Data System (ADS)

    Carpentier, David; Le Doussal, Pierre

    2000-11-01

    We study the two-dimensional XY model with quenched random phases and its Coulomb gas formulation. A novel renormalization group (RG) method is developed which allows a perturbative study of the glassy low-temperature XY phase and of the transition at which frozen topological defects (vortices) proliferate. This RG approach is constructed both from the replicated Coulomb gas and, equivalently, without the use of replicas, using the probability distribution of the local disorder (random defect core energy). By taking into account the fusion of environments (i.e., charge fusion in the replicated Coulomb gas), this distribution is shown to obey a nonlinear RG equation of the Kolmogorov (KPP) type, which admits traveling-wave solutions and exhibits a freezing phenomenon analogous to glassy freezing in Derrida's random energy models. The resulting physical picture is that the distribution of local disorder becomes broad below a freezing temperature and that the transition is controlled by rare regions favorable for the defects, the density of which can be used as the new perturbative parameter. The determination of marginal directions at the disorder-induced transition is shown to be related to the well-studied front-velocity selection problem in the KPP equation, and the universality of the novel critical behaviour obtained here to the known universality of the corrections to the front velocity. Applications to other two-dimensional problems are mentioned at the end.
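
    For reference, the canonical front-selection problem alluded to here, stated from general knowledge rather than from this paper, is the Fisher-KPP equation in rescaled units:

```latex
\partial_t w \;=\; \partial_x^2 w + w(1 - w),
\qquad w(x,t) \simeq W(x - v^* t),
\qquad v^* = 2
```

    Sufficiently steep initial conditions select the marginal linear-spreading velocity v* = 2, with universal slow corrections to the front position of the form 2t - (3/2)ln t (Bramson); it is this universality of the velocity corrections that the abstract connects to the universality of the critical behaviour.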

  1. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial and correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
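
    A small simulation sketch of the heterogeneity being modelled: unit-level correct-classification probabilities drawn as logit-normal random variables. The hyperparameters and the three-category setup are invented for illustration:

```python
import numpy as np
from scipy.special import expit   # inverse logit

rng = np.random.default_rng(1)
n_units, mu, sigma = 200, 1.5, 0.75             # logit-scale hyperparameters (invented)
theta = expit(rng.normal(mu, sigma, n_units))   # unit-level P(correct classification)
true_cat = rng.integers(0, 3, n_units)          # true category (3 classes)
correct = rng.random(n_units) < theta
obs_cat = np.where(correct, true_cat,
                   (true_cat + rng.integers(1, 3, n_units)) % 3)  # a wrong class
```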

  2. On land-use modeling: A treatise of satellite imagery data and misclassification error

    NASA Astrophysics Data System (ADS)

    Sandler, Austin M.

    Recent availability of satellite-based land-use data sets, including data sets with contiguous spatial coverage over large areas, relatively long temporal coverage, and fine-scale land cover classifications, is providing new opportunities for land-use research. However, care must be taken when working with these datasets due to misclassification error, which causes inconsistent parameter estimates in the discrete choice models typically used to model land use. I therefore adapt the empirical correction methods developed for other contexts (e.g., epidemiology) so that they can be applied to land-use modeling. I then use a Monte Carlo simulation, and an empirical application using actual satellite imagery data from the Northern Great Plains, to compare the results of a traditional model ignoring misclassification with those from models accounting for it. Results from both the simulation and the application indicate that ignoring misclassification leads to biased results. Even seemingly insignificant levels of misclassification error (e.g., 1%) produce biased parameter estimates, which alter marginal effects enough to affect policy inference. At the levels of misclassification typical in current satellite imagery datasets (e.g., as high as 35%), ignoring misclassification can lead to systematically erroneous land-use probabilities and substantially biased marginal effects. The correction methods I propose, however, generate consistent parameter estimates and therefore consistent estimates of marginal effects and predicted land-use probabilities.
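
    One standard correction in this family (a Hausman-style adjusted likelihood for a misclassified binary outcome, not necessarily the exact estimator developed in this work) can be sketched as follows, with known misclassification rates a0 and a1 and synthetic data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def negloglik(beta, y, X, a0, a1):
    """Corrected logit: P(observed y=1 | x) = a0 + (1 - a0 - a1) * F(x'b)."""
    p = a0 + (1 - a0 - a1) * expit(X @ beta)
    p = np.clip(p, 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y_true = rng.random(500) < expit(X @ np.array([-0.5, 1.0]))
flip = rng.random(500) < 0.05                        # 5% misclassification
y_obs = np.where(flip, 1 - y_true, y_true).astype(float)
fit = minimize(negloglik, x0=np.zeros(2), args=(y_obs, X, 0.05, 0.05))
beta_corrected = fit.x                               # roughly recovers (-0.5, 1.0)
```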

  3. Fast Episodes of West-Mediterranean-Tyrrhenian Oceanic Opening and Revisited Relations with Tectonic Setting

    PubMed Central

    Savelli, Carlo

    2015-01-01

    Extension and calc-alkaline volcanism of the submerged orogen of alpine age (OAA) initiated in the Early Oligocene (~33/32 Ma) and reached the stage of oceanic opening in the Early Miocene (Burdigalian), Late Miocene and Late Pliocene. In the Burdigalian (~20–16 Ma) period of widespread calc-alkaline volcanism on the margins of the oceanic domain, seafloor spreading originated the deep basins of north Algeria (western part of the OAA) and Sardinia/Provence (European margin). Conversely, when conjugate-margin volcanism was absent or scarce, seafloor spreading formed the Vavilov (7.5–6.3 Ma) and Marsili (1.87–1.67 Ma) plains within the eastern part of the OAA (Tyrrhenian Sea). The contrast between the occurrence and the lack of margin igneous activity probably implies a diversity of geotectonic settings at the times of oceanization. It appears that the Burdigalian calc-alkaline volcanism on the continental margins developed in the absence of subduction. The WNW-directed subduction of the African plate probably commenced at ~16/15 Ma (waning Burdigalian seafloor spreading) after ~18/16 Ma of rifting. Space-time features indicate that calc-alkaline volcanism is not linked only to subduction. From this view, a temporal gap would exist between the steep subduction beneath the Apennines and the previous, flat-type plunge of the European plate in the opposite direction that produced the OAA accretion and double vergence. PMID:26391973

  4. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    2016-04-07

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.

  5. Positive phase space distributions and uncertainty relations

    NASA Technical Reports Server (NTRS)

    Kruger, Jan

    1993-01-01

    In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.
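
    For reference, the Schrödinger form of the uncertainty relation invoked here, including the correlation term, reads (stated from general knowledge):

```latex
\sigma_q^2\,\sigma_p^2 - \sigma_{qp}^2 \;\ge\; \frac{\hbar^2}{4}
```

    Consistent with the abstract's claim, a bivariate normal phase-space distribution with covariance matrix V is the Wigner function of a physical pure or mixed state precisely when det V ≥ ħ²/4.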

  6. Edge Effects along a Seagrass Margin Result in an Increased Grazing Risk on Posidonia australis Transplants.

    PubMed

    Statton, John; Gustin-Craig, Samuel; Dixon, Kingsley W; Kendrick, Gary A

    2015-01-01

    A key issue in habitat restoration is the change in ecological processes that occurs when fragments of habitat are lost, resulting in the persistence of habitat-degraded margins. Margins often create or enhance opportunities for negative plant-herbivore interactions, preventing natural or assisted re-establishment of native vegetation into the degraded area. However, at some distance from the habitat margin these negative interactions may relax. Here, we posit that the intensity of species interactions in a fragmented Posidonia australis seagrass meadow may be spatially dependent on proximity to the seagrass habitat edge, whereby the risk of grazing is high and the probability of survival of seagrass transplants is low. To test this, transplants were planted 2 m within the meadow, on the meadow edge (0 m), and at 2 m, 10 m, 30 m, 50 m and 100 m from the edge of the seagrass meadow into the unvegetated sand sheet. Grazing risk was enhanced 0-10 m from the edge but decreased sharply with increasing distance (>30 m). Yet, the risk of grazing was minimal inside the seagrass meadow, indicating that grazers may use the seagrass meadow for refuge but are not actively grazing within it. The relationship between short-term herbivory risk and long-term survival was not straightforward, suggesting that other environmental filters are also affecting survival of P. australis transplants within the study area. We found that the daily probability of herbivory was predictable and operated over a small spatial scale at the edge of a large, intact seagrass meadow. These findings highlight that the risk from herbivory can be high and a potential factor limiting seagrass establishment in restoration programs.

  7. Edge Effects along a Seagrass Margin Result in an Increased Grazing Risk on Posidonia australis Transplants

    PubMed Central

    Statton, John; Gustin-Craig, Samuel; Dixon, Kingsley W.; Kendrick, Gary A.

    2015-01-01

    A key issue in habitat restoration is the change in ecological processes that occurs when fragments of habitat are lost, resulting in the persistence of habitat-degraded margins. Margins often create or enhance opportunities for negative plant-herbivore interactions, preventing natural or assisted re-establishment of native vegetation into the degraded area. However, at some distance from the habitat margin these negative interactions may relax. Here, we posit that the intensity of species interactions in a fragmented Posidonia australis seagrass meadow may be spatially dependent on proximity to the seagrass habitat edge, whereby the risk of grazing is high and the probability of survival of seagrass transplants is low. To test this, transplants were planted 2 m within the meadow, on the meadow edge (0 m), and at 2 m, 10 m, 30 m, 50 m and 100 m from the edge of the seagrass meadow into the unvegetated sand sheet. Grazing risk was enhanced 0-10 m from the edge but decreased sharply with increasing distance (>30 m). Yet, the risk of grazing was minimal inside the seagrass meadow, indicating that grazers may use the seagrass meadow for refuge but are not actively grazing within it. The relationship between short-term herbivory risk and long-term survival was not straightforward, suggesting that other environmental filters are also affecting survival of P. australis transplants within the study area. We found that the daily probability of herbivory was predictable and operated over a small spatial scale at the edge of a large, intact seagrass meadow. These findings highlight that the risk from herbivory can be high and a potential factor limiting seagrass establishment in restoration programs. PMID:26465926

  8. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE PAGES

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    2016-04-07

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.

  9. Sci—Thur PM: Planning and Delivery — 04: Respiratory margin derivation and verification in partial breast irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quirk, S; Conroy, L; Smith, WL

    Partial breast irradiation (PBI) following breast-conserving surgery is emerging as an effective means to achieve local control and reduce irradiated breast volume. Patients are planned on a static CT image; however, treatment is delivered while the patient is free-breathing. Respiratory motion can degrade plan quality by reducing target coverage and/or dose homogeneity. A variety of methods can be used to determine the required margin for respiratory motion in PBI. We derive geometric and dosimetric respiratory margins in 1D. We also verify the adequacy of the typical 5 mm respiratory margin in 3D by evaluating plan quality for increasing respiratory amplitudes (2–20 mm). Ten PBI plans were used for dosimetric evaluation. A database of volunteer respiratory data, with characteristics similar to those of breast cancer patients, was used for this study. We derived a geometric 95%-margin of 3 mm from the population respiratory data. We derived a dosimetric 95%-margin of 2 mm by convolving 1D dose profiles with respiratory probability density functions. The 5 mm respiratory margin is possibly too large when 1D coverage is assessed and could lead to unnecessary normal tissue irradiation. Assessing margins only for coverage may be insufficient; 3D dosimetric assessment revealed that degradation in dose homogeneity, not target coverage, is the limiting factor. Hotspots increased even for the smallest respiratory amplitudes, while target coverage only degraded at amplitudes greater than 10 mm. The 5 mm respiratory margin is adequate for coverage, but due to plan quality degradation, respiratory management is recommended for patients with respiratory amplitudes greater than 10 mm.
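
    A 1D toy version of the convolution step described above: a static dose profile is blurred with a respiratory probability density function, here the dwell-time density of an idealized sinusoidal trajectory. The field size, grid and amplitude are invented:

```python
import numpy as np

x = np.linspace(-30, 30, 601)                  # position (mm) on a 0.1 mm grid
dose = (np.abs(x) <= 20).astype(float)         # idealized 40 mm flat dose profile
amp = 5.0                                      # peak-to-peak motion (mm), invented
a = amp / 2.0
pdf = np.where(np.abs(x) < a,
               1.0 / (np.pi * np.sqrt(np.clip(a**2 - x**2, 1e-12, None))),
               0.0)                            # dwell-time PDF of sinusoidal motion
pdf /= pdf.sum()
blurred = np.convolve(dose, pdf, mode="same")  # motion-averaged dose profile
```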

  10. The Influence of Topography on Subaqueous Sediment Gravity Flows and the Resultant Deposits: Examples from Deep-water Systems in Offshore Morocco and Offshore Trinidad

    NASA Astrophysics Data System (ADS)

    Deng, H.; Wood, L.; Overeem, I.; Hutton, E.

    2016-12-01

    Submarine topography has a fundamental control on the movement of sediment gravity flows as well as the distribution, morphology, and internal heterogeneity of the resultant overlying, healing-phase, deep-water reservoirs. Some of the most complex deep-water topography is generated through both destructive and constructive mass transport processes. A series of numerical models using Sedflux software have been constructed over high-resolution paleobathymetric surfaces of the tops of mass transport complexes (MTCs), mapped from 3D seismic data in offshore Morocco and offshore eastern Trinidad. Morocco's margin is characterized by large, extant rafted blocks and a flow-perpendicular fabric. Trinidad's margin is characterized by muddier, plastic flows and isolated extrusive diapiric buttresses. In addition, Morocco's margin is a dry, northern-latitude margin that lacks major river inputs, while Trinidad's is an equatorial, wet-climate margin fed by the Orinoco River and delta. These models quantitatively delineate the interaction of healing-phase gravity flows with the tops of two very different topographies and provide insights into healing-phase reservoir distribution and stratigraphic trap development. Slope roughness, curvature, and surface shape are measured and quantified relative to input points to characterize the depositional surface. A variety of sediment gravity flow types have been input and the resultant interval assessed for thickness and distribution relative to key topography parameters. The resulting mathematical relationships are analyzed and compared with seismic interpretation of healing-phase interval character, working toward an improved model of the interaction between gravity sedimentation and topography.

  11. Spatial Probability Distribution of Strata's Lithofacies and its Impacts on Land Subsidence in Huairou Emergency Water Resources Region of Beijing

    NASA Astrophysics Data System (ADS)

    Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.

    2016-12-01

    Continuous over-exploitation of groundwater causes dramatic drawdown and leads to regional land subsidence in the Huairou Emergency Water Resources region, which is located in the upper-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of the strata's lithofacies in the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of the lithofacies. The transition probability geostatistics approach provides potential for characterizing the distribution of heterogeneous lithofacies in the subsurface. By combining the thickness of the clay layer extracted from the simulation with the deformation field acquired from PS-InSAR technology, the influence of the lithofacies on land subsidence can be analyzed quantitatively. The lithofacies derived from borehole data were generalized into four categories, of which clay was the predominant compressible material, and their probability distribution in the observed space was estimated using transition probability geostatistics. Geologically plausible realizations of the lithofacies distribution were produced, accounting for the complex heterogeneity of the alluvial plain. At a probability level of more than 40 percent, the volume of clay so defined was 55 percent of the total volume of lithofacies. This level, nearly equaling the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and uncompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Similarities were indicated between the spatial patterns of the deformation field and of the clay layer. In areas with roughly similar water-table decline, subsidence was greater where the subsurface had a higher probability of containing compressible material than where that probability was lower. Such an estimate of the spatial probability distribution is useful for analyzing the uncertainty of land subsidence.
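
    A much-reduced sketch of the underlying idea, a categorical Markov chain over lithofacies, here in one vertical dimension with a hypothetical transition matrix standing in for rates inferred from boreholes (the study itself works in 3D with transition probability geostatistics):

```python
import numpy as np

facies = ["clay", "silt", "sand", "gravel"]
P = np.array([[0.80, 0.10, 0.08, 0.02],       # rows: from-facies; columns: to-facies
              [0.15, 0.70, 0.10, 0.05],
              [0.10, 0.10, 0.75, 0.05],
              [0.05, 0.05, 0.15, 0.75]])
rng = np.random.default_rng(3)
state, column = 0, []
for _ in range(200):                          # simulate 200 depth increments
    column.append(facies[state])
    state = rng.choice(4, p=P[state])
clay_fraction = column.count("clay") / len(column)   # compressible proportion
```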

  12. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
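
    For orientation, a minimal sketch of the rank product statistic and of the permutation approximation that the exact derivation replaces; the expression matrix is synthetic and the permutation count deliberately small:

```python
import numpy as np

def rank_product(data):
    """Per-gene product of within-replicate ranks; data has shape (genes, reps)."""
    ranks = data.argsort(axis=0).argsort(axis=0) + 1
    return ranks.prod(axis=1).astype(float)

rng = np.random.default_rng(4)
expr = rng.normal(size=(1000, 3))             # toy matrix: 1000 genes, 3 replicates
rp = rank_product(expr)
# permutation null: shuffle each replicate's values independently
null = np.concatenate([rank_product(rng.permuted(expr, axis=0))
                       for _ in range(100)])
p_top = (null <= rp.min()).mean()             # approximate tail probability, top gene
```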

  13. A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses

    ERIC Educational Resources Information Center

    Vasdekis, Vassilis G. S.; Cagnone, Silvia; Moustaki, Irini

    2012-01-01

    The paper proposes a composite likelihood estimation approach that uses bivariate instead of multivariate marginal probabilities for ordinal longitudinal responses using a latent variable model. The model considers time-dependent latent variables and item-specific random effects to be accountable for the interdependencies of the multivariate…

  14. Outdoor Recreation Constraints: An Examination of Race, Gender, and Rural Dwelling

    Treesearch

    Cassandra Y. Johnson; J. Michael Bowker; H. Ken Cordell

    2001-01-01

    We assess whether traditionally marginalized groups in American society (African-Americans, women, rural dwellers) perceive more constraints to outdoor recreation participation than other groups. A series of logistic regressions are applied to a national recreation survey and used to model the probability that individuals perceive certain constraints to...

  15. New Approach to Total Dose Specification for Spacecraft Electronics

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael

    2017-01-01

    Variability of the space radiation environment is investigated with regard to total dose specification for spacecraft electronics. It is shown to have a significant impact. A new approach is developed for total dose requirements that replaces the radiation design margin concept with failure probability during a mission.

  16. Joint Bayesian Estimation of Quasar Continua and the Lyα Forest Flux Probability Distribution Function

    NASA Astrophysics Data System (ADS)

    Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan

    2017-08-01

    We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF-regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ − 1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃7% and ≃10% at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ − 1 of the IGM temperature-density relation with a precision of ±8.6% at z = 3 and ±6.1% at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.

  17. Modelling Spatial Dependence Structures Between Climate Variables by Combining Mixture Models with Copula Models

    NASA Astrophysics Data System (ADS)

    Khan, F.; Pilz, J.; Spöck, G.

    2017-12-01

    Spatio-temporal dependence structures play a pivotal role in understanding the meteorological characteristics of a basin or sub-basin. They further affect the hydrological conditions and will consequently yield misleading results if not taken into account properly. In this study we modeled the spatial dependence structure between climate variables, including maximum temperature, minimum temperature and precipitation, in the monsoon-dominated region of Pakistan. Six meteorological stations were considered for temperature and four for precipitation. For modelling the dependence structure between temperature and precipitation at multiple sites, we utilized C-Vine, D-Vine and Student t-copula models. Under the copula models, multivariate mixture normal distributions were used as marginals for temperature and gamma distributions for precipitation. C-Vine, D-Vine and Student t-copulas were compared through observed and simulated spatial dependence structures to choose an appropriate model for the climate data. The results show that all copula models performed well; however, there are subtle differences in their performance. The copula models captured the patterns of spatial dependence between climate variables at multiple meteorological sites, although the t-copula performed poorly in reproducing the magnitude of the dependence structure. Important statistics of the observed data were closely approximated, except for the maximum values of temperature and the minimum values of minimum temperature. Probability density functions of the simulated data closely follow those of the observed data for all variables. C- and D-Vines are the better tools for modelling the dependence between variables, though Student t-copulas compete closely for precipitation. Keywords: Copula model, C-Vine, D-Vine, Spatial dependence structure, Monsoon dominated region of Pakistan, Mixture models, EM algorithm.
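
    A stripped-down sketch of the general recipe, with dependence supplied by a copula and skewed marginals applied by inverse CDF. A Gaussian copula with gamma marginals is used here for brevity, whereas the study fits vine and Student t copulas; all parameters are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=2000)
u = stats.norm.cdf(z)                                # copula sample on [0, 1]^2
site1 = stats.gamma.ppf(u[:, 0], a=2.0, scale=5.0)   # gamma marginal, site 1
site2 = stats.gamma.ppf(u[:, 1], a=1.5, scale=8.0)   # gamma marginal, site 2
rank_dep = stats.spearmanr(site1, site2)[0]          # rank dependence survives
```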

  18. A primer on marginal effects-part II: health services research applications.

    PubMed

    Onukwugha, E; Bergtold, J; Jain, R

    2015-02-01

    Marginal analysis evaluates changes in a regression function associated with a unit change in a relevant variable. The primary statistic of marginal analysis is the marginal effect (ME). The ME facilitates the examination of outcomes for defined patient profiles or individuals while measuring the change in original units (e.g., costs, probabilities). The ME has a long history in economics; however, it is not widely used in health services research despite its flexibility and ability to provide unique insights. This article, the second in a two-part series, discusses practical issues that arise in the estimation and interpretation of the ME for a variety of regression models often used in health services research. Part one provided an overview of prior studies discussing ME followed by derivation of ME formulas for various regression models relevant for health services research studies examining costs and utilization. The current article illustrates the calculation and interpretation of ME in practice and discusses practical issues that arise during the implementation, including: understanding differences between software packages in terms of functionality available for calculating the ME and its confidence interval, interpretation of average marginal effect versus marginal effect at the mean, and the difference between ME and relative effects (e.g., odds ratio). Programming code to calculate ME using SAS, STATA, LIMDEP, and MATLAB are also provided. The illustration, discussion, and application of ME in this two-part series support the conduct of future studies applying the concept of marginal analysis.
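
    A short sketch of the AME-versus-MEM distinction discussed in the article, for a logit model with an invented coefficient vector (the article supplies SAS, STATA, LIMDEP and MATLAB code; this Python version is an independent illustration):

```python
import numpy as np
from scipy.special import expit

rng = np.random.default_rng(6)
X = np.column_stack([np.ones(1000), rng.normal(1, 2, 1000)])
beta = np.array([-0.25, 0.40])                   # pretend fitted logit coefficients
dens = expit(X @ beta) * (1 - expit(X @ beta))   # logistic density at each x
ame = (dens * beta[1]).mean()                    # average marginal effect
x_bar = X.mean(axis=0)
mem = expit(x_bar @ beta) * (1 - expit(x_bar @ beta)) * beta[1]  # ME at the mean
```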

  19. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with increasing probability, the expected effect of ASMC to increase the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems small for any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, with the aim of testing whether rational runoff coefficient tables for the rational method can be arranged in advance, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
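
    A crude Monte Carlo caricature of the derived-distribution idea: sample a storm intensity, subtract a Green-Ampt infiltration capacity, and fit a gamma law to the positive runoff. All parameter values are invented, and the kinematic-wave routing and IDF coupling of the actual GABS model are omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
Ks, psi, dtheta = 10.0, 110.0, 0.3       # Green-Ampt parameters (mm/h, mm, -)
F = 25.0                                 # assumed cumulative infiltration (mm)
f_cap = Ks * (1 + psi * dtheta / F)      # Green-Ampt infiltration capacity (mm/h)
i = stats.gumbel_r.rvs(loc=30, scale=12, size=10000, random_state=rng)  # intensity
runoff = np.clip(i - f_cap, 0, None)     # Hortonian rainfall excess (mm/h)
shape, loc, scale = stats.gamma.fit(runoff[runoff > 0], floc=0)  # gamma fit
```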

  20. Dosimetric evaluation of planning target volume margin reduction for prostate cancer via image-guided intensity-modulated radiation therapy

    NASA Astrophysics Data System (ADS)

    Hwang, Taejin; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk

    2015-07-01

    The aim of this study was to quantitatively estimate the dosimetric benefits of the image-guided radiation therapy (IGRT) system for prostate intensity-modulated radiation therapy (IMRT) delivery. The cases of eleven patients who underwent IMRT for prostate cancer without a prostatectomy at our institution between October 2012 and April 2014 were retrospectively analyzed. For every patient, clinical target volume (CTV) to planning target volume (PTV) margins were uniformly applied: 3 mm, 5 mm, 7 mm, 10 mm, 12 mm, and 15 mm. For each margin size, the IMRT plans were independently optimized by one medical physicist using Pinnacle3 (ver. 8.0.d, Philips Medical System, Madison, WI) in order to maintain plan quality. The maximum geometrical margin (MGM) for every CT image set, defined as the smallest margin encompassing the rectum in at least one slice, was between 13 mm and 26 mm. The percentage of rectum overlapping the PTV (%V_ROV), the rectal normal tissue complication probability (NTCP) and the mean rectal dose (%RD_mean) increased in proportion to the PTV margin. The bladder NTCP, however, remained close to zero regardless of the PTV margin, while the percentage of bladder overlapping the PTV (%V_BOV) and the mean bladder dose (%BD_mean) increased in proportion to the PTV margin. For patients without a relatively large rectum or small bladder, the increases observed in rectal NTCP, %RD_mean and %BD_mean per 1 mm of PTV margin were 1.84%, 2.44% and 2.90%, respectively. Unlike the rectum and bladder metrics, the maximum dose to each femoral head was largely unaffected by the PTV margin. This quantitative study of PTV margin reduction supports the conclusion that IG-IMRT enhances the clinical benefit for prostate cancer by reducing normal-organ complications at a similar level of PTV control.

  1. [Spatio-temporal distribution of carabids and spiders between semi-natural field margin and the adjacent crop fields in agricultural landscape].

    PubMed

    Zhang, Xu Zhu; Han, Yin; Yu, Zhen Rong; Liu, Yun Hui

    2017-06-18

    This study was conducted before and after the harvesting of wheat and maize in a typical agricultural landscape of the North China Plain. We investigated the diversity of two important natural enemy groups, carabids and spiders, using pitfall traps at crop field margins with different vegetation structures and in the neighboring crop fields. Through comparison of the spatial and temporal distribution of carabid and spider diversity in the field margins and the neighboring fields, and investigation of the relationship between arthropod communities and vegetation structure, this study aimed to understand the role of semi-natural field margins in the biodiversity conservation of different natural enemy taxa. Results showed that the abundance of spiders was significantly higher in the field margins than in the neighboring fields over the entire period. No significant difference in carabid diversity between field margins and crop fields was observed, but the community composition differed. The number of spider families increased in the field margins but decreased in the crop fields after harvesting, indicating migration between field and field margin. Vegetation structure in the field margins was associated differently with carabids than with spiders: the diversity of dominant carabid species was positively associated with herb coverage and negatively with wood coverage, while the diversity of the spider family Linyphiidae was positively associated with herb coverage only. Semi-natural habitat benefited the conservation of arthropod natural enemy diversity in crop fields by promoting dispersal into them, but these effects differed among vegetation structures and among the target beneficial natural enemy communities. Future studies should focus on an in-depth understanding of the food and habitat requirements of different natural enemy taxa, so as to design suitable semi-natural habitats that maintain a high diversity of natural enemy communities.

  2. The Continental Margins Program in Georgia

    USGS Publications Warehouse

    Cocker, M.D.; Shapiro, E.A.

    1999-01-01

    From 1984 to 1993, the Georgia Geologic Survey (GGS) participated in the Minerals Management Service-funded Continental Margins Program. Geological and geophysical data acquisition focused on offshore stratigraphic framework studies, phosphate-bearing Miocene-age strata, distribution of heavy minerals, near-surface alternative sources of groundwater, and development of a PC-based Coastal Geographic Information System (GIS). Seven GGS publications document results of those investigations. In addition to those publications, direct benefits of the GGS's participation include an impetus to the GGS's investigations of economic minerals on the Georgia coast, establishment of a GIS that includes computer hardware and software, and seeds for additional investigations through the information and training acquired as a result of the Continental Margins Program. These additional investigations are quite varied in scope, and many were made possible because of GIS expertise gained as a result of the Continental Margins Program. Future investigations will also reap the benefits of the Continental Margins Program.

  3. 17 CFR 242.101 - Activities by distribution participants.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... (CONTINUED) REGULATIONS M, SHO, ATS, AC, AND NMS AND CUSTOMER MARGIN REQUIREMENTS FOR SECURITY FUTURES Regulation M § 242.101 Activities by distribution participants. (a) Unlawful Activity. In connection with a...

  4. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
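
    One plausible reading of the compound structure, a negative binomial number of thunderstorm events each contributing a zero-truncated Poisson count of thunderstorms, can be simulated as follows; the parameters are invented and this is not Cohen's exact formulation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def trunc_poisson(mu, size):
    """Zero-truncated Poisson draws via inverse CDF restricted to k >= 1."""
    u = rng.uniform(stats.poisson.cdf(0, mu), 1.0, size)
    return stats.poisson.ppf(u, mu).astype(int)

n_events = stats.nbinom.rvs(n=2.0, p=0.4, size=20000, random_state=rng)
storms = np.array([trunc_poisson(1.3, k).sum() if k else 0 for k in n_events])
p_table = np.bincount(storms) / storms.size    # empirical probability table
```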

  5. High resolution neodymium characterization along the Mediterranean Sea margins: implications for ɛNd modeling.

    NASA Astrophysics Data System (ADS)

    Ayache, Mohamed; Dutay, Jean-claude; Arsouze, Thomas; Jeandel, Catherine; Revillon, Sidonie

    2016-04-01

    An extensive compilation of published neodymium (Nd) concentrations and isotopic compositions (ɛNd) was carried out in order to establish a new database and map (using a high-resolution geological map of the area) of the distribution of these parameters for all the Mediterranean margins. Data were extracted from different kinds of samples: river solid discharge deposited on the shelf, sedimentary material collected on the margin, or geological material outcropping above or close to a margin. Additional analyses of surface sediments were performed in order to improve this dataset in key areas (e.g., the Sicilian strait). The Mediterranean margin Nd isotopic signatures vary from non-radiogenic values around the Gulf of Lions (ɛNd values around -11) to radiogenic values, up to +6, around the Aegean and Levantine sub-basins. Using a high-resolution regional oceanic model (1/12° horizontal resolution), the ɛNd distribution was simulated for the first time in the Mediterranean Sea. The high resolution of the model provides the opportunity to study in more detail the processes governing the Nd isotope distribution in the marine environment. This work highlights that significant interannual variability of the ɛNd distribution in seawater could occur. In particular, important hydrological events such as the Eastern Mediterranean Transient (EMT), associated with deep water formed in the Aegean sub-basin, could induce a shift in Nd IC at intermediate depths that would be noticeable in the western part of the basin. This underlines that the temporal and geographical variations of ɛNd could provide interesting insight into Nd as a quasi-conservative tracer of water masses in the Mediterranean Sea, particularly in the context of paleo-oceanographic applications, i.e., to explore whether EMT-type signatures occurred in the past (Roether et al., 2014; Gacic et al., 2011).

  6. Analysis of induction and establishment of dwarf bunt of wheat under marginal climatic conditions.

    USDA-ARS?s Scientific Manuscript database

    Dwarf bunt caused by Tilletia contraversa is a disease of winter wheat that has a limited geographic distribution due to specific winter climate requirements. The pathogen is listed as a quarantine organism by several countries that may have wheat production areas with inadequate or marginal climat...

  7. Some bivariate distributions for modeling the strength properties of lumber

    Treesearch

    Richard A. Johnson; James W. Evans; David W. Green

    Accurate modeling of the joint stochastic nature of the strength properties of dimension lumber is essential to the determination of reliability-based design safety factors. This report reviews the major techniques for obtaining bivariate distributions and then discusses bivariate distributions whose marginal distributions suggest they might be useful for modeling the...

  8. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data

    PubMed Central

    Su, Li; Farewell, Vernon T

    2013-01-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. PMID:24201470
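
    A minimal sketch of the overall marginal mean for a two-part mixed model (logistic zero part, lognormal positive part), integrating the random intercepts out by Monte Carlo. The coefficients and variance components are invented, and independence of the two random effects is assumed purely for brevity:

```python
import numpy as np
from scipy.special import expit

rng = np.random.default_rng(9)
x = np.array([1.0, 0.5])                         # intercept plus one covariate
gamma, beta = np.array([0.2, 0.8]), np.array([1.0, 0.3])
tau1, tau2, sigma2 = 0.9, 0.6, 0.5               # RE sds and residual log-variance
b1 = rng.normal(0, tau1, 100000)                 # random intercepts, binary part
b2 = rng.normal(0, tau2, 100000)                 # random intercepts, positive part
p_pos = expit(x @ gamma + b1).mean()             # marginal P(Y > 0 | x)
mean_pos = np.exp(x @ beta + b2 + sigma2 / 2).mean()  # marginal E[Y | Y > 0, x]
marginal_mean = p_pos * mean_pos                 # overall marginal mean E[Y | x]
```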

  9. Hydrothermal and tectonic activity in northern Yellowstone Lake, Wyoming

    USGS Publications Warehouse

    Johnson, S.Y.; Stephenson, W.J.; Morgan, L.A.; Shanks, Wayne C.; Pierce, K.L.

    2003-01-01

    Yellowstone National Park is the site of one of the world's largest calderas. The abundance of geothermal and tectonic activity in and around the caldera, including historic uplift and subsidence, makes it necessary to understand active geologic processes and their associated hazards. To that end, we here use an extensive grid of high-resolution seismic reflection profiles (~450 km) to document hydrothermal and tectonic features and deposits in northern Yellowstone Lake. Sublacustrine geothermal features in northern Yellowstone Lake include two of the largest known hydrothermal explosion craters, Mary Bay and Elliott's. Mary Bay explosion breccia is distributed uniformly around the crater, whereas Elliott's crater breccia has an asymmetric distribution and forms a distinctive, ~2-km-long, hummocky lobe on the lake floor. Hydrothermal vents and low-relief domes are abundant on the lake floor; their greatest abundance is in and near explosion craters and along linear fissures. Domed areas on the lake floor that are relatively unbreached (by vents) are considered the most likely sites of future large hydrothermal explosions. Four submerged shoreline terraces along the margins of northern Yellowstone Lake add to the Holocene record of postglacial lake-level fluctuations attributed to "heavy breathing" of the Yellowstone magma reservoir and associated geothermal system. The Lake Hotel fault cuts through northwestern Yellowstone Lake and represents part of a 25-km-long distributed extensional deformation zone. Three postglacial ruptures indicate a slip rate of ~0.27 to 0.34 mm/yr. The largest (3.0 m slip) and most recent event occurred in the past ~2100 yr. Although high heat flow in the crust limits the rupture area of this fault zone, future earthquakes of magnitude ~5.3 to 6.5 are possible. Earthquakes and hydrothermal explosions have probably triggered landslides, common features around the lake margins. Few high-resolution seismic reflection surveys have been conducted in lakes in active volcanic areas. Our data reveal active geothermal features with unprecedented resolution and provide important analogues for recognition of comparable features and potential hazards in other subaqueous geothermal environments.

  10. Challenges estimating the return period of extreme floods for reinsurance applications

    NASA Astrophysics Data System (ADS)

    Raven, Emma; Busby, Kathryn; Liu, Ye

    2013-04-01

    Mapping and modelling extreme natural events is fundamental within the insurance and reinsurance industry for assessing risk. For example, insurers might use a 1 in 100-year flood hazard map to set the annual premium of a property, whilst a reinsurer might assess the national scale loss associated with the 1 in 200-year return period for capital and regulatory requirements. Using examples from a range of international flood projects, we focus on exploring how to define what the n-year flood looks like for predictive uses in re/insurance applications, whilst considering challenges posed by short historical flow records and the spatial and temporal complexities of flood. First, we shall explore the use of extreme value theory (EVT) statistics for extrapolating data beyond the range of observations in a marginal analysis. In particular, we discuss how to estimate the return period of historical flood events and explore the impact that a range of statistical decisions have on these estimates. Decisions include: (1) selecting which distribution type to apply (e.g. generalised Pareto distribution (GPD) vs. generalised extreme value distribution (GEV)); (2) if former, the choice of the threshold above which the GPD is fitted to the data; and (3) the necessity to perform a cluster analysis to group flow peaks to temporally represent individual flood events. Second, we summarise a specialised multivariate extreme value model, which combines the marginal analysis above with dependence modelling to generate industry standard event sets containing thousands of simulated, equi-probable floods across a region/country. These events represent the typical range of anticipated flooding across a region and can be used to estimate the largest or most widespread events that are expected to occur. Finally, we summarise how a reinsurance catastrophe model combines the event set with detailed flood hazard maps to estimate the financial cost of floods; both the full event set and also individual extreme events. Since the predicted loss estimates, typically in the form of a curve plotting return period against modelled loss, are used in the pricing of reinsurance, we demonstrate the importance of the estimated return period and understanding the uncertainties associated with it.
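
    A minimal peaks-over-threshold sketch of the marginal analysis described, fitting a GPD above a chosen threshold and inverting for the T-year return level, on synthetic daily flows; the 98th-percentile threshold is exactly the kind of judgment call the abstract highlights:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
years = 40
flows = stats.genextreme.rvs(c=-0.1, loc=300, scale=80,
                             size=years * 365, random_state=rng)  # synthetic record
u = np.quantile(flows, 0.98)                 # threshold choice (a key decision)
exc = flows[flows > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
lam = exc.size / years                       # mean exceedances per year

def return_level(T):
    """Level exceeded on average once every T years under the fitted GPD."""
    return u + sigma / xi * ((lam * T) ** xi - 1)

q100 = return_level(100.0)                   # estimated 1-in-100-year flow
```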

  11. Soil Carbon Change and Net Energy Associated with Biofuel Production on Marginal Lands: A Regional Modeling Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandaru, Varaprasad; Izaurralde, Roberto C.; Manowitz, David H.

    2013-12-01

    The use of marginal lands (MLs) for biofuel production has been contemplated as a promising solution for meeting biofuel demands. However, there have been concerns about the spatial location of MLs, their inherent biofuel potential, and the possible environmental consequences of cultivating energy crops. Here, we developed a new quantitative approach that integrates high-resolution land cover and land productivity maps and uses conditional probability density functions for analyzing land use patterns as a function of land productivity to classify the agricultural lands. We subsequently applied this method to determine available productive croplands (P-CLs) and non-crop marginal lands (NC-MLs) in a nine-county region of southern Michigan. Furthermore, a Spatially Explicit Integrated Modeling Framework (SEIMF) using EPIC (Environmental Policy Integrated Climate) was used to understand the net energy (NE) and soil organic carbon (SOC) implications of cultivating different annual and perennial production systems.
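
    As a toy illustration of the conditional-probability idea (my construction, not the authors' SEIMF code), the sketch below classifies pixels as productive cropland or marginal land by comparing p(land use | productivity), obtained via Bayes' rule from kernel density estimates; all data and class priors are synthetic.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      prod_crop = rng.normal(7.0, 1.5, 500)      # synthetic productivity of cropland pixels
      prod_marginal = rng.normal(3.5, 1.2, 300)  # synthetic productivity of marginal-land pixels

      kde_crop = gaussian_kde(prod_crop)
      kde_marg = gaussian_kde(prod_marginal)
      prior_crop = len(prod_crop) / (len(prod_crop) + len(prod_marginal))

      def p_crop_given_productivity(x):
          # Bayes' rule: p(crop | x) = p(x | crop) p(crop) / p(x)
          like_c = kde_crop(x) * prior_crop
          like_m = kde_marg(x) * (1.0 - prior_crop)
          return like_c / (like_c + like_m)

      for x in (2.0, 5.0, 8.0):
          print(f"productivity={x}: P(productive cropland)={p_crop_given_productivity(x)[0]:.2f}")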

  12. Statistical characterization of portal images and noise from portal imaging systems.

    PubMed

    González-López, Antonio; Morales-Sánchez, Juan; Verdú-Monedero, Rafael; Larrey-Ruiz, Jorge

    2013-06-01

    In this paper, we consider the statistical characteristics of so-called portal images, which are acquired prior to radiotherapy treatment, as well as the noise present in portal imaging systems, in order to analyze whether the well-known noise and image features of other image modalities, such as natural images, can be found in the portal imaging modality. The study is carried out in the spatial image domain, in the Fourier domain, and finally in the wavelet domain. The probability density of the noise in the spatial image domain, the power spectral densities of the image and noise, and the marginal, joint, and conditional statistical distributions of the wavelet coefficients are estimated. Moreover, the statistical dependencies between noise and signal are investigated. The results obtained are compared with practical and useful references, such as the characteristics of natural images and white noise. Finally, we discuss the implications of these results for several noise reduction methods that operate in the wavelet domain.
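
    The wavelet-domain part of such a characterization can be sketched as follows (an illustration of the general technique, not the authors' pipeline); it assumes numpy, scipy, and the PyWavelets package, and uses a random array in place of a real portal image.

      import numpy as np
      import pywt
      from scipy.stats import kurtosis

      rng = np.random.default_rng(1)
      image = rng.normal(size=(256, 256))      # placeholder for a portal image

      # One-level 2-D discrete wavelet transform; 'db4' is an arbitrary choice.
      _, (cH, cV, cD) = pywt.dwt2(image, "db4")
      coeffs = np.concatenate([c.ravel() for c in (cH, cV, cD)])

      # Marginal histogram of the detail coefficients (density-normalized).
      hist, edges = np.histogram(coeffs, bins=101, density=True)

      # scipy's kurtosis is the excess kurtosis (0 for a Gaussian); strongly
      # positive values indicate the heavy-tailed marginals typical of natural
      # images, whereas white Gaussian noise stays near 0.
      print("excess kurtosis of detail coefficients:", kurtosis(coeffs))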

  13. Reliability of Hull Girder Ultimate Strength of Steel Ships

    NASA Astrophysics Data System (ADS)

    Da-wei, Gao; Gui-jie, Shi

    2018-03-01

    Hull girder ultimate strength is an evaluation index reflecting the true safety margin, or structural redundancy, of container ships. Especially after the hull girder fracture of the MOL COMFORT, an 8,000 TEU class large container ship, on June 17, 2013, the safety of larger container ships has received much more attention. In this paper, different methods of calculating hull girder ultimate strength are first discussed and compared. The bending ultimate strength can be analyzed by the nonlinear finite element method (NFEM) and by an incremental-iterative method, and the shear ultimate strength can be analyzed by NFEM and by simple equations. Then, the probability distributions of hull girder wave loads and still-water loads of container ships are summarized. Finally, the reliability of hull girder ultimate strength under bending moment and shear forces is analyzed for three container ships using a first-order method. The conclusions can be applied to give guidance for ship design and safety evaluation.
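
    A first-order reliability computation of the kind mentioned can be sketched as follows; the independent normal models for strength and load and all numerical values are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.stats import norm

      # First-order second-moment index for independent normal R (ultimate
      # bending strength) and S (combined wave + still-water bending moment).
      mu_R, sigma_R = 6.0, 0.6        # GN*m, assumed
      mu_S, sigma_S = 3.5, 0.7        # GN*m, assumed

      beta = (mu_R - mu_S) / np.hypot(sigma_R, sigma_S)
      pf = norm.cdf(-beta)            # first-order failure probability

      print(f"reliability index beta = {beta:.2f}")
      print(f"failure probability Pf ~ {pf:.2e}")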

  14. Model selection and Bayesian inference for high-resolution seabed reflection inversion.

    PubMed

    Dettmer, Jan; Dosso, Stan E; Holland, Charles W

    2009-02-01

    This paper applies Bayesian inference, including model selection and posterior parameter inference, to inversion of seabed reflection data to resolve sediment structure at a spatial scale below the pulse length of the acoustic source. A practical approach to model selection is used, employing the Bayesian information criterion to decide on the number of sediment layers needed to sufficiently fit the data while satisfying parsimony to avoid overparametrization. Posterior parameter inference is carried out using an efficient Metropolis-Hastings algorithm for high-dimensional models, and results are presented as marginal-probability depth distributions for sound velocity, density, and attenuation. The approach is applied to plane-wave reflection-coefficient inversion of single-bounce data collected on the Malta Plateau, Mediterranean Sea, which indicate complex fine structure close to the water-sediment interface. This fine structure is resolved in the geoacoustic inversion results in terms of four layers within the upper meter of sediments. The inversion results are in good agreement with parameter estimates from a gravity core taken at the experiment site.
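
    The BIC-based selection step can be illustrated with a deliberately simple stand-in problem (choosing a polynomial order rather than a number of sediment layers); this is my toy example, not the paper's inversion code.

      import numpy as np

      rng = np.random.default_rng(3)
      x = np.linspace(0, 1, 60)
      y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0, 0.1, x.size)  # true order 2

      n = x.size
      for k in range(1, 7):                      # candidate model orders
          coef = np.polyfit(x, y, k)
          resid = y - np.polyval(coef, x)
          sigma2 = np.mean(resid**2)
          # Gaussian log-likelihood; k+2 free parameters (coefficients + variance).
          loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
          bic = (k + 2) * np.log(n) - 2.0 * loglik
          print(f"order {k}: BIC = {bic:.1f}")   # the minimum marks the preferred model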

  15. Investment risk in bioenergy crops

    DOE PAGES

    Skevas, Theodoros; Swinton, Scott M.; Tanner, Sophia; ...

    2015-11-18

    Here, perennial, cellulosic bioenergy crops represent a risky investment. The potential for adoption of these crops depends not only on mean net returns, but also on the associated probability distributions and on the risk preferences of farmers. Using 6-year observed crop yield data from highly productive and marginally productive sites in the southern Great Lakes region and assuming risk neutrality, we calculate expected breakeven biomass yields and prices compared to corn (Zea mays L.) as a benchmark. Next we develop Monte Carlo budget simulations based on stochastic crop prices and yields. The crop yield simulations decompose yield risk into three components: crop establishment survival, time to maturity, and mature yield variability. Results reveal that corn with harvest of grain and 38% of stover (as cellulosic bioenergy feedstock) is both the most profitable and the least risky investment option. It dominates all perennial systems considered across a wide range of farmer risk preferences. Although not currently attractive for profit-oriented farmers who are risk neutral or risk averse, perennial bioenergy crops.
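
    A stripped-down Monte Carlo budget simulation in the spirit described above might look as follows; the yield, price, survival, and cost figures are invented for illustration, and the three-component yield-risk decomposition is collapsed to a single establishment-survival draw.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 10_000

      yield_t_ha = rng.normal(9.0, 2.0, n).clip(min=0.0)   # biomass yield, t/ha (assumed)
      price_per_t = rng.normal(60.0, 10.0, n)              # price, $/t (assumed)
      establish_ok = rng.random(n) < 0.9                   # stand-establishment survival

      cost_per_ha = 450.0                                  # assumed production cost
      net = np.where(establish_ok, yield_t_ha * price_per_t - cost_per_ha, -cost_per_ha)

      print(f"mean net return: ${net.mean():,.0f}/ha")
      print(f"5th-95th percentile: ${np.percentile(net, 5):,.0f} to ${np.percentile(net, 95):,.0f}/ha")
      print(f"P(loss) = {np.mean(net < 0):.2f}")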

  16. Investment risk in bioenergy crops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skevas, Theodoros; Swinton, Scott M.; Tanner, Sophia

    Here, perennial, cellulosic bioenergy crops represent a risky investment. The potential for adoption of these crops depends not only on mean net returns, but also on the associated probability distributions and on the risk preferences of farmers. Using 6-year observed crop yield data from highly productive and marginally productive sites in the southern Great Lakes region and assuming risk neutrality, we calculate expected breakeven biomass yields and prices compared to corn (Zea mays L.) as a benchmark. Next we develop Monte Carlo budget simulations based on stochastic crop prices and yields. The crop yield simulations decompose yield risk into three components: crop establishment survival, time to maturity, and mature yield variability. Results reveal that corn with harvest of grain and 38% of stover (as cellulosic bioenergy feedstock) is both the most profitable and the least risky investment option. It dominates all perennial systems considered across a wide range of farmer risk preferences. Although not currently attractive for profit-oriented farmers who are risk neutral or risk averse, perennial bioenergy crops.

  17. Detection of a dynamic topography signal in last interglacial sea-level records

    PubMed Central

    Austermann, Jacqueline; Mitrovica, Jerry X.; Huybers, Peter; Rovere, Alessio

    2017-01-01

    Estimating minimum ice volume during the last interglacial based on local sea-level indicators requires that these indicators are corrected for processes that alter local sea level relative to the global average. Although glacial isostatic adjustment is generally accounted for, global scale dynamic changes in topography driven by convective mantle flow are generally not considered. We use numerical models of mantle flow to quantify vertical deflections caused by dynamic topography and compare predictions at passive margins to a globally distributed set of last interglacial sea-level markers. The deflections predicted as a result of dynamic topography are significantly correlated with marker elevations (>95% probability) and are consistent with construction and preservation attributes across marker types. We conclude that a dynamic topography signal is present in the elevation of last interglacial sea-level records and that the signal must be accounted for in any effort to determine peak global mean sea level during the last interglacial to within an accuracy of several meters. PMID:28695210

  18. Genetic Diversity of Hibiscus tiliaceus (Malvaceae) in China Assessed using AFLP Markers

    PubMed Central

    TANG, TIAN; ZHONG, YANG; JIAN, SHUGUANG; SHI, SUHUA

    2003-01-01

    Amplified fragment length polymorphism (AFLP) markers were used to investigate the genetic variation within and among nine natural populations of Hibiscus tiliaceus in China. DNA from 145 individuals was amplified with eight primer pairs. No polymorphisms were found among the 20 samples of a marginal population of recent origin, probably due to a founder effect. Across the other 125 individuals, 501 of 566 bands (88.5%) were polymorphic, and 125 unique AFLP phenotypes were observed. Estimates of genetic diversity agreed with the life-history traits and geographical distribution of H. tiliaceus. AMOVA analysis revealed that most genetic diversity resided within populations (84.8%), which corresponds to results reported for outcrossing plants. The indirect estimate of gene flow based on ϕST was moderate (Nm = 1.395). Long-distance dispersal of floating seeds and local environments may play an important role in shaping the genetic diversity of the populations and the genetic structure of this species. PMID:12930729

  19. Mapping Mesophotic Reefs Along the Brazilian Continental Margin

    NASA Astrophysics Data System (ADS)

    Bastos, A.; Moura, R.; Amado Filho, G.; Ferreira, L.; Boni, G.; Vedoato, F.; D'Agostini, D.; Lavagnino, A. C.; Leite, M. D.; Quaresma, V.

    2017-12-01

    Submerged or drowned reefs constitute an important geological record of sea level variations, forming the substrate for the colonization of modern benthic mesophotic communities. Although mapping of mesophotic reefs has increased in recent years, their spatial distribution is poorly known and the worldwide occurrence of this reef habitat may be underestimated. The importance of recognizing the distribution of mesophotic reefs is that they can act as a refuge for corals during unsuitable environmental conditions and as a repository for shallow water corals. Here we present the results of several acoustic surveys that mapped and discovered new mesophotic reefs along the Eastern and Equatorial Brazilian Continental Margin. Seabed mapping was carried out using multibeam and side scan sonars. Ground truthing was obtained using drop cameras or scuba diving. Mesophotic reefs were mapped in water depths varying from 30 to 100 m and under distinct oceanographic conditions, especially in terms of river load input and shelf width. Reefs showed distinct morphologies, from low-relief banks and paleovalleys to shelf edge ridges. Extensive low-relief banks were mapped along the most important coralline complex province in the South Atlantic, the Abrolhos Shelf. These 30 to 40 m deep banks are no more than 3 m in height and may represent fringing reefs formed during sea level stabilization. Paleovalleys mapped along the eastern margin showed the occurrence of coralgal ledges along the channel margins. Paleovalleys are usually deeper than 45 m and are associated with outer shelf rhodolith beds. Shelf edge ridges (80 to 120 m deep) were mapped along both margins and are related to red algal encrusting irregular surfaces that are more than 3 m in height, forming a rigid substrate for coral growth. Along the Equatorial Margin, off the Amazon mouth, shelf edge patch reefs and rhodolith beds forming encrusting surfaces and shelf edge ridges were mapped in water depths greater than 100 m. Thus, the occurrence of mesophotic reefs along the Brazilian Margin is influenced by transgressive morphological features, which could be used as a surrogate for mesophotic reef distribution. The extensive occurrence of rhodolith beds on the outer shelf characterizes most of these reefs.

  20. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    NASA Astrophysics Data System (ADS)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

    Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as the season of the year, the day of the week or the time of day. For deterministic radial distribution load flow studies, load is taken as constant. But load varies continually, with a high degree of uncertainty, so there is a need to model probable realistic load. Monte-Carlo simulation is used to model probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding its impact on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
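
    A sketch of the Monte-Carlo idea described above (my own toy implementation, not the authors' code): draw random loads from a normal distribution and solve a small radial feeder with a backward/forward sweep each time. The feeder topology, impedances, and load statistics are all assumed.

      import numpy as np

      Z = np.array([0.02 + 0.04j, 0.03 + 0.06j])       # line impedances: slack-1, 1-2 (pu)

      def radial_load_flow(S_load, tol=1e-8, iters=50):
          """Backward/forward sweep for a 2-branch radial feeder; S_load in pu."""
          V = np.ones(3, dtype=complex)                # bus 0 is the slack at 1.0 pu
          for _ in range(iters):
              I_load = np.conj(S_load / V[1:])         # load currents at buses 1, 2
              I_branch = np.array([I_load[0] + I_load[1], I_load[1]])  # backward sweep
              V_new = V.copy()
              V_new[1] = V[0] - Z[0] * I_branch[0]     # forward sweep
              V_new[2] = V_new[1] - Z[1] * I_branch[1]
              if np.max(np.abs(V_new - V)) < tol:
                  return V_new
              V = V_new
          return V

      rng = np.random.default_rng(11)
      mean_S = np.array([0.5 + 0.2j, 0.4 + 0.2j])      # mean loads (pu), assumed

      v_min = []
      for _ in range(2000):                            # Monte-Carlo over load uncertainty
          S = mean_S * (1.0 + 0.2 * rng.normal(size=2))  # 20% std on both P and Q
          V = radial_load_flow(S)
          v_min.append(np.min(np.abs(V)))

      v_min = np.array(v_min)
      print(f"min bus voltage: mean {v_min.mean():.4f} pu, std {v_min.std():.4f} pu")
      print(f"P(V_min < 0.95 pu) = {np.mean(v_min < 0.95):.3f}")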

  1. Probability Distribution of Turbulent Kinetic Energy Dissipation Rate in Ocean: Observations and Approximations

    NASA Astrophysics Data System (ADS)

    Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.

    2017-10-01

    The probability distribution of the turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for, and often observed in, unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the dissipation rate ɛ̃_r in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, a lognormal distribution of ɛ̃_r is preferable, although the Burr is an acceptable alternative. The skewness Sk_ɛ and the kurtosis K_ɛ of the dissipation rate appear to be well correlated over a wide range of Sk_ɛ and K_ɛ variability.
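
    The model comparison described can be sketched with scipy's Burr XII and lognormal distributions; the sample below is synthetic and normalized, and fitting real dissipation data would of course require the authors' processing steps.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # Synthetic, positively skewed stand-in for normalized dissipation values.
      eps = stats.burr12.rvs(c=2.0, d=1.5, size=2000, random_state=rng)

      c, d, _, sc = stats.burr12.fit(eps, floc=0.0)
      s, _, sc_ln = stats.lognorm.fit(eps, floc=0.0)

      # Lower AIC (2k - 2 log-likelihood) indicates the better-fitting model.
      ll_burr = stats.burr12.logpdf(eps, c, d, loc=0.0, scale=sc).sum()
      ll_logn = stats.lognorm.logpdf(eps, s, loc=0.0, scale=sc_ln).sum()
      print(f"AIC Burr XII : {2 * 3 - 2 * ll_burr:.1f}")
      print(f"AIC lognormal: {2 * 2 - 2 * ll_logn:.1f}")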

  2. Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials

    DTIC Science & Technology

    1978-03-01

    This AFIT thesis (AFIT/GAE) evaluates the three-parameter Weibull distribution function for predicting fracture probability in composite materials. The surviving abstract fragment derives an equation for the risk of rupture of a unidirectionally laminated composite subjected to pure bending, which can be simplified further.
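
    For reference, the fracture probability implied by a three-parameter Weibull model has the closed form P_f(s) = 1 - exp(-((s - s0)/eta)^beta) for stress s above the threshold s0; the sketch below evaluates it with invented parameter values, not ones from the thesis.

      import math

      def weibull3_failure_probability(s, s0=200.0, eta=350.0, beta=4.0):
          """P_f at stress s (MPa); threshold s0, scale eta and shape beta
          are illustrative values only."""
          if s <= s0:
              return 0.0
          return 1.0 - math.exp(-(((s - s0) / eta) ** beta))

      for s in (150, 300, 500, 700):
          print(f"stress {s} MPa -> P_f = {weibull3_failure_probability(s):.3f}")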

  3. Adequate margins for random setup uncertainties in head-and-neck IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astreinidou, Eleftheria; Bel, Arjan; Raaijmakers, Cornelis P.J.

    2005-03-01

    Purpose: To investigate the effect of random setup uncertainties on the highly conformal dose distributions produced by intensity-modulated radiotherapy (IMRT) for clinical head-and-neck cancer patients and to determine adequate margins to account for those uncertainties. Methods and materials: We have implemented in our clinical treatment planning system the possibility of simulating normally distributed patient setup displacements, translations, and rotations. The planning CT data of 8 patients with Stage T1-T3N0M0 oropharyngeal cancer were used. The clinical target volumes of the primary tumor (CTV_primary) and of the lymph nodes (CTV_elective) were expanded by 0.0, 1.5, 3.0, and 5.0 mm in all directions, creating the planning target volumes (PTVs). We performed IMRT dose calculation using our class solution for each PTV margin, resulting in the conventional static plans. Then, the system recalculated the plan for each positioning displacement derived from a normal distribution with σ = 2 mm and σ = 4 mm (standard deviation) for translational deviations and σ = 1° for rotational deviations. The dose distributions of the 30 fractions were summed, resulting in the actual plan. The CTV dose coverage of the actual plans was compared with that of the static plans. Results: Random translational deviations of σ = 2 mm and rotational deviations of σ = 1° did not affect the CTV_primary volume receiving 95% of the prescribed dose (V95), regardless of the PTV margin used. A V95 reduction of 3% and 1% for a 0.0-mm and 1.5-mm PTV margin, respectively, was observed for σ = 4 mm. The V95 of the contralateral CTV_elective was approximately 1% and 5% lower than that of the static plan for σ = 2 mm and σ = 4 mm, respectively, for PTV margins < 5.0 mm. An additional reduction of 1% was observed when rotational deviations were included. The same effect was observed for the ipsilateral CTV_elective, but with smaller dose differences than those for the contralateral side. The effect of the random uncertainties on the mean dose to the parotid glands was not significant. The maximal dose to the spinal cord increased by a maximum of 3 Gy. Conclusions: The margins to account for random setup uncertainties in our clinical IMRT solution should be 1.5 mm and 3.0 mm in the case of σ = 2 mm and σ = 4 mm, respectively, for the CTV_primary. Larger margins (5.0 mm), however, should be applied to the CTV_elective if the goal of treatment is a V95 value of at least 99%.
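
    The fraction-by-fraction simulation idea can be caricatured in one dimension as follows (my toy model, not the clinical planning system): a uniform field covering the CTV plus a margin is shifted by a random setup error each fraction, and the summed dose inside the CTV is checked against the 95% level.

      import numpy as np

      rng = np.random.default_rng(9)
      x = np.linspace(-50, 50, 1001)            # position, mm
      ctv_half = 20.0                           # CTV spans -20 to +20 mm (assumed)

      def v95_after_course(margin_mm, sigma_mm, fractions=30, trials=500):
          field_half = ctv_half + margin_mm
          in_ctv = np.abs(x) <= ctv_half
          v95 = []
          for _ in range(trials):
              shifts = rng.normal(0.0, sigma_mm, fractions)
              # Dose = fraction of treatments in which each point was inside the field.
              dose = sum((np.abs(x - s) <= field_half).astype(float) for s in shifts)
              dose /= fractions                 # 1.0 = full prescribed dose
              v95.append(np.mean(dose[in_ctv] >= 0.95))
          return np.mean(v95)

      for margin in (0.0, 1.5, 3.0, 5.0):
          print(f"margin {margin} mm, sigma 4 mm: mean V95 = {v95_after_course(margin, 4.0):.3f}")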

  4. Sandstone detrital modes and basinal setting of the Trinity Peninsula Group, northern Graham Land, Antarctic Peninsula: A preliminary survey

    NASA Astrophysics Data System (ADS)

    Smellie, J. L.

    Sandstone detrital modes for a representative sample of the Trinity Peninsula Group in northern Graham Land are described and assessed. Whereas the volumetrically dominant quartz and feldspar were derived principally from erosion of a plutonic and high-rank metamorphic terrane, the lithic population was derived mainly from a volcanic cover. The data clearly indicate the presence of two major sandstone suites (petrofacies I and II) with distinctive and probably separate provenances. Further scope for subdivision is limited by the small sample set, but four petrofacies (Ia, Ib, IIa, and IIb) may be present, three of which correspond with previously described lithostratigraphical units (Legoupil, Hope Bay, and View Point formations). The sample distribution and detrital modes enable approximate geographical limits to be assigned to each petrofacies for the first time, although the nature of the boundaries (stratigraphical or structural) is unknown. Petrofacies II could have been derived from an active magmatic arc and deposited in a forearc basin (sensu lato) or series of basins at a major consuming margin. Petrofacies I is a much more quartzose suite, although otherwise petrographically very similar to petrofacies II. Its depositional setting is ambiguous on the basis of the data presently available, and deposition can only be said to have occurred at either an active or a passive continental margin. Finally, there is the possibility that strike-slip faulting has structurally shuffled the Trinity Peninsula Group, causing the pronounced age and compositional contrasts observed.

  5. Geologic framework of oil and gas genesis in main sedimentary basins from Romania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dicea, Oprea; Ionescu, N.; Morariu, C.D.

    1991-03-01

    Oil and gas fields located in the Moldavic nappes are encompassed in Oligocene and lower Miocene formations, mostly in the marginal folds nappe, where Kliwa Sandstone sequences have high porosity, and in the Black Sea Plateau. The origin of the hydrocarbon accumulations of the Carpathian foredeep seems to be connected to the Oligocene-lower Miocene bituminous formations of the marginal folds and sub-Carpathian nappes. In the Getic depression, the hydrocarbon accumulations originate in Oligocene and Miocene source rocks and are hosted in structural, stratigraphic, and lithological traps. The accumulations connected with tectonic lines that outline the areal extension of the Oligocene, Miocene, and Pliocene formations are in the underthrust Moesian platform. The hydrocarbon accumulations related to the Carpathian foreland represent about 40% of all known accumulations in Romania. Most of them are located in the Moesian platform. In this unit, the oil and gas fields present a vertical distribution at different stratigraphic levels, from Paleozoic to Neogene, and in all types of reservoirs, suggesting multiple cycles of oleogenesis, migration, accumulation, and sealing. The hydrocarbon deposits known so far on the Black Sea continental plateau are confined to the Albian, Cenomanian, Turonian-Senonian, and Eocene formations. The traps are of complex types: structural, lithologic, and stratigraphic. The reservoirs are sandstones, calcareous sandstones, limestones, and sands. The hydrocarbon source rocks are pelitic and siltic Oligocene formations. Other older source rocks are probably Cretaceous.

  6. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
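
    As a minimal concrete example of the Monte Carlo methods mentioned (mine, not one of the article's worked examples), the sketch below propagates input uncertainties through the measurement model P = V^2 / R.

      import numpy as np

      rng = np.random.default_rng(13)
      n = 200_000

      # Example measurement model: electrical power P = V^2 / R.
      V = rng.normal(230.0, 0.5, n)        # volts, standard uncertainty 0.5 V (assumed)
      R = rng.normal(52.9, 0.2, n)         # ohms, standard uncertainty 0.2 ohm (assumed)

      P = V**2 / R

      print(f"P = {P.mean():.1f} W, u(P) = {P.std(ddof=1):.2f} W")
      lo, hi = np.percentile(P, [2.5, 97.5])
      print(f"95% coverage interval: [{lo:.1f}, {hi:.1f}] W")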

  7. Methods for determining time of death.

    PubMed

    Madea, Burkhard

    2016-12-01

    Medicolegal death time estimation must determine the time since death reliably. Reliability can only be established empirically, by statistical analysis of errors in field studies. Determining the time since death requires the back-calculation of measurable data along a time-dependent curve to its starting point. Various methods are used to estimate the time since death. The current gold standard for death time estimation is a previously established nomogram method based on the two-exponential model of body cooling. Great experimental and practical achievements have been realized using this nomogram method. To reduce the margin of error of the nomogram method, a compound method was developed based on the electrical and mechanical excitability of skeletal muscle, pharmacological excitability of the iris, rigor mortis, and postmortem lividity. Further increases in the accuracy of death time estimation involve the development of conditional probability distributions based on the compound method. Although many studies have evaluated chemical methods of death time estimation, such methods play a marginal role in daily forensic practice. However, increased precision of death time estimation has recently been achieved by considering various influencing factors (i.e., preexisting diseases, duration of the terminal episode, and ambient temperature). Putrefactive changes may be used for death time estimation in water-immersed bodies. Furthermore, recently developed technologies, such as ¹H magnetic resonance spectroscopy, can be used to quantitatively study decompositional changes. This review addresses the gold standard method of death time estimation in forensic practice and promising technological and scientific developments in the field.
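
    The two-exponential cooling model behind the nomogram method can be sketched as below; the constants follow Henssge's published formulation as I recall it (for ambient temperatures up to about 23 °C), so treat them as assumptions to verify rather than a reference implementation.

      import math
      from scipy.optimize import brentq

      def time_since_death(T_rectal, T_ambient, mass_kg):
          """Hours since death from the two-exponential cooling model:
          Q = 1.25*exp(B*t) - 0.25*exp(5*B*t), B from body mass (assumed form)."""
          Q = (T_rectal - T_ambient) / (37.2 - T_ambient)   # standardized temperature
          B = -1.2815 * mass_kg**-0.625 + 0.0284
          f = lambda t: 1.25 * math.exp(B * t) - 0.25 * math.exp(5 * B * t) - Q
          return brentq(f, 0.01, 72.0)                      # bracket the root in 0-72 h

      print(f"estimated interval: {time_since_death(30.0, 18.0, 75.0):.1f} h")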

  8. Understanding Peripheral Bat Populations Using Maximum-Entropy Suitability Modeling

    PubMed Central

    Barnhart, Paul R.; Gillam, Erin H.

    2016-01-01

    Individuals along the periphery of a species distribution regularly encounter more challenging environmental and climatic conditions than conspecifics near the center of the distribution. Due to these potential constraints, individuals in peripheral margins are expected to change their habitat and behavioral characteristics. Managers typically rely on species distribution maps when developing adequate management practices. However, these range maps are often too simplistic and do not provide adequate information as to which fine-scale biotic and abiotic factors drive a species' occurrence. In the last decade, habitat suitability modelling has become widely used as a substitute for simplistic distribution mapping, allowing regional managers to fine-tune management resources. The objectives of this study were to use maximum-entropy modeling to produce habitat suitability models for seven species whose peripheral margins intersect the state of North Dakota, according to current IUCN distributions, and to determine the vegetative and climatic characteristics driving these models. Mist-netting resulted in the documentation of five species outside their IUCN distributions in North Dakota, indicating that current range maps for North Dakota, and potentially the northern Great Plains, are in need of updating. Maximum-entropy modeling showed that temperature, not precipitation, was the variable most important for model production. This fine-scale result highlights the importance of habitat suitability modelling, as this information cannot be extracted from distribution maps. Our results provide baseline information needed for future research about how and why individuals residing in the peripheral margins of a species' distribution may show marked differences in habitat use, as a result of urban expansion, habitat loss, and climate change, compared to more centralized populations. PMID:27935936

  9. Assessment of undiscovered petroleum resources of the Amerasia Basin Petroleum Province

    USGS Publications Warehouse

    Houseknecht, David W.; Bird, Kenneth J.; Garrity, Christopher P.

    2012-01-01

    The Amerasia Basin Petroleum Province encompasses the Canada Basin and the sediment prisms along the Alaska and Canada margins, outboard from basinward margins (hingelines) of the rift shoulders that formed during extensional opening of the Canada Basin. The province includes the Mackenzie delta and slope, the outer shelves and marine slopes along the Arctic margins of Alaska and Canada, and the deep Canada Basin. The province is divided into four assessment units (AUs): (1) The Canning-Mackenzie deformed margin AU is that part of the rifted margin where the Brooks Range orogenic belt has overridden the rift shoulder and is deforming the rifted-margin prism of sediment outboard of the hingeline. This is the only part of the Amerasia Basin Province that has been explored and—even though more than 3 billion barrels of oil equivalent (BBOE) of oil, gas, and condensate have been discovered—none has been commercially produced. (2) The Alaska passive margin AU is the rifted-margin prism of sediment lying beneath the Beaufort outer shelf and slope that has not been deformed by tectonism. (3) The Canada passive margin AU is the rifted-margin prism of sediment lying beneath the Arctic outer shelf and slope (also known as the polar margin) of Canada that has not been deformed by tectonism. (4) The Canada Basin AU includes the sediment wedge that lies beneath the deep Canada Basin, north of the marine slope developed along the Alaska and Canada margins. Mean estimates of risked, undiscovered, technically recoverable resources include more than 6 billion barrels of oil (BBO), more than 19 trillion cubic feet (TCF) of associated gas, and more than 16 TCF of nonassociated gas in the Canning-Mackenzie deformed margin AU; about 1 BBO, about 3 TCF of associated gas, and about 3 TCF of nonassociated gas in the Alaska passive margin AU; and more than 2 BBO, about 7 TCF of associated gas, and about 8 TCF of nonassociated gas in the Canada passive margin AU. Quantities of natural gas liquids also are assessed in each AU. The Canada Basin AU was not quantitatively assessed because it is judged to hold less than 10 percent probability of containing at least one accumulation of 50 million barrels of oil equivalent.

  10. Geophysical evidence for the crustal variation and distribution of magmatism along the central coast of Mozambique

    NASA Astrophysics Data System (ADS)

    Mueller, Christian Olaf; Jokat, Wilfried

    2017-08-01

    A consistent image of the crustal composition of the conjugate margins of central Mozambique and Antarctica, and of the location of their continent-ocean boundaries, is still missing from our understanding of the timing and geometry of the initial Gondwana break-up. In this regard, a main objective is to explain the source of the different magnetic signatures of the conjugate margins. Based on a revised investigation of wide-angle seismic data along two profiles across the Mozambican margin by means of amplitude modelling, this study presents the crustal composition across and along the continental margin of central Mozambique. Supported by 2D magnetic modelling, the results are compared to the conjugate margin in Antarctica and allow new conclusions about their joint tectonic evolution. An observed crustal diversity between the north-eastern and south-western parts of the central Mozambican margin testifies to the complex break-up history of this area. Conspicuous is the equal spatial extent of the high-velocity lower crustal body (HVLCB) along the margin, 190-215 km. The onset of oceanic crust at the central Mozambican margin is refined to chron M38n.2n (164.1 Ma). Magnetic modelling supports the presence of reversely polarized SDRs in the continent-ocean transition that were mainly emplaced between 168.5 and 166.8 Ma (M42-M40). Inferred SDRs in the Riiser-Larsen Sea might have been emplaced sometime between 166.8 and 164.1 Ma (M39-M38), but were overprinted by normally polarized intrusions of a late stage of rift volcanism, causing the opposite magnetic signatures of the conjugate margins. The distribution of the magmatic material along the central coast of Mozambique clearly indicates the eastern extension of the north-eastern branch of the Karoo triple rift along the entire margin. The main magmatic phase affecting this area lasted for at least 12 Myr, between 169 and 157 Ma, followed by the cessation of magmatism, perhaps due to the relative southward motion of the magmatic centre.

  11. Circumferential resection margin positivity after preoperative chemoradiotherapy based on magnetic resonance imaging for locally advanced rectal cancer: implication of boost radiotherapy to the involved mesorectal fascia.

    PubMed

    Kim, Kyung Hwan; Park, Min Jung; Lim, Joon Seok; Kim, Nam Kyu; Min, Byung Soh; Ahn, Joong Bae; Kim, Tae Il; Kim, Ho Geun; Koom, Woong Sub

    2016-04-01

    To identify patients who are at a higher risk of pathologic circumferential resection margin involvement using preoperative magnetic resonance imaging. Between October 2008 and November 2012, 165 patients with locally advanced rectal cancer (cT4 or cT3 with <2 mm distance from tumour to mesorectal fascia) who received preoperative chemoradiotherapy were analysed. The morphologic patterns on post-chemoradiotherapy magnetic resonance imaging were categorized into five patterns, from Pattern A (most likely negative pathologic circumferential resection margin) to Pattern E (most likely positive pathologic circumferential resection margin). In addition, the location of mesorectal fascia involvement was classified as lateral, posterior or anterior. The diagnostic accuracy of the morphologic criteria was calculated using receiver operating characteristic curve analysis. Pathologic circumferential resection margin involvement was identified in 17 patients (10.3%). The diagnostic accuracy of predicting pathologic circumferential resection margin involvement was 0.73 using the five-scale magnetic resonance imaging pattern. The sensitivity, specificity, positive predictive value and negative predictive value for predicting pathologic circumferential resection margin involvement were 76.5, 65.5, 20.3 and 96.0%, respectively, when the cut-off was set between Patterns C and D. On multivariate logistic regression, magnetic resonance imaging Patterns D and E (P = 0.005) and posterior or lateral mesorectal fascia involvement (P = 0.017) were independently associated with an increased probability of pathologic circumferential resection margin involvement. The rate of pathologic circumferential resection margin involvement was 30.0% when the patient had Pattern D or E with posterior or lateral mesorectal fascia involvement. Patients who are at a higher risk of pathologic circumferential resection margin involvement can be identified using preoperative magnetic resonance imaging, although the predictability is moderate.
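
    The reported predictive values are consistent with the quoted sensitivity, specificity and prevalence via Bayes' rule, as this short check shows:

      sens, spec = 0.765, 0.655
      prev = 17 / 165                      # pathologic CRM involvement (10.3%)

      ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
      npv = spec * (1 - prev) / ((1 - sens) * prev + spec * (1 - prev))
      print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")   # ~20.3% and ~96.0%, as reported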

  12. New Insights into Passive Margin Development from a Global Deep Seismic Reflection Dataset

    NASA Astrophysics Data System (ADS)

    Bellingham, Paul; Pindell, James; Graham, Rod; Horn, Brian

    2014-05-01

    The kinematic and dynamic evolution of the world's passive margins is still poorly understood. Yet the need to replace reserves, a high oil price and advances in drilling technology have pushed the international oil and gas industry to explore in the deep and ultra-deep waters of the continental margins. To support this exploration and help understand these margins, ION-GXT has acquired, processed and interpreted BasinSPAN surveys across many of the world's passive margins. Observations from these data lead us to consider the modes of subsidence and uplift at both volcanic and non-volcanic margins. At non-volcanic margins, it appears that much of the subsidence frequently post-dates major rifting and is not thermal in origin. Rather, the subsidence is associated with extensional displacement on a major fault or shear zone running at least as deep as the continental Moho. We believe that the subsidence is structural and is probably associated with the pinching out (boudinage) of the lower crust, so that the upper crust effectively collapses onto the mantle. Eventually this will lead to the exhumation of the sub-continental mantle at the sea bed. Volcanic margins present more complex challenges, both in terms of imaging and interpretation. The addition of volcanic and plutonic material into the system and dynamic effects all impact subsidence and uplift. However, we will show some fundamental observations regarding the kinematic development of volcanic margins, and especially SDRs, which demonstrate that the process of collapse and the development of shear zones within and below the crust also operate at this type of margin. A model is presented of 'magma welds' whereby packages of SDRs collapse onto an emerging sub-crustal shear zone, and it is this collapse which creates the commonly observed SDR geometry. Examples will be shown from East India, Newfoundland, Brazil, Argentina and the Gulf of Mexico.

  13. Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.

    2015-01-01

    Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that this variability can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the design margin concept with one of failure probability.

  14. High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm

    ERIC Educational Resources Information Center

    Cai, Li

    2010-01-01

    A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
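
    The Robbins-Monro half of the algorithm can be illustrated in isolation (a toy scalar example, not the MH-RM estimator itself): stochastic approximation finds the root of a regression function from noisy evaluations alone, with gains decreasing as 1/k.

      import numpy as np

      rng = np.random.default_rng(17)
      theta = 0.0                          # starting value
      target = 2.5                         # true root of M(theta) = theta - target

      for k in range(1, 20_001):
          noisy = (theta - target) + rng.normal(0.0, 1.0)  # unbiased noisy M(theta)
          theta -= noisy / k                               # Robbins-Monro update
      print(f"estimate after 20000 steps: {theta:.3f}")    # converges to ~2.5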

  15. New tax law hobbles tax-exempt hospitals.

    PubMed

    Goldblatt, S J

    1982-03-01

    The Economic Recovery Tax Act of 1981 left tax-exempt hospitals at a significant disadvantage in the competition for capital. Although the new law's accelerated depreciation schedules and liberalized investment tax credits contain some marginal benefits for tax-exempt hospitals, these benefits are probably more than offset by the impact of the law on charitable giving.

  16. Large margin nearest neighbor classifiers.

    PubMed

    Domeniconi, Carlotta; Gunopulos, Dimitrios; Peng, Jing

    2005-07-01

    The nearest neighbor technique is a simple and appealing approach to addressing classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. The employment of a locally adaptive metric becomes crucial in order to keep class conditional probabilities close to uniform, thereby minimizing the bias of estimates. We propose a technique that computes a locally flexible metric by means of support vector machines (SVMs). The decision function constructed by SVMs is used to determine the most discriminant direction in a neighborhood around the query. Such a direction provides a local feature weighting scheme. We formally show that our method increases the margin in the weighted space where classification takes place. Moreover, our method has the important advantage of online computational efficiency over competing locally adaptive techniques for nearest neighbor classification. We demonstrate the efficacy of our method using both real and simulated data.
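
    A deliberately simplified, global version of this idea (the paper's scheme computes the weighting locally around each query) can be sketched with scikit-learn: the weight vector of a linear SVM supplies a discriminant direction used to rescale features before nearest-neighbor classification.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import LinearSVC

      X, y = make_classification(n_samples=600, n_features=20, n_informative=4,
                                 random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      # Use the linear SVM's weight vector as a (global) feature weighting.
      w = np.abs(LinearSVC(dual=False).fit(Xtr, ytr).coef_.ravel())

      knn_plain = KNeighborsClassifier().fit(Xtr, ytr)
      knn_weighted = KNeighborsClassifier().fit(Xtr * w, ytr)

      print("plain kNN accuracy   :", knn_plain.score(Xte, yte))
      print("weighted kNN accuracy:", knn_weighted.score(Xte * w, yte))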

  17. Mental health status and healthcare utilization among community dwelling older adults.

    PubMed

    Adepoju, Omolola; Lin, Szu-Hsuan; Mileski, Michael; Kruse, Clemens Scott; Mask, Andrew

    2018-04-27

    Shifts in mental health utilization patterns are necessary to allow for meaningful access to care for vulnerable populations. There have been long-standing issues in how mental health care is provided, which have limited how efficacious that care is for those seeking it. This study assesses the relationship between mental health status and healthcare utilization among adults ≥65 years. A negative binomial regression model was used to assess the relationship between mental health status and office-based physician visits, while a two-part model, consisting of logistic regression and negative binomial regression, was used to separately model emergency visits and inpatient services. The receipt of care in office-based settings was marginally higher for subjects with mental health difficulties. Both probabilities and counts of inpatient hospitalizations were similar across mental health categories. The count of ER visits was also similar across mental health categories; however, the probability of having an emergency department visit was marginally higher for older adults who reported mental health difficulties in 2012. These findings are encouraging and lend promise to recent initiatives on addressing gaps in mental healthcare services.

  18. Structure and regional significance of the Late Permian(?) Sierra Nevada - Death Valley thrust system, east-central California

    USGS Publications Warehouse

    Stevens, C.H.; Stone, P.

    2005-01-01

    An imbricate system of north-trending, east-directed thrust faults of late Early Permian to middle Early Triassic (most likely Late Permian) age forms a belt in east-central California extending from the Mount Morrison roof pendant in the eastern Sierra Nevada to Death Valley. Six major thrust faults, typically with a spacing of 15-20 km, original dips probably of 25-35°, and stratigraphic throws of 2-5 km, compose this structural belt, which we call the Sierra Nevada-Death Valley thrust system. These thrusts presumably merge into a décollement at depth, perhaps at the contact with crystalline basement, the position of which is unknown. We interpret the deformation that produced these thrusts to have been related to the initiation of convergent plate motion along a southeast-trending continental margin segment probably formed by Pennsylvanian transform truncation. This deformation apparently represents a period of tectonic transition to full-scale convergence and arc magmatism along the continental margin beginning in the Late Triassic in central California.

  19. Fast changes in seasonal forest communities due to soil moisture increase after damming.

    PubMed

    do Vale, Vagner Santiago; Schiavini, Ivan; Araújo, Glein Monteiro; Gusson, André Eduardo; Lopes, Sérgio de Faria; de Oliveira, Ana Paula; do Prado-Júnior, Jamir Afonso; Arantes, Carolina de Silvério; Dias-Neto, Olavo Custodio

    2013-12-01

    Local changes caused by dams can have drastic consequences for ecosystems, not only because they change the water regime but also because they modify lakeshore areas. This work therefore aimed to determine the changes in soil moisture after damming, in order to understand the consequences of this modification for the arboreal community of dry forests, some of the most endangered systems on the planet. We studied these changes in soil moisture and the arboreal community in three dry forests in the Araguari River Basin after the construction of two dams in 2005 and 2006, and the potential effects on these forests. For this, plots of 20 m x 10 m were distributed close to the impoundment margin and perpendicular to the dam margin in two deciduous dry forests and one semi-deciduous dry forest located in southeastern Brazil, totaling 3.6 ha sampled. In addition, soil analyses were undertaken before and after impoundment at three different depths (0-10, 20-30 and 40-50 cm). A tree community inventory (minimum DBH of 4.77 cm) was made before (T0) and at two (T2) and four (T4) years after damming. Annual dynamic rates of all communities were calculated, and statistical tests were used to determine changes in soil moisture and tree communities. The analyses confirmed soil moisture increases in all forests, especially during the dry season and at sites closer to the reservoir; an increase in basal area due to the fast growth of many trees was also observed. The highest turnover occurred in the first two years after impoundment, mainly due to higher tree mortality, especially of trees closer to the dam margin. All forests showed reductions in dynamic rates in subsequent years (T2-T4), indicating that these forests tended to stabilize after a strong initial impact. The modifications were more extensive in the deciduous forests, probably because the dry period is more rigorous in these forests than in the semi-deciduous forest. The new shorelines created by damming increased soil moisture in the dry season, making plant growth easier. We conclude that several changes occurred in the T0-T2 period and at 0-30 m from the impoundment, mainly in the deciduous forests, where the community turned into a "riparian-deciduous forest" with large basal area in these patches. However, unlike other transitory disturbances, damming is a permanent alteration and transforms the landscape into a different scenario, probably with major long-term consequences for the environment.

  20. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    PubMed

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
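
    A minimal version of a copula-transformed autoregressive series (using a fixed exponential marginal in place of the paper's nonparametric Bayesian one) can be written as follows: normal-theory AR(1) dynamics are pushed through the cdf-inverse cdf transform to the target marginal.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(21)
      phi, n = 0.8, 5000

      z = np.empty(n)                           # latent Gaussian AR(1), unit variance
      z[0] = rng.normal()
      for t in range(1, n):
          z[t] = phi * z[t - 1] + np.sqrt(1 - phi**2) * rng.normal()

      u = stats.norm.cdf(z)                     # uniformized series (the copula scale)
      x = stats.expon.ppf(u, scale=2.0)         # target exponential(mean 2) marginal

      print(f"sample mean {x.mean():.2f} (target 2.0), "
            f"lag-1 corr {np.corrcoef(x[:-1], x[1:])[0, 1]:.2f}")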

  1. Distribution and depth of bottom-simulating reflectors in the Nankai subduction margin.

    PubMed

    Ohde, Akihiro; Otsuka, Hironori; Kioka, Arata; Ashi, Juichiro

    2018-01-01

    Surface heat flow has been observed to be highly variable in the Nankai subduction margin. This study presents an investigation of local anomalies in surface heat flows on the undulating seafloor in the Nankai subduction margin. We estimate the heat flows from bottom-simulating reflectors (BSRs) marking the lower boundaries of the methane hydrate stability zone and evaluate topographic effects on heat flow via two-dimensional thermal modeling. BSRs have been used to estimate heat flows based on the known stability characteristics of methane hydrates under low-temperature and high-pressure conditions. First, we generate an extensive map of the distribution and subseafloor depths of the BSRs in the Nankai subduction margin. We confirm that BSRs exist at the toe of the accretionary prism and the trough floor of the offshore Tokai region, where BSRs had previously been thought to be absent. Second, we calculate the BSR-derived heat flow and evaluate the associated errors. We conclude that the total uncertainty of the BSR-derived heat flow should be within 25%, considering allowable ranges in the P-wave velocity, which influences the time-to-depth conversion of the BSR position in seismic images, the resultant geothermal gradient, and thermal resistance. Finally, we model a two-dimensional thermal structure by comparing the temperatures at the observed BSR depths with the calculated temperatures at the same depths. The thermal modeling reveals that most local variations in BSR depth over the undulating seafloor can be explained by topographic effects. Those areas that cannot be explained by topographic effects can be mainly attributed to advective fluid flow, regional rapid sedimentation, or erosion. Our spatial distribution of heat flow data provides indispensable basic data for numerical studies of subduction zone modeling to evaluate margin-parallel age dependencies of subducting plates.
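
    The core of a BSR-derived estimate is a one-line conductive heat-flow calculation; the sketch below uses placeholder values for seafloor temperature, BSR temperature, and thermal conductivity rather than the paper's calibrated ones.

      def bsr_heat_flow(z_bsr_m, T_bsr_C, T_seafloor_C=2.0, k_W_mK=1.0):
          """Conductive heat flow q = k * dT/dz, returned in mW/m^2.
          All parameter values here are illustrative placeholders."""
          gradient = (T_bsr_C - T_seafloor_C) / z_bsr_m     # deg C per metre
          return 1e3 * k_W_mK * gradient

      # Example: a BSR 300 m below the seafloor at an assumed 21 C.
      print(f"q = {bsr_heat_flow(z_bsr_m=300.0, T_bsr_C=21.0):.0f} mW/m^2")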

  2. Margin based ontology sparse vector learning algorithm and applied in biology science.

    PubMed

    Gao, Wei; Qudair Baig, Abdul; Ali, Haidar; Sajjad, Wasim; Reza Farahani, Mohammad

    2017-01-01

    In the biology field, ontology applications involve large amounts of genetic information and chemical information on molecular structure, so the knowledge expressed by ontology concepts carries a great deal of information. As a result, in mathematical notation, the vector corresponding to an ontology concept often has very high dimension, which places higher demands on ontology algorithms. Against this background, we consider the design of an ontology sparse vector algorithm and its application in biology. In this paper, using knowledge of marginal likelihood and marginal distributions, an optimized strategy for a margin-based ontology sparse vector learning algorithm is presented. Finally, the new algorithm is applied to the gene ontology and the plant ontology to verify its efficiency.

  3. Variations of mesoscale and large-scale sea ice morphology in the 1984 Marginal Ice Zone Experiment as observed by microwave remote sensing

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Josberger, E. G.; Gloersen, P.; Johannessen, O. M.; Guest, P. S.

    1987-01-01

    The data acquired during the summer 1984 Marginal Ice Zone Experiment in the Fram Strait-Greenland Sea marginal ice zone, using airborne active and passive microwave sensors and the Nimbus 7 SMMR, were analyzed to compile a sequential description of the mesoscale and large-scale ice morphology variations during the period of June 6 - July 16, 1984. Throughout the experiment, the long ice edge between northwest Svalbard and central Greenland meandered; eddies were repeatedly formed, moved, and disappeared but the ice edge remained within a 100-km-wide zone. The ice pack behind this alternately diffuse and compact edge underwent rapid and pronounced variations in ice concentration over a 200-km-wide zone. The high-resolution ice concentration distributions obtained in the aircraft images agree well with the low-resolution distributions of SMMR images.

  4. Cascadia Gas Vent Distribution and Challenges to Quantify Margin-Wide Methane Fluxes

    NASA Astrophysics Data System (ADS)

    Scherwath, M.; Riedel, M.; Roemer, M.; Veloso, M.; Heesemann, M.; Spence, G.

    2017-12-01

    Gas venting along the Cascadia Margin has been mapped over decades with ship sonar and in recent years with permanent seafloor installations utilizing the seafloor observatories NEPTUNE of Ocean Networks Canada and the Cabled Array of the Ocean Observatories Initiative. We show the distribution of over 1000 vents, most on the shallow shelf. For a third of the vents we have estimated methane flow rates, ranging from 0.05 to 69 L/min, and extrapolate these results to a margin-wide methane flow estimate of around 4 Mt/yr (at surface pressure and temperature) and a flux estimate of 0.05 kg yr-1 m-2. However, these estimates are based on several assumptions, e.g. bubble sizes or data coverage, which introduce large uncertainties. With continued research expeditions and potential seafloor calibration experiments, these data can be refined and improved in the coming years.
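
    For scale, converting a single vent's volumetric flow at surface conditions to a mass flow is simple arithmetic (my calculation, with an assumed methane density and a mid-range flow picked from the interval quoted above):

      rho_ch4 = 0.657                 # kg/m^3, methane at ~25 C and 1 atm (assumed)
      litres_per_min = 10.0           # a mid-range single-vent flow from the text

      m3_per_year = litres_per_min * 1e-3 * 60 * 24 * 365
      print(f"{m3_per_year:.0f} m^3/yr -> {m3_per_year * rho_ch4:.0f} kg CH4/yr per vent")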

  5. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4x10-4 for the exponential distribution and 2.3x10-4 for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5x10-4, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈ 5 km3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6x10-4. For erupted volumes ≥10 km3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of a large eruption in the next year at 1.4x10-5.
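
    Under the exponential (homogeneous Poisson) model, the quoted annual probabilities translate directly into probabilities over longer horizons; the sketch below uses a mean repose time chosen to be consistent with the ~1.4x10-4 annual value quoted for the Lassen Volcanic Center.

      import math

      mu = 7000.0                                # mean repose time in years (implied by 1.4e-4/yr)
      for t in (1.0, 30.0, 100.0):
          # Probability of at least one eruption within t years: 1 - exp(-t/mu).
          print(f"P(eruption within {t:>5.0f} yr) = {1 - math.exp(-t / mu):.2e}")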

  6. Burst wait time simulation of CALIBAN reactor at delayed super-critical state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbert, P.; Authier, N.; Richard, B.

    2012-07-01

    In the past, the super-prompt-critical wait time probability distribution was measured on the CALIBAN fast burst reactor [4]. Afterwards, these experiments were simulated with very good agreement by solving the non-extinction probability equation [5]. Recently, the burst wait time probability distribution has been measured at CEA-Valduc on CALIBAN at different delayed super-critical states [6]. However, in the delayed super-critical case the non-extinction probability does not give access to the wait time distribution. In this case it is necessary to compute the time-dependent evolution of the full neutron count number probability distribution. In this paper we present the deterministic point-model method used to calculate the probability distribution of the wait time before a prescribed count level, taking into account prompt neutrons and delayed neutron precursors. This method is based on the solution of the time-dependent adjoint Kolmogorov master equations for the number of detections, using the generating function methodology [8,9,10] and inverse discrete Fourier transforms. The results obtained are then compared to the measurements and to Monte-Carlo calculations based on the algorithm presented in [7].

  7. Steady state, relaxation and first-passage properties of a run-and-tumble particle in one-dimension

    NASA Astrophysics Data System (ADS)

    Malakar, Kanaya; Jemseena, V.; Kundu, Anupam; Vijay Kumar, K.; Sabhapandit, Sanjib; Majumdar, Satya N.; Redner, S.; Dhar, Abhishek

    2018-04-01

    We investigate the motion of a run-and-tumble particle (RTP) in one dimension. We find the exact probability distribution of the particle with and without diffusion on the infinite line, as well as in a finite interval. In the infinite domain, this probability distribution approaches a Gaussian form in the long-time limit, as in the case of a regular Brownian particle. At intermediate times, this distribution exhibits unexpected multi-modal forms. In a finite domain, the probability distribution reaches a steady-state form with peaks at the boundaries, in contrast to a Brownian particle. We also study the relaxation to the steady state analytically. Finally, we compute the survival probability of the RTP in a semi-infinite domain with an absorbing boundary condition at the origin. In the finite interval, we compute the exit probability and the associated exit times. We provide numerical verification of our analytical results.
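
    The basic dynamics is easy to reproduce numerically. A minimal sketch of a 1D run-and-tumble particle without diffusion (velocity ±v, tumbling at rate γ); the parameter values are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
v, gamma, dt, T, N = 1.0, 1.0, 1e-3, 5.0, 100_000   # N independent particles

x = np.zeros(N)
sigma = rng.choice([-1.0, 1.0], size=N)             # initial velocity direction
for _ in range(int(T / dt)):
    x += v * sigma * dt
    flip = rng.random(N) < gamma * dt               # tumble events
    sigma[flip] *= -1.0

# At long times the histogram of x approaches a Gaussian whose variance
# approaches v**2 * T / gamma for T >> 1/gamma; at short times the
# distribution is strongly non-Gaussian, peaked near the fronts x = ±v*T.
print(x.mean(), x.var(), v**2 * T / gamma)
```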

  8. Efficient marginalization to compute protein posterior probabilities from shotgun mass spectrometry data

    PubMed Central

    Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford

    2010-01-01

    The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or estimated the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337

  9. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax) and for an NP-hard problem (MAX-SAT). We also discuss a connection between these results and runtime analysis.
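
    For Onemax the distribution can be written down directly: if the parent has k ones among n bits and each bit flips independently with probability p, the child's fitness is k − X + Y with X ~ Bin(k, p) ones lost and Y ~ Bin(n−k, p) ones gained. A small sketch computing this exactly (a direct convolution, not the Krawtchouk-polynomial machinery of the paper):

```python
import numpy as np
from scipy.stats import binom

def onemax_fitness_pmf(n, k, p):
    """Exact fitness distribution of the child of a parent with k ones
    among n bits under uniform bit-flip mutation with flip probability p."""
    pmf = np.zeros(n + 1)
    for x in range(k + 1):              # ones flipped to zeros
        for y in range(n - k + 1):      # zeros flipped to ones
            pmf[k - x + y] += binom.pmf(x, k, p) * binom.pmf(y, n - k, p)
    return pmf                          # each entry is a polynomial in p

pmf = onemax_fitness_pmf(n=10, k=7, p=0.1)
print(pmf.sum(), pmf.argmax())          # 1.0, most likely child fitness
```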

  10. Recovery of Graded Response Model Parameters: A Comparison of Marginal Maximum Likelihood and Markov Chain Monte Carlo Estimation

    ERIC Educational Resources Information Center

    Kieftenbeld, Vincent; Natesan, Prathiba

    2012-01-01

    Markov chain Monte Carlo (MCMC) methods enable a fully Bayesian approach to parameter estimation of item response models. In this simulation study, the authors compared the recovery of graded response model parameters using marginal maximum likelihood (MML) and Gibbs sampling (MCMC) under various latent trait distributions, test lengths, and…

  11. An Efficient MCMC Algorithm to Sample Binary Matrices with Fixed Marginals

    ERIC Educational Resources Information Center

    Verhelst, Norman D.

    2008-01-01

    Uniform sampling of binary matrices with fixed margins is known as a difficult problem. Two classes of algorithms to sample from a distribution not too different from the uniform are studied in the literature: importance sampling and Markov chain Monte Carlo (MCMC). Existing MCMC algorithms converge slowly, require a long burn-in period and yield…

  12. Estimating the Parameters of the Beta-Binomial Distribution.

    ERIC Educational Resources Information Center

    Wilcox, Rand R.

    1979-01-01

    For some situations the beta-binomial distribution might be used to describe the marginal distribution of test scores for a particular population of examinees. Several different methods of approximating the maximum likelihood estimate were investigated, and it was found that the Newton-Raphson method should be used when it yields admissible…
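
    In modern practice the maximum likelihood estimate can also be obtained with a general-purpose optimizer. A hedged sketch using SciPy's beta-binomial; this is a generic numerical fit, not the Newton-Raphson scheme studied in the paper:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

n = 20                                     # items per test
rng = np.random.default_rng(1)
scores = betabinom.rvs(n, 2.0, 5.0, size=500, random_state=rng)

def nll(log_params):
    a, b = np.exp(log_params)              # keep a, b > 0
    return -betabinom.logpmf(scores, n, a, b).sum()

res = minimize(nll, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
print(np.exp(res.x))                       # estimates of (a, b)
```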

  13. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    ERIC Educational Resources Information Center

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  14. Numeric stratigraphic modeling: Testing sequence stratigraphic concepts using high resolution geologic examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armentrout, J.M.; Smith-Rouch, L.S.; Bowman, S.A.

    1996-08-01

    Numeric simulations based on integrated data sets enhance our understanding of depositional geometry and facilitate quantification of depositional processes. Numeric values tested against well-constrained geologic data sets can then be used in iterations testing each variable, and in predicting lithofacies distributions under various depositional scenarios using the principles of sequence stratigraphic analysis. The stratigraphic modeling software provides a broad spectrum of techniques for modeling and testing elements of the petroleum system. Using well-constrained geologic examples, variations in depositional geometry and lithofacies distributions between different tectonic settings (passive vs. active margin) and climate regimes (hothouse vs. icehouse) can provide insight into potential source rock and reservoir rock distribution, maturation timing, migration pathways, and trap formation. Two data sets are used to illustrate such variations: both include a seismic reflection profile calibrated by multiple wells. The first is a Pennsylvanian mixed carbonate-siliciclastic system in the Paradox basin, and the second a Pliocene-Pleistocene siliciclastic system in the Gulf of Mexico. Numeric simulations result in geometry and facies distributions consistent with those interpreted using the integrated stratigraphic analysis of the calibrated seismic profiles. An exception occurs in the Gulf of Mexico study, where the simulated sediment thickness from 3.8 to 1.6 Ma within an upper slope minibasin was less than that mapped using a regional seismic grid. Regional depositional patterns demonstrate that this extra thickness was probably sourced from out of the plane of the modeled transect, illustrating the necessity for three-dimensional constraints on two-dimensional modeling.

  15. A Genealogical Look at Shared Ancestry on the X Chromosome.

    PubMed

    Buffalo, Vince; Mount, Stephen M; Coop, Graham

    2016-09-01

    Close relatives can share large segments of their genome identical by descent (IBD) that can be identified in genome-wide polymorphism data sets. There are a range of methods to use these IBD segments to identify relatives and estimate their relationship. These methods have focused on sharing on the autosomes, as they provide a rich source of information about genealogical relationships. We hope to learn additional information about recent ancestry through shared IBD segments on the X chromosome, but currently lack the theoretical framework to use this information fully. Here, we fill this gap by developing probability distributions for the number and length of X chromosome segments shared IBD between an individual and an ancestor k generations back, as well as between half- and full-cousin relationships. Due to the inheritance pattern of the X and the fact that X homologous recombination occurs only in females (outside of the pseudoautosomal regions), the number of females along a genealogical lineage is a key quantity for understanding the number and length of the IBD segments shared among relatives. When inferring relationships among individuals, the number of female ancestors along a genealogical lineage will often be unknown. Therefore, our IBD segment length and number distributions marginalize over this unknown number of recombinational meioses through a distribution of recombinational meioses we derive. By using Bayes' theorem to invert these distributions, we can estimate the number of female ancestors between two relatives, giving us details about the genealogical relations between individuals not possible with autosomal data alone. Copyright © 2016 by the Genetics Society of America.

  16. Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2014-07-01

    The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure, using a method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Previously, haplotype frequency estimation using the discrete Laplace method has been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis, further validating the method. A practically important point is that the calculations can be performed on an ordinary computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes, similar to the results of previous studies. We also compared pairwise distances (between geographically separated samples) with those obtained using the AMOVA method and found good agreement. Further analyses that are impossible with AMOVA were made using the discrete Laplace method: analysis of homogeneity in two different ways and calculation of marginal STR distributions. We found that the Y-STR haplotypes from e.g. Finland were relatively homogeneous, as opposed to the relatively heterogeneous Y-STR haplotypes from e.g. Lublin, Eastern Poland and Berlin, Germany. We demonstrated that the observed distributions of alleles at each locus were similar to the expected ones. We also compared pairwise distances between geographically separated samples from Africa with those obtained using the AMOVA method and found good agreement. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Heterogeneous rupture in the great Cascadia earthquake of 1700 inferred from coastal subsidence estimates

    USGS Publications Warehouse

    Wang, Pei-Ling; Engelhart, Simon E.; Wang, Kelin; Hawkes, Andrea D.; Horton, Benjamin P.; Nelson, Alan R.; Witter, Robert C.

    2013-01-01

    Past earthquake rupture models used to explain paleoseismic estimates of coastal subsidence during the great A.D. 1700 Cascadia earthquake have assumed a uniform slip distribution along the megathrust. Here we infer heterogeneous slip for the Cascadia margin in A.D. 1700 that is analogous to slip distributions during instrumentally recorded great subduction earthquakes worldwide. The assumption of uniform distribution in previous rupture models was due partly to the large uncertainties of then available paleoseismic data used to constrain the models. In this work, we use more precise estimates of subsidence in 1700 from detailed tidal microfossil studies. We develop a 3-D elastic dislocation model that allows the slip to vary both along strike and in the dip direction. Despite uncertainties in the updip and downdip slip extensions, the more precise subsidence estimates are best explained by a model with along-strike slip heterogeneity, with multiple patches of high-moment release separated by areas of low-moment release. For example, in A.D. 1700, there was very little slip near Alsea Bay, Oregon (~44.4°N), an area that coincides with a segment boundary previously suggested on the basis of gravity anomalies. A probable subducting seamount in this area may be responsible for impeding rupture during great earthquakes. Our results highlight the need for more precise, high-quality estimates of subsidence or uplift during prehistoric earthquakes from the coasts of southern British Columbia, northern Washington (north of 47°N), southernmost Oregon, and northern California (south of 43°N), where slip distributions of prehistoric earthquakes are poorly constrained.

  18. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty EEGs recorded in neurologically normal infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. A Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA and, therefore, mode was considered a useful index for PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating the stationary nature of developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
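
    The envelope-extraction and fitting pipeline is straightforward to sketch with SciPy; the synthetic signal below stands in for a real EEG channel, and the fits mirror the lognormal/gamma comparison described above:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import lognorm, gamma

rng = np.random.default_rng(0)
eeg = rng.standard_normal(10_000)            # placeholder for an EEG channel

envelope = np.abs(hilbert(eeg))              # instantaneous amplitude

# Fit the candidate distributions to the envelope samples.
shape, loc, scale = lognorm.fit(envelope, floc=0)
a, loc_g, scale_g = gamma.fit(envelope, floc=0)
print("lognormal shape:", shape, "scale:", scale)
print("gamma shape:", a, "scale:", scale_g)
```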

  19. Marginal Structural Models for Case-Cohort Study Designs to Estimate the Association of Antiretroviral Therapy Initiation With Incident AIDS or Death

    PubMed Central

    Cole, Stephen R.; Hudgens, Michael G.; Tien, Phyllis C.; Anastos, Kathryn; Kingsley, Lawrence; Chmiel, Joan S.; Jacobson, Lisa P.

    2012-01-01

    To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding. PMID:22302074

  20. Establishment of HPC(R2A) for regrowth control in non-chlorinated distribution systems.

    PubMed

    Uhl, Wolfgang; Schaule, Gabriela

    2004-05-01

    Drinking water distributed without disinfection and without regrowth problems for many years may show bacterial regrowth when the residence time and/or temperature in the distribution system increases or when substrate and/or bacterial concentration in the treated water increases. An example of a regrowth event in a major German city is discussed. Regrowth of HPC bacteria occurred unexpectedly at the end of a very hot summer. No pathogenic or potentially pathogenic bacteria were identified. Increased residence times in the distribution system and temperatures up to 25 degrees C were identified as most probable causes and the regrowth event was successfully overcome by changing flow regimes and decreasing residence times. Standard plate counts of HPC bacteria using the spread plate technique on nutrient rich agar according to German Drinking Water Regulations (GDWR) had proven to be a very good indicator of hygienically safe drinking water and to demonstrate the effectiveness of water treatment. However, the method proved insensitive for early regrowth detection. Regrowth experiments in the lab and sampling of the distribution system during two summers showed that spread plate counts on nutrient-poor R2A agar after 7-day incubation yielded 100 to 200 times higher counts. Counts on R2A after 3-day incubation were three times less than after 7 days. As the precision of plate count methods is very poor for counts less than 10 cfu/plate, a method yielding higher counts is better suited to detect upcoming regrowth than a method yielding low counts. It is shown that for the identification of regrowth events HPC(R2A) gives a further margin of about 2 weeks for reaction before HPC(GDWR). Copyright 2003 Elsevier B.V.

  1. Stability evaluation of short-circuiting gas metal arc welding based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Wang, Kehong; Zhou, Zhilan; Zhou, Xiaoxiao; Fang, Jimi

    2017-03-01

    The arc of gas metal arc welding (GMAW) contains abundant information about its stability and droplet transition, which can be effectively characterized by extracting the arc electrical signals. In this study, ensemble empirical mode decomposition (EEMD) was used to evaluate the stability of electrical current signals. The welding electrical signals were first decomposed by EEMD, and then transformed to a Hilbert-Huang spectrum and a marginal spectrum. The marginal spectrum is an approximate distribution of amplitude with frequency of signals, and can be described by a marginal index. Analysis of various welding process parameters showed that the marginal index of current signals increased when the welding process was more stable, and vice versa. Thus EEMD combined with the marginal index can effectively uncover the stability and droplet transition of GMAW.
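
    A rough sketch of the decomposition-to-marginal-spectrum pipeline, assuming the third-party PyEMD package (pip install EMD-signal) for the EEMD step; the final scalar summary is a crude stand-in for the paper's marginal index, whose exact definition is not reproduced here:

```python
import numpy as np
from PyEMD import EEMD               # assumed third-party package
from scipy.signal import hilbert

fs = 2_000.0                         # sampling rate, Hz (illustrative)
t = np.arange(0, 0.5, 1.0 / fs)
rng = np.random.default_rng(0)
current = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.standard_normal(t.size)

imfs = EEMD().eemd(current, t)       # ensemble empirical mode decomposition

# Hilbert-Huang step: per-IMF instantaneous amplitude and frequency,
# accumulated into a marginal (amplitude vs. frequency) spectrum.
freq_bins = np.linspace(0.0, fs / 2, 200)
marginal = np.zeros(freq_bins.size - 1)
for imf in imfs:
    analytic = hilbert(imf)
    amp = np.abs(analytic)
    inst_freq = np.gradient(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    hist, _ = np.histogram(inst_freq, bins=freq_bins, weights=amp)
    marginal += hist

marginal_index = marginal.sum() / t.size   # crude scalar summary (assumption)
print(marginal_index)
```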

  2. A corrected formulation for marginal inference derived from two-part mixed models for longitudinal semi-continuous data.

    PubMed

    Tom, Brian Dm; Su, Li; Farewell, Vernon T

    2016-10-01

    For semi-continuous data which are a mixture of true zeros and continuously distributed positive values, the use of two-part mixed models provides a convenient modelling framework. However, deriving population-averaged (marginal) effects from such models is not always straightforward. Su et al. presented a model that provided convenient estimation of marginal effects for the logistic component of the two-part model but the specification of marginal effects for the continuous part of the model presented in that paper was based on an incorrect formulation. We present a corrected formulation and additionally explore the use of the two-part model for inferences on the overall marginal mean, which may be of more practical relevance in our application and more generally. © The Author(s) 2013.

  3. A Method for the Selection of Exploration Areas for Unconformity Uranium Deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, DeVerle P.; Zaluski, Gerard; Marlatt, James

    2009-06-15

    The method we propose employs two analyses: (1) exploration simulation and risk valuation and (2) portfolio optimization. The first analysis, implemented by the investment worth system (IWS), uses Monte Carlo simulation to integrate a wide spectrum of uncertain and varied components into a relative frequency histogram for net present value of the exploration investment, which is converted to a risk-adjusted value (RAV). Iterative rerunning of the IWS enables the mapping of the relationship of RAV to magnitude of exploration expenditure, X. The second major analysis uses RAV vs. X maps to identify that subset (portfolio) of areas that maximizes the RAV of the firm's multiyear exploration budget. The IWS, which is demonstrated numerically, consists of six components based on the geologic description of a hypothetical basin and project area (PA) and a mix of hypothetical and actual conditions of an unidentified area. The geology is quantified and processed by Bayesian belief networks to produce the geology-based inputs required by the IWS. An exploration investment of $60 M produced a highly skewed distribution of net present value (NPV), having mean and median values of $4,160 M and $139 M, respectively. For the hypothetical mining firm Minex, the RAV of the exploration investment of $60 M is only $110.7 M. An RAV that is less than 3% of mean NPV reflects the aversion of Minex to risk, as well as the magnitude of risk implicit in the highly skewed NPV distribution and the probability of 0.45 for capital loss. Potential benefits of initiating exploration of a portfolio of areas, as contrasted with one area, include increased marginal productivity of exploration as well as reduced probability of nondiscovery. For an exogenously determined multiyear exploration budget, a conceptual framework for portfolio optimization is developed based on marginal RAV exploration products for candidate PAs. PORTFOLIO, software developed to implement the optimization, allocates exploration to PAs so that the RAV of the exploration budget is maximized. Moreover, PORTFOLIO provides a means to examine the impact of the magnitude of the budget on the composition of the exploration portfolio and the optimum allocation of exploration to the PAs that comprise the portfolio. Using fictitious data for five PAs, a numerical demonstration is provided of the use of PORTFOLIO to identify those PAs that comprise the optimum exploration portfolio and to optimally allocate the multiyear budget across portfolio PAs.
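
    One common way to convert a simulated NPV histogram into a risk-adjusted value is an exponential-utility certainty equivalent, RAV = −(1/γ)·ln E[exp(−γ·NPV)]. Whether this matches the exact RAV definition used in the IWS is an assumption, and all numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
# Skewed NPV draws ($M): frequent capital loss, rare large wins
# (a stand-in for the IWS Monte Carlo output, not its actual model).
npv = np.where(rng.random(100_000) < 0.45,
               -60.0,
               rng.lognormal(mean=5.0, sigma=1.5, size=100_000))

gamma = 0.01                      # risk-aversion coefficient, 1/$M (assumed)
rav = -np.log(np.mean(np.exp(-gamma * npv))) / gamma
print(npv.mean(), np.median(npv), rav)   # RAV sits far below the mean NPV
```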

  4. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capabilities of breaking this degeneracy by weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
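
    The central trick — transform a skewed parameter to near-Gaussianity, then do Gaussian (Fisher-matrix-style) statistics in the transformed space — is easy to demonstrate on synthetic samples:

```python
import numpy as np
from scipy.stats import boxcox, skew

rng = np.random.default_rng(2)
samples = rng.lognormal(mean=0.0, sigma=0.8, size=5_000)  # skewed "posterior"

transformed, lam = boxcox(samples)   # lambda chosen by maximum likelihood
print("skewness before:", skew(samples), "after:", skew(transformed))

# In the transformed space a mean and (co)variance characterize the
# distribution, as a Fisher-matrix analysis assumes.
mu, var = transformed.mean(), transformed.var()
```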

  5. Leptospirosis risk around a potential source of infection

    NASA Astrophysics Data System (ADS)

    Loaiza-Echeverry, Erica; Hincapié-Palacio, Doracelly; Ochoa Acosta, Jesús; Ospina Giraldo, Juan

    2015-05-01

    Leptospirosis is a bacterial zoonosis with worldwide distribution and a multiform clinical spectrum in humans and animals. The etiologic agents of this disease are the pathogenic species of Leptospira, which cause manifestations ranging from mild to serious, such as Weil's disease and the pulmonary hemorrhagic syndrome, with lethality of 10%-50%. This is an emerging urban health problem due to the growth of marginal neighborhoods without basic sanitary conditions and with increased numbers of rodents. The presence of rodents and the probability of contact with their urine determine the likelihood of humans becoming infected. In this paper, we simulate the spatial distribution of the risk of human leptospirosis infection according to proximity to rodent burrows, considered as potential sources of infection. The Bessel function K0 of the distance r from the potential point source, with the scale parameter α in meters, was used. Simulation inputs were published data on leptospirosis incidence rates (range of 5 to 79 per 10,000) and distances of 100 to 5000 meters from the source of infection. We obtained an adequate fit between the function and the simulated data. The risk of infection increases with proximity to the potential source. This estimate can guide the proposal of effective measures of control and prevention.
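
    A minimal sketch of the distance-risk kernel described above — relative risk proportional to the modified Bessel function K0(r/α) — with an illustrative value for α:

```python
import numpy as np
from scipy.special import k0

alpha = 500.0                       # scale parameter in meters (assumed)
r = np.linspace(100, 5000, 50)      # distance from the rodent burrow, m

risk = k0(r / alpha)
risk /= risk.max()                  # normalize to relative risk
print(risk[:5])                     # risk decays with distance from the source
```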

  6. Thermo-mechanical Design Methodology for ITER Cryodistribution cold boxes

    NASA Astrophysics Data System (ADS)

    Shukla, Vinit; Patel, Pratik; Das, Jotirmoy; Vaghela, Hitensinh; Bhattacharya, Ritendra; Shah, Nitin; Choukekar, Ketan; Chang, Hyun-Sik; Sarkar, Biswanath

    2017-04-01

    The ITER cryo-distribution (CD) system is responsible for the proper distribution of the cryogen at the required mass flow rate, pressure, and temperature levels to the users, namely the superconducting (SC) magnets and cryopumps (CPs). The CD system can also use the magnet structures as a thermal buffer in order to operate the cryo-plant as close as possible to steady-state conditions. A typical CD cold box is equipped mainly with a liquid helium (LHe) bath, heat exchangers (HXs), cryogenic valves, a filter, heaters, a cold circulator, a cold compressor, and process piping. The various load combinations that are likely to occur during the life cycle of the CD cold boxes are imposed on a representative model, and their impacts on the system are analyzed. This study shows that a break of insulation vacuum during nominal operation (NO) together with a seismic event (Seismic Level-2) is the most stringent load combination, with a maximum stress of 224 MPa. However, the NO+SMHV (Séismes Maximaux Historiquement Vraisemblables = Maximum Historically Probable Earthquakes) load combination has the least safety margin and will form the basis of the design of the CD system and its subcomponents. This paper presents and compares the results of the different load combinations likely to occur on a typical CD cold box.

  7. A Bayesian Hierarchical Model for Glacial Dynamics Based on the Shallow Ice Approximation and its Evaluation Using Analytical Solutions

    NASA Astrophysics Data System (ADS)

    Gopalan, Giri; Hrafnkelsson, Birgir; Aðalgeirsdóttir, Guðfinna; Jarosch, Alexander H.; Pálsson, Finnur

    2018-03-01

    Bayesian hierarchical modeling can assist the study of glacial dynamics and ice flow properties. This approach allows glaciologists to make fully probabilistic predictions for the thickness of a glacier at unobserved spatio-temporal coordinates, and it also allows for the derivation of posterior probability distributions for key physical parameters such as ice viscosity and basal sliding. The goal of this paper is to develop a proof of concept for a Bayesian hierarchical model that uses exact analytical solutions of the shallow ice approximation (SIA) introduced by Bueler et al. (2005). A suite of test simulations utilizing these exact solutions suggests that this approach is able to adequately model numerical errors and produce useful physical parameter posterior distributions and predictions. A byproduct of the development of the Bayesian hierarchical model is the derivation of a novel finite difference method for solving the SIA partial differential equation (PDE). An additional novelty of this work is the correction, via a statistical model, of numerical errors induced by the numerical solution. This error-correcting process models numerical errors that accumulate forward in time, as well as the spatial variation of numerical errors between the dome, interior, and margin of a glacier.

  8. Flood Frequency Curves - Use of information on the likelihood of extreme floods

    NASA Astrophysics Data System (ADS)

    Faber, B.

    2011-12-01

    Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How does the watershed or climate changing over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoiding these assumptions vary from estimating trend and shift and removing them from early data (thus forming a homogeneous data set) to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically based analysis produces a "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
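
    A minimal sketch of the standard fitting step described above — a log-Pearson Type III fit by the method of moments on log-transformed annual peaks — using synthetic flows (the full Bulletin 17 procedure adds skew weighting and other refinements not shown here):

```python
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(3)
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=80)   # annual maxima (synthetic)

logq = np.log10(peaks)
g = ((logq - logq.mean())**3).mean() / logq.std()**3  # sample skew of log flows

# 100-year flood: the 0.99 quantile of the fitted distribution.
q100 = 10 ** pearson3.ppf(0.99, skew=g, loc=logq.mean(), scale=logq.std())
print(q100)
```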

  9. Exploring Sedimentary Basins with High Frequency Receiver Function: the Dublin Basin Case Study

    NASA Astrophysics Data System (ADS)

    Licciardi, A.; Piana Agostinetti, N.

    2015-12-01

    The Receiver Function (RF) method is a widely applied seismological tool for imaging crustal and lithospheric structures beneath a single seismic station, with one to tens of kilometers of vertical resolution. However, detailed information about the upper crust (0-10 km depth) can also be retrieved by increasing the frequency content of the analyzed RF data-set (achieving a vertical resolution finer than 0.5 km). This information includes the depth of velocity contrasts and the S-wave velocities within layers, as well as the presence and location of seismic anisotropy or dipping interfaces (e.g., induced by faulting) at depth. These observables provide valuable constraints on the structural settings and properties of sedimentary basins for both scientific and industrial applications. To test the RF capabilities for this high-resolution application, six broadband seismic stations have been deployed across the southwestern margin of the Dublin Basin (DB), Ireland, whose geothermal potential has been investigated in the last few years. With an inter-station distance of about 1 km, this closely spaced array has been designed to provide a clear picture of the structural transition between the margin and the inner portion of the basin. In this study, a Bayesian approach is used to retrieve the posterior probability distributions of S-wave velocity at depth beneath each seismic station. A multi-frequency RF data-set is analyzed, and RFs and curves of apparent velocity are jointly inverted to better constrain absolute velocity variations. A pseudo-2D section is built to observe the lateral changes in elastic properties across the margin of the basin, with a focus on the shallow portion of the crust. Moreover, by means of the harmonic decomposition technique, the azimuthal variations in the RF data-set are isolated and interpreted in terms of anisotropy and dipping interfaces associated with the major fault system in the area. These results are compared with the available information from previous active seismic surveys in the area, including borehole data.

  10. Improved Determination of the Myelin Water Fraction in Human Brain using Magnetic Resonance Imaging through Bayesian Analysis of mcDESPOT

    PubMed Central

    Bouhrara, Mustapha; Spencer, Richard G.

    2015-01-01

    Myelin water fraction (MWF) mapping with magnetic resonance imaging has led to the ability to directly observe myelination and demyelination in both the developing brain and in disease. Multicomponent driven equilibrium single pulse observation of T1 and T2 (mcDESPOT) has been proposed as a rapid approach for multicomponent relaxometry and has been applied to map MWF in human brain. However, even for the simplest two-pool signal model consisting of MWF and non-myelin-associated water, the dimensionality of the parameter space for obtaining MWF estimates remains high. This renders parameter estimation difficult, especially at low-to-moderate signal-to-noise ratios (SNR), due to the presence of local minima and the flatness of the fit residual energy surface used for parameter determination by conventional nonlinear least squares (NLLS)-based algorithms. In this study, we introduce three Bayesian approaches for analysis of the mcDESPOT signal model to determine MWF. Given the high-dimensional nature of the mcDESPOT signal model, and thereby the high-dimensional marginalizations over nuisance parameters needed to derive the posterior probability distribution of the MWF parameter, the introduced Bayesian analyses use different approaches to reduce the dimensionality of the parameter space. The first approach uses normalization by average signal amplitude and assumes that noise can be accurately estimated from signal-free regions of the image. The second approach likewise uses average amplitude normalization, but incorporates a full treatment of noise as an unknown variable through marginalization. The third approach does not use amplitude normalization and incorporates marginalization over both noise and signal amplitude. Through extensive Monte Carlo numerical simulations and analysis of in-vivo human brain datasets exhibiting a range of SNR and spatial resolution, we demonstrate markedly improved accuracy and precision in the estimation of MWF using these Bayesian methods as compared to the stochastic region contraction (SRC) implementation of NLLS. PMID:26499810

  11. The McDermitt Caldera, NV-OR, USA: Geologic mapping, volcanology, mineralization, and high precision 40Ar/39Ar dating of early Yellowstone hotspot magmatism

    NASA Astrophysics Data System (ADS)

    Henry, C. D.; Castor, S. B.; Starkel, W. A.; Ellis, B. S.; Wolff, J. A.; Heizler, M. T.; McIntosh, W. C.

    2012-12-01

    The irregularly keyhole-shaped, 40×30 to 22 km, McDermitt caldera formed at 16.35±0.03 Ma (n=4; Fish Canyon sanidine = 28.201 Ma) during eruption of a zoned, aphyric, mildly peralkaline rhyolite to abundantly anorthoclase-phyric, metaluminous dacite (McDermitt Tuff, MDT). Intracaldera MDT is locally strongly rheomorphic and, where MDT and the caldera floor are well exposed along the western margin, contains abundant megabreccia but is a maximum of ~450 m thick. If this thickness is representative of the caldera, intracaldera MDT has a volume of ~400 km³. Outflow MDT is currently known up to 13 km south of the caldera but only 3 km north of the caldera. Maximum outflow thickness is ~100 m, and outflow volume is probably no more than about 10% that of intracaldera MDT. The thickness and volume relations indicate collapse began very early during eruption, and most tuff ponded within the caldera. Outflow is strongly rheomorphic where draped over paleotopography. Late, undated icelandite lavas and domes are probably residual magma from the caldera chamber. Resurgence is expressed as both a broad, symmetrical dome in the north part and a fault-bound uplift in the south part of the caldera. Mineralization associated with the caldera includes Zr-rich U deposits that are indistinguishable in age from the McDermitt Tuff, as well as Hg, Au, Ga, and Li-rich intracaldera tuffaceous sediments. Although formed during probable regional extension, the caldera is flat-lying and cut only at its west and east margins by much younger, high-angle normal faults. The caldera formed in an area of highly diverse Cenozoic volcanic rocks. The oldest are 39 and 46 Ma metaluminous dacite lavas along the northwest margin. Coarsely plagioclase-phyric to aphyric Steens Basalt lavas crop out around the west, northwest, and northeast margins. An anorthoclase-phyric, low-Si rhyolite lava (16.69±0.02 Ma) that is interbedded with probable Steens lavas northeast of the caldera and a biotite rhyolite lava dome (16.62±0.02 Ma) in the west floor of the caldera are the oldest middle Miocene silicic rocks near the caldera. Other pre-caldera rocks are a mix of variably peralkaline, distal ignimbrites; biotite rhyolite domes and lavas; and variably peralkaline rhyolite lavas that were emplaced between about 16.50 and 16.36 Ma. Silicic volcanism around the McDermitt caldera is some of the oldest of the Yellowstone hotspot track, but two known calderas in NW Nevada and unidentified sources of distal ignimbrites near McDermitt are older than the McDermitt caldera. Initial hotspot silicic volcanism occurred over a large area across NW Nevada, SE Oregon, and SW Idaho.

  12. Stylized facts in internal rates of return on stock index and its derivative transactions

    NASA Astrophysics Data System (ADS)

    Pichl, Lukáš; Kaizoji, Taisei; Yamano, Takuya

    2007-08-01

    Universal features in stock markets and their derivative markets are studied by means of probability distributions of internal rates of return on buy-and-sell transaction pairs. Unlike the stylized facts in normalized log returns, the probability distributions for such single-asset encounters incorporate the time factor by means of the internal rate of return, defined as the continuous compound interest. The resulting stylized facts are shown in the probability distributions derived from the daily series of TOPIX, S&P 500, and FTSE 100 index close values. The application of the above analysis to minute-tick data of the NIKKEI 225 and its futures market, respectively, reveals an interesting difference in the behavior of the two probability distributions when a threshold on the minimal duration of the long position is imposed. It is therefore suggested that the probability distributions of the internal rates of return could be used for causality mining between the underlying and derivative stock markets. The highly specific discrete spectrum, which results from noise trader strategies, as opposed to the smooth distributions observed for fundamentalist strategies in single-encounter transactions, may be useful in deducing the type of investment strategy from the trading revenues of small portfolio investors.
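
    The paper's basic quantity is easy to tabulate: for a buy at price P_b and a sell at price P_s separated by time Δt, the continuously compounded internal rate of return is r = ln(P_s/P_b)/Δt. A sketch over a synthetic price series, with a fixed holding period standing in for the paper's enumeration of buy-sell pairs:

```python
import numpy as np

rng = np.random.default_rng(4)
prices = 100 * np.exp(np.cumsum(0.001 * rng.standard_normal(2_000)))

holding = 5   # holding period in ticks (illustrative)
rates = np.log(prices[holding:] / prices[:-holding]) / holding

hist, edges = np.histogram(rates, bins=50, density=True)
print(rates.mean(), rates.std())   # empirical distribution of the IRR
```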

  13. Probabilistic Reasoning for Robustness in Automated Planning

    NASA Technical Reports Server (NTRS)

    Schaffer, Steven; Clement, Bradley; Chien, Steve

    2007-01-01

    A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as the times taken to perform tasks and the amounts of resources to be consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and the resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain the probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive the search toward planning low-risk actions. An output plan provides a balance between the user's specified risk aversion and other measures of optimality.
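
    The combine-and-integrate step has a simple closed form when an (unbounded) Gaussian approximation is used: independent resource uses add in mean and variance, and the conflict probability is the upper tail beyond the capacity. A sketch with illustrative numbers:

```python
import numpy as np
from scipy.stats import norm

uses_mean = np.array([10.0, 7.5, 4.0])    # expected consumption per action
uses_std = np.array([1.0, 2.0, 0.5])      # 1-sigma uncertainty per action

total_mean = uses_mean.sum()
total_std = np.sqrt((uses_std**2).sum())  # independence assumed

capacity = 25.0
p_conflict = 1.0 - norm.cdf(capacity, loc=total_mean, scale=total_std)
print(p_conflict)   # used to score and rank candidate plans
```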

  14. Survival Predictions of Ceramic Crowns Using Statistical Fracture Mechanics

    PubMed Central

    Nasrin, S.; Katsube, N.; Seghi, R.R.; Rokhlin, S.I.

    2017-01-01

    This work establishes a survival probability methodology for interface-initiated fatigue failures of monolithic ceramic crowns under simulated masticatory loading. A complete 3-dimensional (3D) finite element analysis model of a minimally reduced molar crown was developed using commercially available hardware and software. Estimates of material surface flaw distributions and fatigue parameters for 3 reinforced glass-ceramics (fluormica [FM], leucite [LR], and lithium disilicate [LD]) and a dense sintered yttrium-stabilized zirconia (YZ) were obtained from the literature and incorporated into the model. Utilizing the proposed fracture mechanics–based model, crown survival probability as a function of loading cycles was obtained from simulations performed on the 4 ceramic materials utilizing identical crown geometries and loading conditions. The weaker ceramic materials (FM and LR) resulted in lower survival rates than the more recently developed higher-strength ceramic materials (LD and YZ). The simulated 10-y survival rate of crowns fabricated from YZ was only slightly better than those fabricated from LD. In addition, 2 of the model crown systems (FM and LD) were expanded to determine regional-dependent failure probabilities. This analysis predicted that the LD-based crowns were more likely to fail from fractures initiating from margin areas, whereas the FM-based crowns showed a slightly higher probability of failure from fractures initiating from the occlusal table below the contact areas. These 2 predicted fracture initiation locations have some agreement with reported fractographic analyses of failed crowns. In this model, we considered the maximum tensile stress tangential to the interfacial surface, as opposed to the more universally reported maximum principal stress, because it more directly impacts crack propagation. While the accuracy of these predictions needs to be experimentally verified, the model can provide a fundamental understanding of the importance that pre-existing flaws at the intaglio surface have on fatigue failures. PMID:28107637

  15. Seasonal evolution of the Arctic marginal ice zone and its power-law obeying floe size distribution

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Stern, H. L., III; Schweiger, A. J. B.; Steele, M.; Hwang, P. B.

    2017-12-01

    A thickness, floe size, and enthalpy distribution (TFED) sea ice model, implemented numerically into the Pan-arctic Ice-Ocean Modeling and Assimilation System (PIOMAS), is used to investigate the seasonal evolution of the Arctic marginal ice zone (MIZ) and its floe size distribution. The TFED sea ice model, by coupling the Zhang et al. [2015] sea ice floe size distribution (FSD) theory with the Thorndike et al. [1975] ice thickness distribution (ITD) theory, simulates 12-category FSD and ITD explicitly and jointly. A range of ice thickness and floe size observations were used for model calibration and validation. The model creates FSDs that generally obey a power law or upper truncated power law, as observed by satellites and aerial surveys. In this study, we will examine the role of ice fragmentation and lateral melting in altering FSDs in the Arctic MIZ. We will also investigate how changes in FSD impact the seasonal evolution of the MIZ by modifying the thermodynamic processes.
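
    A standard diagnostic for such power-law floe size distributions is the maximum-likelihood exponent above a cutoff, α̂ = 1 + n/Σ ln(x_i/x_min) for a continuous power law. A sketch on synthetic floe sizes; this is a generic estimator, not part of the TFED model itself:

```python
import numpy as np

rng = np.random.default_rng(5)
xmin, alpha_true = 50.0, 2.5                       # meters, target exponent
u = rng.random(5_000)
floes = xmin * (1 - u) ** (-1 / (alpha_true - 1))  # inverse-CDF sampling

alpha_hat = 1 + floes.size / np.log(floes / xmin).sum()
print(alpha_hat)   # close to alpha_true
```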

  16. Prospect for the development of salted egg agro industry: an analysis on marketing distribution aspect

    NASA Astrophysics Data System (ADS)

    Sumekar, W.; Al-Baarri, A. N.; Kurnianto, E.

    2018-01-01

    Marketing distribution is an important part of business development strategy in agroindustries. The aim of this research was to characterize the marketing (distribution pattern, margin, and marketing efficiency) of the salted egg agroindustries in Brebes Regency. A survey was conducted on 52 salted egg agroindustries holding active PIRT certificates; data were collected through interviews and observation. Descriptive analysis was used to determine the marketing distribution of salted eggs, and marketing efficiency was obtained by calculating the marketing margin and the farmer's share. The results show that the salted egg agroindustries implemented two marketing distribution patterns: a direct pattern (producer→consumer) and an indirect pattern (producer→retailer→consumer). The share of agroindustries applying the indirect pattern is 57.69%. Both the direct and indirect patterns were classified as efficient, with farmer's share values of 87.13% and 78.21%, respectively. Direct marketing can therefore be recommended.

  17. Mathematical Model to estimate the wind power using four-parameter Burr distribution

    NASA Astrophysics Data System (ADS)

    Liu, Sanming; Wang, Zhijie; Pan, Zhaoxu

    2018-03-01

    When the actual probability distribution of wind speed at a given site needs to be described, the four-parameter Burr distribution is more suitable than other commonly used distributions. This paper introduces its important properties and characteristics, discusses the application of the four-parameter Burr distribution to wind speed prediction, and derives an expression for the probability distribution of the output power of a wind turbine.
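
    One way to realize the deduced output-power quantity numerically is to integrate a turbine power curve against a fitted Burr density; SciPy's burr12 (two shape parameters plus location and scale) provides a four-parameter Burr XII form. All parameter values and the power curve below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import burr12
from scipy.integrate import quad

c, d, loc, scale = 4.0, 1.5, 0.0, 8.0     # assumed Burr XII parameters

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
    """Simplified power curve in MW: cubic ramp between cut-in and
    rated speed, constant to cut-out, zero elsewhere."""
    if v < v_in or v > v_out:
        return 0.0
    if v < v_rated:
        return p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)
    return p_rated

expected_power, _ = quad(
    lambda v: turbine_power(v) * burr12.pdf(v, c, d, loc=loc, scale=scale),
    0.0, 25.0, points=[3.0, 12.0])
print(expected_power)   # expected output power, MW
```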

  18. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the fractile-constrained maximum entropy distribution (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we also know it is continuous, and we work through full examples to illustrate the approach.
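
    The flat-per-interval shape noted above makes the FMED easy to construct: between consecutive assessed fractiles the maximum entropy density is uniform, with height equal to the probability increment divided by the interval width. A sketch with illustrative fractiles:

```python
import numpy as np

probs = np.array([0.0, 0.10, 0.50, 0.90, 1.0])   # cumulative probabilities
values = np.array([0.0, 2.0, 5.0, 9.0, 15.0])    # assessed fractiles (assumed)

heights = np.diff(probs) / np.diff(values)       # piecewise-constant pdf

def fmed_pdf(x):
    """Density of the fractile-constrained maximum entropy distribution."""
    i = np.searchsorted(values, x, side="right") - 1
    inside = (x >= values[0]) & (x < values[-1])
    return np.where(inside, heights[np.clip(i, 0, heights.size - 1)], 0.0)

print(fmed_pdf(np.array([1.0, 4.0, 12.0])))      # flat within each interval
```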

  19. Imaging the Buried Chicxulub Crater with Gravity Gradients and Cenotes

    NASA Astrophysics Data System (ADS)

    Hildebrand, A. R.; Pilkington, M.; Halpenny, J. F.; Ortiz-Aleman, C.; Chavez, R. E.; Urrutia-Fucugauchi, J.; Connors, M.; Graniel-Castro, E.; Camara-Zi, A.; Vasquez, J.

    1995-09-01

    Differing interpretations of the Bouguer gravity anomaly over the Chicxulub crater, Yucatan Peninsula, Mexico, have yielded diameter estimates of 170 to 320 km. Knowing the crater's size is necessary to quantify the lethal perturbations to the Cretaceous environment associated with its formation. The crater's size (and internal structure) is revealed by the horizontal gradient of the Bouguer gravity anomaly over the structure, and by mapping the karst features of the Yucatan region. To improve our resolution of the crater's gravity signature we collected additional gravity measurements primarily along radial profiles, but also to fill in previously unsurveyed areas. Horizontal gradient analysis of Bouguer gravity data objectively highlights the lateral density contrasts of the impact lithologies and suppresses regional anomalies which may obscure the gravity signature of the Chicxulub crater lithologies. This gradient technique yields a striking circular structure with at least 6 concentric gradient features between 25 and 85 km radius. These features are most distinct in the southwest probably because of denser sampling of the gravity field. Our detailed profiles detected an additional feature and steeper gradients (up to 5 mGal/km) than the original survey. We interpret the outer four gradient maxima to represent concentric faults in the crater's zone of slumping as is also revealed by seismic reflection data. The inner two probably represent the margin of the central uplift and the peak ring and or collapsed transient cavity. Radial gradients in the SW quadrant over the inferred ~40 km-diameter central uplift (4) may represent structural "puckering" as revealed at eroded terrestrial craters. Gradient features related to regional gravity highs and lows are visible outside the crater, but no concentric gradient features are apparent at distances > 90 km radius. The marginal gradient features may be modelled by slump faults as observed in large complex craters on the other terrestrial planets. A modeled fault of 1.5 km displacement (slightly slumped block exterior and impact breccia interior) reproduces the steepest gradient feature. This model is incompatible with models that place these gradient features inside the collapsed transient cavity. Locations of the karst features of the northern Yucatan region were digitized from 1:50,000 topographic maps, which show most but not all the water-filled sinkholes (locally known as cenotes). A prominent ring of cenotes is visible over the crater that is spatially correlated to the outer steep gravity gradient feature. The mapped cenotes constitute an unbiased sampling of the region's karst surface features of >50 m diameter. The gradient maximum and the cenote ring both meander with amplitudes of up to 2 km. The wiggles in the gradient feature and the cenote distribution probably correspond to the "scalloping" observed at the headwall of terraces in large complex craters. A second partial cenote ring exterior to the southwest side of the main ring corresponds to a less-prominent gravity gradient feature. No concentric structure is observable in the distribution of karst features at radii >90 km. The cenote ring is bounded by the outer peripheral steep gradient feature and must be related to it; the slump faults must have been reactivated sufficiently to create fracturing in the overlying and much younger sediment. Long term subsidence, as found at other terrestrial craters is a possible mechanism for the reactivation. 
Such long term subsidence may be caused by differential compaction or thermal relaxation. Elevations acquired during gravity surveys show that the cenote ring also corresponds to a topographic low along some of its length that probably reflects preferential erosion.

  20. Type Region of the Ione Formation (Eocene), Central California: Stratigraphy, Paleogeography, and Relation to Auriferous Gravels

    USGS Publications Warehouse

    Creely, Scott; Force, Eric R.

    2007-01-01

    The middle Eocene Ione Formation extends over 200 miles (320 km) along the western edge of the Sierra Nevada. Our study was concentrated in the type region, 30 miles (48 km) along strike. There a bedrock ridge forms the seaward western side of the Ione depositional tract, defining a subbasin margin. The eastern limit of the type Ione is locally defined by high-angle faults. Ione sediments were spread over Upper Mesozoic metamorphic and plutonic bedrock, fed by gold-bearing streams dissecting the western slope of the ancestral Sierra Nevada. By middle Eocene time, a tropical or subtropical climate prevailed, leading to deep chemical weathering (including laterization), and a distinctively mature mineral assemblage was fed to and generated within the Ione deposits. The Ione is noted for its abundant kaolinitic clay, some of it coarsely crystalline; the clay is present as both detrital grains and authigenic cement. Quartz is abundant, mostly as angular grains. Heavy mineral fractions are dominated by altered ilmenite and zircon. Distribution of feldspar is irregular, both stratigraphically and areally. Non-marine facies are most voluminous, and include conglomerates, especially at the base and along the eastern margins of the formation where they pass into Sierran auriferous gravels. Clays, grading into lignites, and gritty sands are also common facies. Both braided and meandering fluvial facies have been recognized. Shallow marine waters flooded the basin probably twice. Tongues of sediment exhibiting a variety of estuarine to marine indicators are underlain and overlain by fluvial deposits. Marine body fossils are found at only a few localities, but burrows identified as Ophiomorpha and cf. Thalassinoides are abundant in many places. Other clues to marginal marine deposition are the occurrence of glauconite in one bed, typical relations of lagoonal to beach (locally heavy-mineral-rich) lithofacies, closed-basin three-dimensional morphology of basinal facies, and high sulfur content of some marginal coals. The Ione has been said to be deltaic; however, the two transgressional-regressional cycles we propose imply that only the regressional parts were deltaic. At other times, much of the type Ione would better be termed an intertidal estuary. Because the lower marine sequence was deposited against a paleobasin margin on the west, deltaic morphology was constrained, but apparently progradation was from north to south despite drainage into the basin from the east. Relations to the south are unclear due to the Stockton arch. The eastern margin of the type-Ione basin, and to some extent even its marine facies, are poorly constrained. A surface on Sierran bedrock to the east may have been stripped of some Ione basinal facies, leaving only coeval entrenched fluvial channel deposits.
