Sample records for underlying probability distribution

  1. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    NASA Astrophysics Data System (ADS)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
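
    A minimal sketch of the summary statistics and the truncated lognormal comparison described above, using scipy on a synthetic stand-in record (the actual Ed time series are not reproduced here):

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for a near-surface Ed(t) record: heavy-tailed fluctuations around a mean irradiance.
    rng = np.random.default_rng(0)
    ed = stats.lognorm.rvs(s=0.6, scale=1.0, size=5000, random_state=rng)

    skew = stats.skew(ed)
    ex_kurt = stats.kurtosis(ed)            # Fisher definition: excess kurtosis
    print(f"skewness = {skew:.2f}, excess kurtosis = {ex_kurt:.2f}")

    # Fit a lognormal and compare model vs. empirical quantiles below the 90th percentile,
    # mirroring the finding that only the lower 90% of the distribution is well described.
    shape, loc, scale = stats.lognorm.fit(ed, floc=0)
    for q in (0.1, 0.25, 0.5, 0.75, 0.9):
        emp = np.quantile(ed, q)
        mod = stats.lognorm.ppf(q, shape, loc=loc, scale=scale)
        print(f"q={q:.2f}  empirical={emp:.3f}  lognormal={mod:.3f}  rel.err={abs(mod - emp) / emp:.1%}")
    ```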

  2. Moment and maximum likelihood estimators for Weibull distributions under length- and area-biased sampling

    Treesearch

    Jeffrey H. Gove

    2003-01-01

    Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...

  3. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  4. The beta distribution: A statistical model for world cloud cover

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1973-01-01

    Much work has been performed in developing empirical global cloud cover models. This investigation was made to determine an underlying theoretical statistical distribution to represent worldwide cloud cover. The beta distribution with probability density function is given to represent the variability of this random variable. It is shown that the beta distribution possesses the versatile statistical characteristics necessary to assume the wide variety of shapes exhibited by cloud cover. A total of 160 representative empirical cloud cover distributions were investigated and the conclusion was reached that this study provides sufficient statistical evidence to accept the beta probability distribution as the underlying model for world cloud cover.
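
    As an illustration of the fitting step described above, a short scipy sketch on synthetic cloud-cover fractions (the 160 empirical distributions are not reproduced here):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic cloud-cover fractions (stand-in for station climatology); clip away exact 0/1.
    cover = np.clip(rng.beta(0.4, 0.6, size=2000), 1e-3, 1 - 1e-3)

    # Fit a beta distribution on the fixed support [0, 1].
    a, b, loc, scale = stats.beta.fit(cover, floc=0, fscale=1)
    print(f"fitted shape parameters: a = {a:.3f}, b = {b:.3f}")

    # Goodness-of-fit check against the fitted model.
    ks = stats.kstest(cover, "beta", args=(a, b, loc, scale))
    print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")
    ```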

  5. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.

  6. How to model a negligible probability under the WTO sanitary and phytosanitary agreement?

    PubMed

    Powell, Mark R

    2013-06-01

    Since the 1997 EC--Hormones decision, World Trade Organization (WTO) Dispute Settlement Panels have wrestled with the question of what constitutes a negligible risk under the Sanitary and Phytosanitary Agreement. More recently, the 2010 WTO Australia--Apples Panel focused considerable attention on the appropriate quantitative model for a negligible probability in a risk assessment. The 2006 Australian Import Risk Analysis for Apples from New Zealand translated narrative probability statements into quantitative ranges. The uncertainty about a "negligible" probability was characterized as a uniform distribution with a minimum value of zero and a maximum value of 10⁻⁶. The Australia--Apples Panel found that the use of this distribution would tend to overestimate the likelihood of "negligible" events and indicated that a triangular distribution with a most probable value of zero and a maximum value of 10⁻⁶ would correct the bias. The Panel observed that the midpoint of the uniform distribution is 5 × 10⁻⁷ but did not consider that the triangular distribution has an expected value of 3.3 × 10⁻⁷. Therefore, if this triangular distribution is the appropriate correction, the magnitude of the bias found by the Panel appears modest. The Panel's detailed critique of the Australian risk assessment, and the conclusions of the WTO Appellate Body about the materiality of flaws found by the Panel, may have important implications for the standard of review for risk assessments under the WTO SPS Agreement. © 2012 Society for Risk Analysis.
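
    The Panel's comparison comes down to two simple expected values, which a few lines of numpy verify (the Monte Carlo is purely illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000

    # Uniform(0, 1e-6): expected value is the midpoint, 5e-7.
    uniform_draws = rng.uniform(0.0, 1e-6, n)

    # Triangular with minimum 0, most probable value 0, maximum 1e-6:
    # expected value is (0 + 0 + 1e-6) / 3 = 3.33e-7.
    triangular_draws = rng.triangular(0.0, 0.0, 1e-6, n)

    print(f"uniform mean    = {uniform_draws.mean():.3e}  (exact 5.0e-07)")
    print(f"triangular mean = {triangular_draws.mean():.3e}  (exact 3.3e-07)")
    ```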

  7. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE PAGES

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    2016-04-07

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.
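
    A small illustrative check (assumed example, numpy only) of the connection named in the title: sampling the reciprocal distribution p(x) ∝ 1/x on [1, 10) reproduces Benford's first-digit law:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Inverse-CDF sampling of the reciprocal distribution on [1, 10): x = 10**U with U ~ Uniform(0, 1).
    x = 10.0 ** rng.uniform(0.0, 1.0, size=1_000_000)
    first_digit = x.astype(int)             # values lie in [1, 10), so truncation gives the leading digit

    digits = np.arange(1, 10)
    empirical = np.array([(first_digit == d).mean() for d in digits])
    benford = np.log10(1.0 + 1.0 / digits)  # Benford's law: P(d) = log10(1 + 1/d)

    for d, e, b in zip(digits, empirical, benford):
        print(f"digit {d}: observed {e:.4f}  Benford {b:.4f}")
    ```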

  8. Work probability distribution and tossing a biased coin

    NASA Astrophysics Data System (ADS)

    Saha, Arnab; Bhattacharjee, Jayanta K.; Chakraborty, Sagar

    2011-01-01

    We show that the rare events present in the dissipated work that enters the Jarzynski equality, when mapped appropriately to the phenomenon of large deviations found in a biased coin toss, are enough to yield a quantitative work probability distribution for the Jarzynski equality. This allows us to propose a recipe for constructing work probability distributions independent of the details of any relevant system. The underlying framework, developed herein, is expected to be of use in modeling other physical phenomena where rare events play an important role.

  9. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: II. Fatigue crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.

    2011-07-01

    This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of subcritical crack. This law is then used to relate the probability distribution of critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, physical explanation is provided to the experimentally observed systematic deviations of lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.

  10. Hybrid Approaches and Industrial Applications of Pattern Recognition,

    DTIC Science & Technology

    1980-10-01

    emphasized that the probability distribution in (9) is correct only under the assumption that P(w|x) is known exactly. In practice this assumption will...sufficient precision. The alternative would be to take the probability distribution of estimates of P(w|x) into account in the analysis. However, from the

  11. q-Gaussian distributions and multiplicative stochastic processes for analysis of multiple financial time series

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2010-12-01

    This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case a q-Gaussian distribution can be theoretically derived as a stationary probability distribution of the multiplicative stochastic differential equation with both mutually independent multiplicative and additive noises. Using this stochastic differential equation, a method to evaluate a default probability under a given risk buffer is proposed.
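
    A rough Euler-Maruyama sketch, with illustrative parameters rather than the paper's calibration, of a one-dimensional process with mutually independent multiplicative and additive noises; its stationary histogram is heavy tailed and can be compared against a Student-t (q-Gaussian) fit:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    gamma, sigma_m, sigma_a = 1.0, 0.7, 0.3        # illustrative drift and noise amplitudes
    dt, n_steps, n_burn = 1e-3, 500_000, 50_000

    x = np.empty(n_steps)
    x[0] = 0.0
    dw = rng.standard_normal((n_steps - 1, 2)) * np.sqrt(dt)
    for i in range(n_steps - 1):
        # dX = -gamma*X dt + sigma_m*X dW1 + sigma_a dW2   (Euler-Maruyama step)
        x[i + 1] = x[i] - gamma * x[i] * dt + sigma_m * x[i] * dw[i, 0] + sigma_a * dw[i, 1]

    sample = x[n_burn:]
    df, loc, scale = stats.t.fit(sample)           # Student-t is the q-Gaussian family in another parametrization
    print(f"fitted tail parameter (degrees of freedom) = {df:.2f}")
    print(f"sample excess kurtosis = {stats.kurtosis(sample):.2f}  (0 for a Gaussian)")
    ```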

  12. Modeling highway travel time distribution with conditional probability models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless based global positioning systems. These telemetric data systems are subscribed and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains truck location, its travel time, and a clock time/date stamp. Data from the FPM program provides a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as different time of the day and day of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstates network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distribution as new data or information is added. This methodology can be useful to estimate performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP 21).
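
    A toy sketch of the core idea on synthetic, correlated link travel times (not ATRI data): composing the route distribution while preserving link-to-link correlation through conditional pairing, versus a naive independent convolution:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 200_000

    # Synthetic travel times (minutes) on two successive Interstate segments,
    # made positively correlated via a shared Gaussian factor (stand-in for congestion).
    z_common = rng.standard_normal(n)
    t1 = np.exp(2.0 + 0.25 * (0.8 * z_common + 0.6 * rng.standard_normal(n)))
    t2 = np.exp(2.2 + 0.25 * (0.8 * z_common + 0.6 * rng.standard_normal(n)))

    # (a) Conditional composition: keep the observed pairing, i.e. draw from P(t2 | t1) implicitly.
    route_conditional = t1 + t2

    # (b) Independent convolution: shuffle one link, destroying the correlation.
    route_independent = t1 + rng.permutation(t2)

    for name, r in [("conditional", route_conditional), ("independent", route_independent)]:
        print(f"{name:12s} mean={r.mean():6.2f}  std={r.std():5.2f}  95th pct={np.quantile(r, 0.95):6.2f}")
    # The means agree, but ignoring the correlation understates the spread and the high
    # quantiles that matter for travel-time reliability.
    ```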

  13. Probability distributions for Markov chain based quantum walks

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  14. Power-law tail probabilities of drainage areas in river basins

    USGS Publications Warehouse

    Veitzer, S.A.; Troutman, B.M.; Gupta, V.K.

    2003-01-01

    The significance of power-law tail probabilities of drainage areas in river basins was discussed. The convergence to a power law was not observed for all underlying distributions, but for a large class of statistical distributions with specific limiting properties. The article also discussed the scaling properties of topologic and geometric network properties in river basins.

  15. Statistical Characterization of the Mechanical Parameters of Intact Rock Under Triaxial Compression: An Experimental Proof of the Jinping Marble

    NASA Astrophysics Data System (ADS)

    Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo

    2016-12-01

    We investigated the statistical characteristics and probability distribution of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested under each of five different levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution form of several important mechanical parameters, including deformational parameters, characteristic strength, characteristic strains, and failure angle. The statistical proofs relating to the mechanical parameters of rock presented new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.

  16. Newsvendor problem under complete uncertainty: a case of innovative products.

    PubMed

    Gaspars-Wieloch, Helena

    2017-01-01

    The paper presents a new scenario-based decision rule for the classical version of the newsvendor problem (NP) under complete uncertainty (i.e. uncertainty with unknown probabilities). So far, NP has been analyzed under uncertainty with known probabilities or under uncertainty with partial information (probabilities known incompletely). The novel approach is designed for the sale of new, innovative products, where it is quite complicated to define probabilities or even probability-like quantities, because there are no data available for forecasting the upcoming demand via statistical analysis. The new procedure described in the contribution is based on a hybrid of Hurwicz and Bayes decision rules. It takes into account the decision maker's attitude towards risk (measured by coefficients of optimism and pessimism) and the dispersion (asymmetry, range, frequency of extreme values) of payoffs connected with particular order quantities. It does not require any information about the probability distribution.

  17. On the inequivalence of the CH and CHSH inequalities due to finite statistics

    NASA Astrophysics Data System (ADS)

    Renou, M. O.; Rosset, D.; Martin, A.; Gisin, N.

    2017-06-01

    Different variants of a Bell inequality, such as CHSH and CH, are known to be equivalent when evaluated on nonsignaling outcome probability distributions. However, in experimental setups, the outcome probability distributions are estimated using a finite number of samples. Therefore the nonsignaling conditions are only approximately satisfied and the robustness of the violation depends on the chosen inequality variant. We explain that phenomenon using the decomposition of the space of outcome probability distributions under the action of the symmetry group of the scenario, and propose a method to optimize the statistical robustness of a Bell inequality. In the process, we describe the finite group composed of relabeling of parties, measurement settings and outcomes, and identify correspondences between the irreducible representations of this group and properties of outcome probability distributions such as normalization, signaling or having uniform marginals.

  18. Extended Poisson process modelling and analysis of grouped binary data.

    PubMed

    Faddy, Malcolm J; Smith, David M

    2012-05-01

    A simple extension of the Poisson process results in binomially distributed counts of events in a time interval. A further extension generalises this to probability distributions under- or over-dispersed relative to the binomial distribution. Substantial levels of under-dispersion are possible with this modelling, but only modest levels of over-dispersion - up to Poisson-like variation. Although simple analytical expressions for the moments of these probability distributions are not available, approximate expressions for the mean and variance are derived, and used to re-parameterise the models. The modelling is applied in the analysis of two published data sets, one showing under-dispersion and the other over-dispersion. More appropriate assessment of the precision of estimated parameters and reliable model checking diagnostics follow from this more general modelling of these data sets. © 2012 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Derivation of low flow frequency distributions under human activities and its implications

    NASA Astrophysics Data System (ADS)

    Gao, Shida; Liu, Pan; Pan, Zhengke; Ming, Bo; Guo, Shenglian; Xiong, Lihua

    2017-06-01

    Low flow, which refers to the minimum streamflow in dry seasons, is crucial to water supply, agricultural irrigation and navigation. Human activities, such as groundwater pumping, influence low flow severely. In order to derive the low flow frequency distribution functions under human activities, this study incorporates groundwater pumping and return flow as variables in the recession process. Steps are as follows: (1) the original low flow without human activities is assumed to follow a Pearson type three distribution, (2) the probability distribution of climatic dry spell periods is derived based on a base flow recession model, (3) the base flow recession model is updated under human activities, and (4) the low flow distribution under human activities is obtained based on the derived probability distribution of dry spell periods and the updated base flow recession model. Linear and nonlinear reservoir models are used to describe the base flow recession, respectively. The Wudinghe basin is chosen for the case study, with daily streamflow observations during 1958-2000. Results show that human activities change the location parameter of the low flow frequency curve for the linear reservoir model, while altering the frequency distribution function for the nonlinear one. It is indicated that altering the parameters of the low flow frequency distribution is not always feasible for tackling the changing environment.

  20. Force Density Function Relationships in 2-D Granular Media

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.

    2004-01-01

    An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms
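
    A numerical check, under the assumptions named in the abstract (two dimensions, isotropic forces, exponential force magnitude distribution with unit mean), that the Cartesian component density takes the modified-Bessel form K0(|fx|)/π; scipy.special.k0 supplies K0:

    ```python
    import numpy as np
    from scipy.special import k0

    rng = np.random.default_rng(6)
    n = 2_000_000

    # Isotropic 2-D contact forces with exponential magnitude distribution P(f) = exp(-f).
    f = rng.exponential(1.0, n)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    fx = f * np.cos(theta)

    # Empirical density of the Cartesian component vs. the analytic form K0(|fx|)/pi.
    hist, edges = np.histogram(fx, bins=np.linspace(-4.0, 4.0, 81), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    analytic = k0(np.abs(centers)) / np.pi

    for c, h, a in zip(centers[4::10], hist[4::10], analytic[4::10]):
        print(f"fx = {c:+.2f}   empirical {h:.4f}   K0(|fx|)/pi {a:.4f}")
    ```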

  1. Most recent common ancestor probability distributions in gene genealogies under selection.

    PubMed

    Slade, P F

    2000-12-01

    A computational study is made of the conditional probability distribution for the allelic type of the most recent common ancestor in genealogies of samples of n genes drawn from a population under selection, given the initial sample configuration. Comparisons with the corresponding unconditional cases are presented. Such unconditional distributions differ from samples drawn from the unique stationary distribution of population allelic frequencies, known as Wright's formula, and are quantified. Biallelic haploid and diploid models are considered. A simplified structure for the ancestral selection graph of S. M. Krone and C. Neuhauser (1997, Theor. Popul. Biol. 51, 210-237) is enhanced further, reducing the effective branching rate in the graph. This improves efficiency of such a nonneutral analogue of the coalescent for use with computational likelihood-inference techniques.

  2. Stylized facts in internal rates of return on stock index and its derivative transactions

    NASA Astrophysics Data System (ADS)

    Pichl, Lukáš; Kaizoji, Taisei; Yamano, Takuya

    2007-08-01

    Universal features in stock markets and their derivative markets are studied by means of probability distributions in internal rates of return on buy and sell transaction pairs. Unlike the stylized facts in normalized log returns, the probability distributions for such single asset encounters incorporate the time factor by means of the internal rate of return, defined as the continuous compound interest. Resulting stylized facts are shown in the probability distributions derived from the daily series of TOPIX, S&P 500 and FTSE 100 index close values. The application of the above analysis to minute-tick data of NIKKEI 225 and its futures market, respectively, reveals an interesting difference in the behavior of the two probability distributions when a threshold on the minimal duration of the long position is imposed. It is therefore suggested that the probability distributions of the internal rates of return could be used for causality mining between the underlying and derivative stock markets. The highly specific discrete spectrum, which results from noise trader strategies, as opposed to the smooth distributions observed for fundamentalist strategies in single encounter transactions, may be useful in deducing the type of investment strategy from trading revenues of small portfolio investors.

  3. Probabilistic Reasoning for Robustness in Automated Planning

    NASA Technical Reports Server (NTRS)

    Schaffer, Steven; Clement, Bradley; Chien, Steve

    2007-01-01

    A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as times taken to perform tasks and amounts of resources to be consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain a probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive a search toward planning low-risk actions. An output plan provides a balance between the user's specified aversion to risk and other measures of optimality.
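
    A minimal sketch of the combination step for the Gaussian case (ignoring the bounding of the distributions, which the actual planner handles): means and variances of independent Gaussian resource usages add, and the conflict probability is a normal tail probability:

    ```python
    from scipy.stats import norm

    # Each action consumes a resource with a Gaussian amount (mean, standard deviation); illustrative values.
    actions = [(10.0, 2.0), (25.0, 4.0), (7.5, 1.0)]
    resource_limit = 50.0

    # Sum of independent Gaussians is Gaussian: means add, variances add.
    total_mean = sum(m for m, s in actions)
    total_std = sum(s**2 for m, s in actions) ** 0.5

    # Probability that total usage exceeds the limit = conflict probability used to score the plan.
    p_conflict = norm.sf(resource_limit, loc=total_mean, scale=total_std)
    print(f"total usage ~ N({total_mean:.1f}, {total_std:.2f}^2); P(conflict) = {p_conflict:.4f}")
    ```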

  4. Quasi-probabilities in conditioned quantum measurement and a geometric/statistical interpretation of Aharonov's weak value

    NASA Astrophysics Data System (ADS)

    Lee, Jaeha; Tsutsui, Izumi

    2017-05-01

    We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, “conditionings” and “correlations”. The weak value Aw for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.

  5. Investigation of Bose-Einstein Condensates in q-Deformed Potentials with First Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Nutku, Ferhat; Aydıner, Ekrem

    2018-02-01

    The Gross-Pitaevskii equation, which is the governing equation of Bose-Einstein condensates, is solved by a first order perturbation expansion under various q-deformed potentials. Stationary probability distributions reveal one- and two-soliton behavior depending on the type of the q-deformed potential. Additionally, a spatial shift of the probability distribution is found for the dark soliton solution when the q parameter is changed.

  6. Discriminating between Light- and Heavy-Tailed Distributions with Limit Theorem.

    PubMed

    Burnecki, Krzysztof; Wylomanska, Agnieszka; Chechkin, Aleksei

    2015-01-01

    In this paper we propose an algorithm to distinguish between light- and heavy-tailed probability laws underlying random datasets. The idea of the algorithm, which is visual and easy to implement, is to check whether the underlying law belongs to the domain of attraction of the Gaussian or non-Gaussian stable distribution by examining its rate of convergence. The method makes it possible to discriminate between stable and various non-stable distributions. The test allows one to differentiate between distributions which appear the same according to the standard Kolmogorov-Smirnov test. In particular, it helps to distinguish between stable and Student's t probability laws as well as between the stable and tempered stable, the cases which are considered in the literature as very cumbersome. Finally, we illustrate the procedure on plasma data to identify cases with so-called L-H transition.

  7. Discriminating between Light- and Heavy-Tailed Distributions with Limit Theorem

    PubMed Central

    Chechkin, Aleksei

    2015-01-01

    In this paper we propose an algorithm to distinguish between light- and heavy-tailed probability laws underlying random datasets. The idea of the algorithm, which is visual and easy to implement, is to check whether the underlying law belongs to the domain of attraction of the Gaussian or non-Gaussian stable distribution by examining its rate of convergence. The method makes it possible to discriminate between stable and various non-stable distributions. The test allows one to differentiate between distributions which appear the same according to the standard Kolmogorov–Smirnov test. In particular, it helps to distinguish between stable and Student’s t probability laws as well as between the stable and tempered stable, the cases which are considered in the literature as very cumbersome. Finally, we illustrate the procedure on plasma data to identify cases with so-called L-H transition. PMID:26698863

  8. Using the weighted area under the net benefit curve for decision curve analysis.

    PubMed

    Talluri, Rajesh; Shete, Sanjay

    2016-07-18

    Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
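
    A rough sketch of the quantities involved, with hypothetical predictions and outcomes and a stand-in Beta density for the threshold-probability distribution (the paper estimates that distribution from the data itself):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Hypothetical risk predictions and binary outcomes for n patients.
    n = 5000
    risk = rng.beta(2, 5, n)
    outcome = rng.random(n) < risk

    def net_benefit(pred, y, pt):
        """Standard decision-curve net benefit at threshold probability pt."""
        treat = pred >= pt
        tp = np.sum(treat & y)
        fp = np.sum(treat & ~y)
        return tp / len(y) - fp / len(y) * pt / (1.0 - pt)

    pts = np.linspace(0.01, 0.50, 100)          # range of threshold probabilities of interest
    nb = np.array([net_benefit(risk, outcome, pt) for pt in pts])

    # Unweighted area (implicitly uniform thresholds) vs. area weighted by an assumed threshold density.
    w = stats.beta.pdf(pts, 2, 8)
    area_uniform = np.trapz(nb, pts)
    area_weighted = np.trapz(nb * w, pts) / np.trapz(w, pts)
    print(f"area under NB curve (uniform weights) = {area_uniform:.4f}")
    print(f"weighted area under NB curve          = {area_weighted:.4f}")
    ```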

  9. [Establishment of the mathematic model of total quantum statistical moment standard similarity for application to medical theoretical research].

    PubMed

    He, Fu-yuan; Deng, Kai-wen; Huang, Sheng; Liu, Wen-long; Shi, Ji-lian

    2013-09-01

    The paper aims to establish a new mathematical model, the total quantum statistical moment standard similarity (TQSMSS), on the basis of the original total quantum statistical moment (TQSM) model, and to illustrate its application to medical theoretical research. The model was built by combining the statistical moment principle with the properties of the normal distribution probability density function, and was then validated and illustrated with the pharmacokinetics of three ingredients in Buyanghuanwu decoction (and three data analytical methods for them) and with the analysis of chromatographic fingerprints of extracts obtained by dissolving the Buyanghuanwu decoction extract in solvents of different solubility parameters. The established model consists of the following main parameters: (1) the total quantum statistical moment similarity ST, the area of overlap between the two normal distribution probability density curves obtained by converting the two sets of TQSM parameters; (2) the total variability DT, a confidence limit of the standard normal cumulative probability, equal to the absolute difference between the two normal cumulative probabilities integrated up to the intersection point of the curves; (3) the total variable probability 1-Ss, the standard normal distribution probability within the interval DT; (4) the total variable probability (1-beta)alpha; and (5) the stable confident probability beta(1-alpha), the probability of correctly drawing positive and negative conclusions under the confidence coefficient alpha. With the model, the TQSMSS values for the pharmacokinetics of the three ingredients in Buyanghuanwu decoction and the three data analytical methods for them ranged from 0.3852 to 0.9875, illuminating their different pharmacokinetic behaviors; the TQSMSS values (ST) of the chromatographic fingerprints for extracts obtained with solvents of different solubility parameters ranged from 0.6842 to 0.9992, showing the different constituents obtained with the various solvent extracts. The TQSMSS can characterize sample similarity, allowing the probability of correct positive and negative conclusions to be quantified with a power test regardless of whether the samples come from the same population under the confidence coefficient alpha, and enabling analysis at both macroscopic and microscopic levels, as an important similarity analysis method for medical theoretical research.
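
    A minimal numerical sketch, with illustrative parameters only, of the overlapped-area similarity ST: the area under the pointwise minimum of two normal probability density curves:

    ```python
    import numpy as np
    from scipy import stats

    # Two normal densities standing in for the converted TQSM parameters of two samples.
    mu1, sd1 = 10.0, 2.0
    mu2, sd2 = 12.0, 2.5

    x = np.linspace(min(mu1, mu2) - 6 * max(sd1, sd2), max(mu1, mu2) + 6 * max(sd1, sd2), 20001)
    f1 = stats.norm.pdf(x, mu1, sd1)
    f2 = stats.norm.pdf(x, mu2, sd2)

    # ST: overlapped area of the two probability density curves (1.0 means identical, 0.0 disjoint).
    st = np.trapz(np.minimum(f1, f2), x)
    print(f"overlap-area similarity ST = {st:.4f}")
    ```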

  10. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.

  11. Characterising RNA secondary structure space using information entropy

    PubMed Central

    2013-01-01

    Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905

  12. Streamflow distribution maps for the Cannon River drainage basin, southeast Minnesota, and the St. Louis River drainage basin, northeast Minnesota

    USGS Publications Warehouse

    Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.

    2017-12-27

    Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface-water appropriations. Presented streamflow distribution maps are foundational work intended to support the development of additional streamflow distribution maps that include statistical constraints on the selected flow conditions.
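
    A short sketch of how the listed exceedance-probability quantiles relate to a flow-duration curve, using a synthetic daily streamflow record (numpy only):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    # Synthetic daily streamflow record (cubic feet per second), lognormal as a rough stand-in.
    q_daily = rng.lognormal(mean=4.0, sigma=1.0, size=365 * 30)

    # A flow exceeded with probability p corresponds to the (1 - p) quantile of the record.
    for p, label in [(0.95, "extreme low flow"), (0.90, "low flow"), (0.50, "median"), (0.02, "high flow")]:
        q = np.quantile(q_daily, 1.0 - p)
        print(f"exceedance probability {p:.2f} ({label:16s}): {q:8.1f} cfs")
    ```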

  13. The effect of microscopic friction and size distributions on conditional probability distributions in soft particle packings

    NASA Astrophysics Data System (ADS)

    Saitoh, Kuniyasu; Magnanimo, Vanessa; Luding, Stefan

    2017-10-01

    Employing two-dimensional molecular dynamics (MD) simulations of soft particles, we study their non-affine responses to quasi-static isotropic compression where the effects of microscopic friction between the particles in contact and particle size distributions are examined. To quantify complicated restructuring of force-chain networks under isotropic compression, we introduce the conditional probability distributions (CPDs) of particle overlaps such that a master equation for distribution of overlaps in the soft particle packings can be constructed. From our MD simulations, we observe that the CPDs are well described by q-Gaussian distributions, where we find that the correlation for the evolution of particle overlaps is suppressed by microscopic friction, while it significantly increases with the increase of poly-dispersity.

  14. Bayesian soft X-ray tomography using non-stationary Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Li, Dong; Svensson, J.; Thomsen, H.; Medina, F.; Werner, A.; Wolf, R.

    2013-08-01

    In this study, a Bayesian based non-stationary Gaussian Process (GP) method for the inference of soft X-ray emissivity distribution along with its associated uncertainties has been developed. For the investigation of equilibrium condition and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is of importance to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in a probability form which enhances the capability of uncertainty analysis; as a consequence, scientists concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreements with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
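
    A compact sketch of the analytic posterior for a linear-Gaussian inverse problem of this kind; for brevity it uses a stationary squared-exponential kernel and a crude band-averaging operator in place of the paper's non-stationary GP and real chord geometry:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # 1-D emissivity profile on a grid, observed through a linear (line-integral-like) operator A.
    n_grid, n_chords = 100, 12
    xs = np.linspace(0.0, 1.0, n_grid)
    true_profile = np.exp(-((xs - 0.5) ** 2) / 0.02)

    A = np.zeros((n_chords, n_grid))
    for i in range(n_chords):                     # each "chord" averages a band of grid cells
        lo = i * n_grid // n_chords
        A[i, lo:lo + n_grid // n_chords] = 1.0 / (n_grid // n_chords)

    sigma_noise = 0.02
    y = A @ true_profile + sigma_noise * rng.standard_normal(n_chords)

    # GP prior f ~ N(0, K) with a stationary squared-exponential kernel (the paper's point is
    # that a non-stationary kernel adapts the length scale spatially; that is omitted here).
    ell, amp = 0.1, 1.0
    K = amp * np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / ell**2)

    # Linear-Gaussian posterior: mean = K A^T (A K A^T + s^2 I)^-1 y,
    # covariance = K - K A^T (A K A^T + s^2 I)^-1 A K.  Both are analytic, so inversion is fast.
    S = A @ K @ A.T + sigma_noise**2 * np.eye(n_chords)
    G = K @ A.T @ np.linalg.inv(S)
    post_mean = G @ y
    post_std = np.sqrt(np.clip(np.diag(K - G @ A @ K), 0.0, None))

    print("max abs error of posterior mean:", np.abs(post_mean - true_profile).max())
    print("mean posterior std:", post_std.mean())
    ```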

  15. Bayesian soft X-ray tomography using non-stationary Gaussian Processes.

    PubMed

    Li, Dong; Svensson, J; Thomsen, H; Medina, F; Werner, A; Wolf, R

    2013-08-01

    In this study, a Bayesian based non-stationary Gaussian Process (GP) method for the inference of soft X-ray emissivity distribution along with its associated uncertainties has been developed. For the investigation of equilibrium condition and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is of importance to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in a probability form which enhances the capability of uncertainty analysis; as a consequence, scientists concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreements with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.

  16. Universal characteristics of fractal fluctuations in prime number distribution

    NASA Astrophysics Data System (ADS)

    Selvam, A. M.

    2014-11-01

    The frequency of occurrence of prime numbers at unit number spacing intervals exhibits self-similar fractal fluctuations concomitant with inverse power law form for power spectrum generic to dynamical systems in nature such as fluid flows, stock market fluctuations and population dynamics. The physics of long-range correlations exhibited by fractals is not yet identified. A recently developed general systems theory visualizes the eddy continuum underlying fractals to result from the growth of large eddies as the integrated mean of enclosed small scale eddies, thereby generating a hierarchy of eddy circulations or an inter-connected network with associated long-range correlations. The model predictions are as follows: (1) The probability distribution and power spectrum of fractals follow the same inverse power law which is a function of the golden mean. The predicted inverse power law distribution is very close to the statistical normal distribution for fluctuations within two standard deviations from the mean of the distribution. (2) Fractals signify quantum-like chaos since variance spectrum represents probability density distribution, a characteristic of quantum systems such as electron or photon. (3) Fractal fluctuations of frequency distribution of prime numbers signify spontaneous organization of underlying continuum number field into the ordered pattern of the quasiperiodic Penrose tiling pattern. The model predictions are in agreement with the probability distributions and power spectra for different sets of frequency of occurrence of prime numbers at unit number interval for successive 1000 numbers. Prime numbers in the first 10 million numbers were used for the study.

  17. Testing option pricing with the Edgeworth expansion

    NASA Astrophysics Data System (ADS)

    Balieiro Filho, Ruy Gabriel; Rosenfeld, Rogerio

    2004-12-01

    There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distributions of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments.

  18. Probability Analysis of the Wave-Slamming Pressure Values of the Horizontal Deck with Elastic Support

    NASA Astrophysics Data System (ADS)

    Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao

    2018-06-01

    This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses on experimental data. The exceeding probability distribution of the maximum slamming pressure peak and distribution parameters were analyzed, and the results show that the exceeding probability distribution of the maximum slamming pressure peak accords with the three-parameter Weibull distribution. Furthermore, the range and relationships of the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceeding probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and slamming pressure under different model conditions were comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks were different due to different test models. The parameter values were found to decrease due to the increased stiffness of the elastic support. The damage criterion of the structure model caused by the wave impact was initially discussed, and the structure model was destroyed when the average slamming time was greater than a certain value during the duration of the wave impact. The conclusions of the experimental study were then described.

  19. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    PubMed

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
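
    A minimal sketch of the Bayes'-theorem relation discussed above, with hypothetical values for the two quantities the authors note must be known or estimated (the population copying rate and the exceedance probability of the statistic under copying):

    ```python
    def posterior_copying(p_value, prior_copy, power_at_threshold):
        """
        Posterior probability of copying given that the answer-copying statistic
        reached a value at least as extreme as observed.

        p_value:            P(statistic >= observed | no copying)
        prior_copy:         population probability of copying (must be estimated)
        power_at_threshold: P(statistic >= observed | copying)   (must be estimated)
        """
        num = prior_copy * power_at_threshold
        den = num + (1.0 - prior_copy) * p_value
        return num / den

    # Hypothetical numbers: a small p value alone does not imply a high posterior
    # probability of copying when copying is rare in the population.
    print(posterior_copying(p_value=0.001, prior_copy=0.01, power_at_threshold=0.60))   # ~0.86
    print(posterior_copying(p_value=0.001, prior_copy=0.001, power_at_threshold=0.60))  # ~0.38
    ```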

  20. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
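
    A Monte Carlo sketch of the paper's central point for the log-normal case (sample sizes and parameters are illustrative): when the threshold is set at a quantile estimated from finite data, the realized failure frequency exceeds the nominal failure probability:

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    nominal_failure = 0.01          # target failure probability
    n_data = 30                     # size of the data set used to estimate the parameters
    n_trials = 20000

    failures = 0
    for _ in range(n_trials):
        # True risk-factor distribution: lognormal with parameters unknown to the decisionmaker.
        sample = rng.lognormal(mean=0.0, sigma=1.0, size=n_data)
        mu_hat, sigma_hat = np.mean(np.log(sample)), np.std(np.log(sample), ddof=1)
        # Threshold set at the estimated (1 - nominal) quantile of the fitted lognormal.
        threshold = np.exp(mu_hat + sigma_hat * 2.3263)   # z_{0.99} ~= 2.3263
        # A new realization of the risk factor; failure if it exceeds the threshold.
        failures += rng.lognormal(mean=0.0, sigma=1.0) > threshold

    print(f"nominal failure probability: {nominal_failure:.3f}")
    print(f"realized failure frequency : {failures / n_trials:.3f}")   # typically noticeably larger
    ```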

  1. Individual heterogeneity and identifiability in capture-recapture models

    USGS Publications Warehouse

    Link, W.A.

    2004-01-01

    Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with pi = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for pi, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.

  2. We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact

    PubMed Central

    Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.

    2014-01-01

    What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073

  3. Computer program determines exact two-sided tolerance limits for normal distributions

    NASA Technical Reports Server (NTRS)

    Friedman, H. A.; Webb, S. R.

    1968-01-01

    Computer program determines, by numerical integration, the exact statistical two-sided tolerance limits for which the proportion of the population between the limits is at least a specified value. The program is limited to situations in which the underlying probability distribution of the sampled population is the normal distribution with unknown mean and variance.
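
    For orientation, a sketch of Howe's closed-form approximation to the two-sided normal tolerance factor; this approximates, but is not, the exact numerical integration performed by the 1968 program, and the sample data are hypothetical.

```python
import numpy as np
from scipy.stats import norm, chi2

def two_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Howe's approximation to the factor k such that xbar +/- k*s covers
    at least `coverage` of a normal population with the stated confidence."""
    z = norm.ppf((1 + coverage) / 2)
    chi2_low = chi2.ppf(1 - confidence, df=n - 1)   # lower chi-square quantile
    return np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2_low)

x = np.random.default_rng(1).normal(10.0, 2.0, size=30)   # hypothetical sample
k = two_sided_tolerance_factor(len(x))
lo, hi = x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)
print(f"approximate two-sided tolerance limits: ({lo:.2f}, {hi:.2f})")
```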

  4. Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multiple Conditions

    DTIC Science & Technology

    2009-03-01

    United States Air Force, Department of Defense, or the United States Government . AFIT/GE/ENG/09-23 Low Probability of Intercept Waveforms via...21 D random variable governing the distribution of dither values 21 p (ct) D (t) probability density function of the...potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath. 1.3 Thesis

  5. Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multipath Conditions

    DTIC Science & Technology

    2009-03-01

    United States Air Force, Department of Defense, or the United States Government . AFIT/GE/ENG/09-23 Low Probability of Intercept Waveforms via...21 D random variable governing the distribution of dither values 21 p (ct) D (t) probability density function of the...potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath. 1.3 Thesis

  6. A Simple Method for Estimating Informative Node Age Priors for the Fossil Calibration of Molecular Divergence Time Analyses

    PubMed Central

    Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.

    2013-01-01

    Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303

  7. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. An additional transition, hereafter called a gate, powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emerging under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. Other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and changes in the network size, are studied by means of computer simulations and discussed in terms of the rigorous theoretical predictions.

  8. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or fix the form of the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that: (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit tests confirm that the re-derived marginal distributions reproduce the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution can capture dependence structures that cannot be captured by conventional bivariate joint distributions. [Figure: joint rainfall-runoff entropy-based PDF with corresponding marginal PDFs and histograms for the W12 watershed; table: K-S test results and RMSE for the univariate distributions derived from the maximum-entropy-based joint probability distribution.]
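
    A sketch of step (i), the univariate marginal analysis, assuming synthetic gamma-distributed rainfall depths in place of the Riesel watershed observations; the kernel-density mode detection and gamma fit stand in for the nonparametric and parametric steps described above.

```python
import numpy as np
from scipy import stats

# hypothetical rainfall depths (mm) standing in for the observed series
rain = np.random.default_rng(2).gamma(shape=2.0, scale=15.0, size=200)

# step (i)(a): nonparametric look at the underlying density and its mode
kde = stats.gaussian_kde(rain)
grid = np.linspace(rain.min(), rain.max(), 200)
mode = grid[np.argmax(kde(grid))]

# step (i)(b): fit a parametric gamma marginal and check it with a K-S test
shape, loc, scale = stats.gamma.fit(rain, floc=0)
ks = stats.kstest(rain, "gamma", args=(shape, loc, scale))
print(f"mode ~ {mode:.1f} mm, gamma(shape={shape:.2f}, scale={scale:.2f}), K-S p = {ks.pvalue:.2f}")
```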

  9. The Generalized Quantum Episodic Memory Model.

    PubMed

    Trueblood, Jennifer S; Hemmer, Pernille

    2017-11-01

    Recent evidence suggests that experienced events are often mapped to too many episodic states, including those that are logically or experimentally incompatible with one another. For example, episodic over-distribution patterns show that the probability of accepting an item under different mutually exclusive conditions violates the disjunction rule. A related example, called subadditivity, occurs when the probability of accepting an item under mutually exclusive and exhaustive instruction conditions sums to a number >1. Both the over-distribution effect and subadditivity have been widely observed in item and source-memory paradigms. These phenomena are difficult to explain using standard memory frameworks, such as signal-detection theory. A dual-trace model called the over-distribution (OD) model (Brainerd & Reyna, 2008) can explain the episodic over-distribution effect, but not subadditivity. Our goal is to develop a model that can explain both effects. In this paper, we propose the Generalized Quantum Episodic Memory (GQEM) model, which extends the Quantum Episodic Memory (QEM) model developed by Brainerd, Wang, and Reyna (2013). We test GQEM by comparing it to the OD model using data from a novel item-memory experiment and a previously published source-memory experiment (Kellen, Singmann, & Klauer, 2014) examining the over-distribution effect. Using the best-fit parameters from the over-distribution experiments, we conclude by showing that the GQEM model can also account for subadditivity. Overall these results add to a growing body of evidence suggesting that quantum probability theory is a valuable tool in modeling recognition memory. Copyright © 2016 Cognitive Science Society, Inc.

  10. A Computer Program to Evaluate Timber Production Investments Under Uncertainty

    Treesearch

    Dennis L. Schweitzer

    1968-01-01

    A computer program has been written in Fortran IV to calculate probability distributions of present worths of investments in timber production. Inputs can include both point and probabilistic estimates of future costs, prices, and yields. Distributions of rates of return can also be constructed.
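
    A minimal Monte Carlo sketch of the kind of calculation the program performs, assuming hypothetical normal distributions for future cost, price, and yield and a single harvest at rotation age; none of the figures come from the original report.

```python
import numpy as np

rng = np.random.default_rng(3)
n, rate, years = 10_000, 0.06, 30              # hypothetical discount rate and rotation length

# probabilistic estimates of future cost, price, and yield (all hypothetical)
cost   = rng.normal(500.0, 50.0, n)            # establishment cost today ($/ha)
price  = rng.normal(40.0, 8.0, n)              # stumpage price at harvest ($/m3)
volume = rng.normal(250.0, 30.0, n)            # harvest yield (m3/ha)

present_worth = price * volume / (1 + rate) ** years - cost
print(f"mean present worth = {present_worth.mean():.0f} $/ha, "
      f"P(present worth < 0) = {(present_worth < 0).mean():.2%}")
```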

  11. On the motion of classical three-body system with consideration of quantum fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gevorkyan, A. S., E-mail: g-ashot@sci.am

    2017-03-15

    We obtained the system of stochastic differential equations (SDEs) that describes the classical motion of the three-body system under the influence of quantum fluctuations. Using these SDEs, a second-order partial differential equation was obtained for the joint probability distribution of the total momentum of the body system. It is shown that the equation for the probability distribution is solved jointly with the classical equations, which in turn are responsible for the topological peculiarities of the tubes of quantum currents, the transitions between asymptotic channels and, correspondingly, the emergence of quantum chaos.

  12. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    NASA Technical Reports Server (NTRS)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  13. Maritime Search and Rescue via Multiple Coordinated UAS

    DTIC Science & Technology

    2016-01-01

    partitioning method uses the underlying probability distribution assumptions to place that probability near the geometric center of the partitions. There...During partitioning the known locations are accommodated, but the unaccounted for objects are placed into geometrically unfavorable conditions. The...Zeitlin, A.D.: UAS Sence and Avoid Develop- ment - the Challenges of Technology, Standards, and Certification. Aerospace Sciences Meeting including

  14. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    PubMed

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
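
    A Monte Carlo sketch of the maximum Pc for a two-interval forced-choice task with non-Gaussian (gamma) observation distributions; the ideal observer compares likelihood ratios across the two intervals. This is a simulation check of the quantity the article computes with its closed-form formula, and the distributions are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 200_000

# hypothetical non-Gaussian observation distributions for noise and signal intervals
noise  = stats.gamma(a=2.0, scale=1.0)
signal = stats.gamma(a=3.0, scale=1.0)

x_s = signal.rvs(n, random_state=rng)   # observation drawn from the signal interval
x_n = noise.rvs(n, random_state=rng)    # observation drawn from the noise interval

# the maximum-likelihood observer picks the interval with the larger likelihood ratio
lr = lambda x: signal.pdf(x) / noise.pdf(x)
wins, ties = lr(x_s) > lr(x_n), lr(x_s) == lr(x_n)
print(f"Monte Carlo estimate of maximum Pc (2AFC): {wins.mean() + 0.5 * ties.mean():.3f}")
```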

  15. Voronoi cell patterns: Theoretical model and applications

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2011-11-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We use our model to describe the Voronoi cell patterns of several systems. Specifically, we study the island nucleation with irreversible attachment, the 1D car-parking problem, the formation of second-level administrative divisions, and the pattern formed by the Paris Métro stations.
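
    A short empirical illustration of the 1D Voronoi cell-size statistics for a homogeneous point set; it does not implement the fragmentation model itself, only the kind of pattern the model is meant to describe.

```python
import numpy as np

rng = np.random.default_rng(5)
pts = np.sort(rng.uniform(0.0, 1.0, 10_000))   # homogeneous point set on the unit interval

# 1D Voronoi cells: each cell runs between the midpoints of neighboring points
mids = (pts[1:] + pts[:-1]) / 2
cells = np.diff(mids)                          # interior cell sizes
cells = cells / cells.mean()                   # rescale to unit mean

print(f"normalized cell sizes: std = {cells.std():.3f}, "
      f"fraction larger than twice the mean = {(cells > 2).mean():.3f}")
```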

  16. Voronoi Cell Patterns: theoretical model and application to submonolayer growth

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2012-02-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We apply our model to describe the Voronoi cell patterns of island nucleation for critical island sizes i=0,1,2,3. Experimental results for the Voronoi cells of InAs/GaAs quantum dots are also described by our model.

  17. Financial derivative pricing under probability operator via Esscher transfomation

    NASA Astrophysics Data System (ADS)

    Achi, Godswill U.

    2014-10-01

    The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. It was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. So, in this paper, we aim at using distortion operators by the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using the distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, a price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponent φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.
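
    For reference, a minimal sketch of the standard Black-Scholes call price that the article recovers as a limiting case; the inputs are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S0, K, r, sigma, T):
    """Standard Black-Scholes price of a European call option."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# hypothetical inputs: spot 100, strike 95, 3% rate, 20% volatility, 1 year
print(f"Black-Scholes call price = {black_scholes_call(100, 95, 0.03, 0.20, 1.0):.2f}")
```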

  18. Estimating probable flaw distributions in PWR steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorman, J.A.; Turner, A.P.L.

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  19. On the issues of probability distribution of GPS carrier phase observations

    NASA Astrophysics Data System (ADS)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In common practice the observables related to Global Positioning System (GPS) are assumed to follow a Gauss-Laplace normal distribution. Actually, full knowledge of the observables' distribution is not required for parameter estimation by means of the least-squares algorithm based on the functional relation between observations and unknown parameters as well as the associated variance-covariance matrix. However, the probability distribution of GPS observations plays a key role in procedures for quality control (e.g. outlier and cycle slips detection, ambiguity resolution) and in reliability-related assessments of the estimation results. Under non-ideal observation conditions with respect to the factors impacting GPS data quality, for example multipath effects and atmospheric delays, the validity of the normal distribution postulate of GPS observations is in doubt. This paper presents a detailed analysis of the distribution properties of GPS carrier phase observations using double difference residuals. For this purpose 1-Hz observation data from the permanent SAPOS

  20. A Search Model for Imperfectly Detected Targets

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert

    2012-01-01

    Under the assumptions that 1) the search region can be divided up into N non-overlapping sub-regions that are searched sequentially, 2) the probability of detection is unity if a sub-region is selected, and 3) no information is available to guide the search, there are two extreme case models. The search can be done perfectly, leading to a uniform distribution over the number of searches required, or the search can be done with no memory, leading to a geometric distribution for the number of searches required with a success probability of 1/N. If the probability of detection P is less than unity, but the search is done otherwise perfectly, the searcher will have to search the N regions repeatedly until detection occurs. The number of searches is thus the sum of two random variables. One is N times the number of full searches (a geometric distribution with success probability P) and the other is the uniform distribution over the integers 1 to N. The first three moments of this distribution were computed, giving the mean, standard deviation, and the kurtosis of the distribution as a function of the two parameters. The model was fit to the data presented last year (Ahumada, Billington, & Kaiwi, 2 required to find a single pixel target on a simulated horizon. The model gave a good fit to the three moments for all three observers.
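
    A minimal Monte Carlo sketch of the two-component model described above (N times a geometric count of full passes plus a uniform position within the final pass); the values of N and P are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)
N, P, n = 20, 0.6, 200_000   # hypothetical number of sub-regions and detection probability

full_passes = rng.geometric(P, n) - 1    # complete unsuccessful passes over all N sub-regions
position = rng.integers(1, N + 1, n)     # where in the final pass the target region is reached
searches = N * full_passes + position

m, s = searches.mean(), searches.std()
excess_kurtosis = ((searches - m) ** 4).mean() / s**4 - 3
print(f"mean = {m:.1f}, sd = {s:.1f}, excess kurtosis = {excess_kurtosis:.2f}")
```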

  1. Gravity dual for a model of perception

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakayama, Yu, E-mail: nakayama@berkeley.edu

    2011-01-15

    One of the salient features of human perception is its invariance under dilatation in addition to the Euclidean group, but its non-invariance under special conformal transformation. We investigate a holographic approach to the information processing in image discrimination with this feature. We claim that a strongly coupled analogue of the statistical model proposed by Bialek and Zee can be holographically realized in scale invariant but non-conformal Euclidean geometries. We identify the Bayesian probability distribution of our generalized Bialek-Zee model with the GKPW partition function of the dual gravitational system. We provide a concrete example of the geometric configuration based on a vector condensation model coupled with the Euclidean Einstein-Hilbert action. From the proposed geometry, we study sample correlation functions to compute the Bayesian probability distribution.

  2. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the righthand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly, or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bound is tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
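
    A sketch of two crude, distribution-free tail bounds (Markov and Cantelli) computed from a hypothetical event loss table under a compound-Poisson assumption; these are not the tighter bounds derived in the article, but they illustrate the kind of nearly instant calculation it describes.

```python
import numpy as np

# hypothetical event loss table: annual occurrence rates and per-event losses ($m)
rates  = np.array([0.02, 0.05, 0.10, 0.30])
losses = np.array([500.0, 200.0, 80.0, 10.0])

capital = 150.0   # hypothetical current operating capital ($m)

# compound-Poisson aggregate annual loss: mean and variance straight from the table
mean_loss = np.sum(rates * losses)
var_loss  = np.sum(rates * losses**2)

markov   = mean_loss / capital
cantelli = var_loss / (var_loss + (capital - mean_loss) ** 2)   # requires capital > mean_loss
print(f"Markov bound:   P(annual loss >= capital) <= {markov:.3f}")
print(f"Cantelli bound: P(annual loss >= capital) <= {cantelli:.3f}")
```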

  3. Bayesian alternative to the ISO-GUM's use of the Welch Satterthwaite formula

    NASA Astrophysics Data System (ADS)

    Kacker, Raghu N.

    2006-02-01

    In certain disciplines, uncertainty is traditionally expressed as an interval about an estimate for the value of the measurand. Development of such uncertainty intervals with a stated coverage probability based on the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) requires a description of the probability distribution for the value of the measurand. The ISO-GUM propagates the estimates and their associated standard uncertainties for various input quantities through a linear approximation of the measurement equation to determine an estimate and its associated standard uncertainty for the value of the measurand. This procedure does not yield a probability distribution for the value of the measurand. The ISO-GUM suggests that under certain conditions motivated by the central limit theorem the distribution for the value of the measurand may be approximated by a scaled-and-shifted t-distribution with effective degrees of freedom obtained from the Welch-Satterthwaite (W-S) formula. The approximate t-distribution may then be used to develop an uncertainty interval with a stated coverage probability for the value of the measurand. We propose an approximate normal distribution based on a Bayesian uncertainty as an alternative to the t-distribution based on the W-S formula. A benefit of the approximate normal distribution based on a Bayesian uncertainty is that it greatly simplifies the expression of uncertainty by eliminating altogether the need for calculating effective degrees of freedom from the W-S formula. In the special case where the measurand is the difference between two means, each evaluated from statistical analyses of independent normally distributed measurements with unknown and possibly unequal variances, the probability distribution for the value of the measurand is known to be a Behrens-Fisher distribution. We compare the performance of the approximate normal distribution based on a Bayesian uncertainty and the approximate t-distribution based on the W-S formula with respect to the Behrens-Fisher distribution. The approximate normal distribution is simpler and better in this case. A thorough investigation of the relative performance of the two approximate distributions would require comparison for a range of measurement equations by numerical methods.
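
    A sketch of the conventional ISO-GUM route that the article proposes to simplify: the Welch-Satterthwaite effective degrees of freedom and the resulting t-based coverage factor; the component uncertainties are hypothetical.

```python
import numpy as np
from scipy import stats

def welch_satterthwaite(u, nu):
    """Effective degrees of freedom for the combined standard uncertainty
    u_c = sqrt(sum(u_i^2)), with sensitivity coefficients folded into the u_i."""
    u, nu = np.asarray(u, dtype=float), np.asarray(nu, dtype=float)
    return np.sum(u**2) ** 2 / np.sum(u**4 / nu)

u_components   = [0.12, 0.07]   # hypothetical component standard uncertainties
dof_components = [9, 14]        # their degrees of freedom

nu_eff = welch_satterthwaite(u_components, dof_components)
k = stats.t.ppf(0.975, nu_eff)   # coverage factor for an approximately 95% interval
print(f"nu_eff = {nu_eff:.1f}, coverage factor k = {k:.2f}")
```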

  4. Shape of growth-rate distribution determines the type of Non-Gibrat’s Property

    NASA Astrophysics Data System (ADS)

    Ishikawa, Atushi; Fujimoto, Shouji; Mizuno, Takayuki

    2011-11-01

    In this study, the authors examine exhaustive business data on Japanese firms, which cover nearly all companies in the mid- and large-scale ranges in terms of firm size, to reach several key findings on profits/sales distribution and business growth trends. Here, profits denote net profits. First, detailed balance is observed not only in profits data but also in sales data. Furthermore, the growth-rate distribution of sales has wider tails than the linear growth-rate distribution of profits in log-log scale. On the one hand, in the mid-scale range of profits, the probability of positive growth decreases and the probability of negative growth increases symmetrically as the initial value increases. This is called Non-Gibrat’s First Property. On the other hand, in the mid-scale range of sales, the probability of positive growth decreases as the initial value increases, while the probability of negative growth hardly changes. This is called Non-Gibrat’s Second Property. Under detailed balance, Non-Gibrat’s First and Second Properties are analytically derived from the linear and quadratic growth-rate distributions in log-log scale, respectively. In both cases, the log-normal distribution is inferred from Non-Gibrat’s Properties and detailed balance. These analytic results are verified by empirical data. Consequently, this clarifies the notion that the difference in shapes between growth-rate distributions of sales and profits is closely related to the difference between the two Non-Gibrat’s Properties in the mid-scale range.

  5. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  6. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software

  7. The ranking probability approach and its usage in design and analysis of large-scale studies.

    PubMed

    Kuo, Chia-Ling; Zaykin, Dmitri

    2013-01-01

    In experiments with many statistical tests there is a need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal α-level such as 0.05 is adjusted by the number of tests, m, i.e., as 0.05/m. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed α-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with the [Formula: see text] smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to a very good approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.

  8. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which the assumption of ignition location affects the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.

  9. A Deterministic Annealing Approach to Clustering AIRS Data

    NASA Technical Reports Server (NTRS)

    Guillaume, Alexandre; Braverman, Amy; Ruzmaikin, Alexander

    2012-01-01

    We will examine the validity of means and standard deviations as a basis for climate data products. We will explore the conditions under which these two simple statistics are inadequate summaries of the underlying empirical probability distributions by contrasting them with a nonparametric method called the Deterministic Annealing technique.

  10. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
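
    A sketch of a one-parameter probability weighting function of the kind used in prospect theory (the Tversky-Kahneman 1992 form); the parameter value is illustrative, not estimated from the fMRI data.

```python
import numpy as np

def tversky_kahneman_weight(p, gamma=0.61):
    """One-parameter probability weighting function (Tversky & Kahneman, 1992);
    gamma < 1 overweights small probabilities and underweights large ones."""
    p = np.asarray(p, dtype=float)
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

p = np.array([0.01, 0.10, 0.50, 0.90, 0.99])
print("p:    ", p)
print("w(p): ", np.round(tversky_kahneman_weight(p), 3))
```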

  11. Probability distribution of haplotype frequencies under the two-locus Wright-Fisher model by diffusion approximation.

    PubMed

    Boitard, Simon; Loisel, Patrice

    2007-05-01

    The probability distribution of haplotype frequencies in a population, and the way it is influenced by genetic forces such as recombination, selection, and random drift, is a question of fundamental interest in population genetics. For large populations, the distribution of haplotype frequencies for two linked loci under the classical Wright-Fisher model is almost impossible to compute for numerical reasons. However, the Wright-Fisher process can in such cases be approximated by a diffusion process and the transition density can then be deduced from the Kolmogorov equations. As no exact solution has been found for these equations, we developed a numerical method based on finite differences to solve them. It applies to transient states and models including selection or mutations. We show by several tests that this method is accurate for computing the conditional joint density of haplotype frequencies given that no haplotype has been lost. We also prove that it is far less time-consuming than other methods such as Monte Carlo simulations.
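
    A forward-simulation sketch of the two-locus Wright-Fisher model (multinomial drift with recombination); it illustrates the process whose diffusion limit the article solves by finite differences, and the population size, recombination rate, and generation count are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
N, r, gens = 1000, 0.01, 200      # haplotype count, recombination rate, generations

# haplotype order: AB, Ab, aB, ab
freq = np.array([0.25, 0.25, 0.25, 0.25])
for _ in range(gens):
    D = freq[0] * freq[3] - freq[1] * freq[2]                 # linkage disequilibrium
    expected = freq + r * D * np.array([-1.0, 1.0, 1.0, -1.0])
    expected /= expected.sum()                                # guard against rounding drift
    freq = rng.multinomial(N, expected) / N                   # Wright-Fisher resampling (drift)

print("haplotype frequencies after drift:", np.round(freq, 3))
```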

  12. A novel method for correcting scanline-observational bias of discontinuity orientation

    PubMed Central

    Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong

    2016-01-01

    Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (also for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, the observed sample size is suggested through effectiveness tests for different distribution types, dispersions and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249

  13. Delineating Hydrofacies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Xuehang; Chen, Xingyuan; Ye, Ming

    2015-07-01

    This study develops a new framework of facies-based data assimilation for characterizing the spatial distribution of hydrofacies and estimating their associated hydraulic properties. The framework couples ensemble data assimilation with a transition probability-based geostatistical model via a parameterization based on a level set function. The nature of ensemble data assimilation makes the framework efficient and flexible enough to be integrated with various types of observation data. The transition probability-based geostatistical model keeps the updated hydrofacies distributions under geological constraints. The framework is illustrated by using a two-dimensional synthetic study that estimates the hydrofacies spatial distribution and the permeability in each hydrofacies from transient head data. Our results show that the proposed framework can characterize hydrofacies distribution and associated permeability with adequate accuracy even with limited direct measurements of hydrofacies. Our study provides a promising starting point for hydrofacies delineation in complex real problems.

  14. Performance of mixed RF/FSO systems in exponentiated Weibull distributed channels

    NASA Astrophysics Data System (ADS)

    Zhao, Jing; Zhao, Shang-Hong; Zhao, Wei-Hu; Liu, Yun; Li, Xuan

    2017-12-01

    This paper presents the performance of an asymmetric mixed radio frequency (RF)/free-space optical (FSO) system with the amplify-and-forward relaying scheme. The RF channel undergoes Nakagami-m fading, and the exponentiated Weibull distribution is adopted for the FSO component. Mathematical formulas for the cumulative distribution function (CDF), probability density function (PDF), and moment generating function (MGF) of the equivalent signal-to-noise ratio (SNR) are derived. From the end-to-end statistical characteristics, new analytical expressions for the outage probability are obtained. For various modulation techniques, we derive the average bit-error rate (BER) based on the Meijer G-function. Evaluations and simulations of the system performance are provided, and the aperture averaging effect is discussed as well.
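
    A sketch of the exponentiated Weibull channel model used for the FSO hop: the outage probability follows directly from its closed-form CDF. The fading parameters and threshold are hypothetical.

```python
import numpy as np

def exp_weibull_cdf(x, alpha, beta, eta):
    """CDF of the exponentiated Weibull distribution,
    F(x) = [1 - exp(-(x/eta)^beta)]^alpha, often used for FSO irradiance fading."""
    x = np.asarray(x, dtype=float)
    return (1.0 - np.exp(-(x / eta) ** beta)) ** alpha

# hypothetical fading parameters and threshold: outage = P(normalized irradiance < 0.3)
print(f"outage probability = {exp_weibull_cdf(0.3, alpha=2.2, beta=1.8, eta=1.0):.4f}")
```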

  15. Directed Random Markets: Connectivity Determines Money

    NASA Astrophysics Data System (ADS)

    Martínez-Martínez, Ismael; López-Ruiz, Ricardo

    2013-12-01

    Boltzmann-Gibbs (BG) distribution arises as the statistical equilibrium probability distribution of money among the agents of a closed economic system where random and undirected exchanges are allowed. When considering a model with uniform savings in the exchanges, the final distribution is close to the gamma family. In this paper, we implement these exchange rules on networks and we find that these stationary probability distributions are robust and they are not affected by the topology of the underlying network. We introduce a new family of interactions: random but directed ones. In this case, it is found the topology to be determinant and the mean money per economic agent is related to the degree of the node representing the agent in the network. The relation between the mean money per economic agent and its degree is shown to be linear.
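
    A minimal sketch of the closed-economy random-exchange model that yields the Boltzmann-Gibbs (exponential) distribution in the undirected, zero-savings case; the agent count and step count are illustrative, and no network structure is included.

```python
import numpy as np

rng = np.random.default_rng(8)
n_agents, steps = 1000, 200_000
money = np.ones(n_agents)                 # closed system: total money is conserved

for _ in range(steps):
    i, j = rng.integers(n_agents, size=2)
    if i != j:
        pool = money[i] + money[j]
        eps = rng.random()                # random, undirected split of the pooled money
        money[i], money[j] = eps * pool, (1 - eps) * pool

# for an exponential (Boltzmann-Gibbs) distribution the std equals the mean
print(f"mean = {money.mean():.2f}, std = {money.std():.2f}")
```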

  16. Continuous-variable quantum key distribution in uniform fast-fading channels

    NASA Astrophysics Data System (ADS)

    Papanastasiou, Panagiotis; Weedbrook, Christian; Pirandola, Stefano

    2018-03-01

    We investigate the performance of several continuous-variable quantum key distribution protocols in the presence of uniform fading channels. These are lossy channels whose transmissivity changes according to a uniform probability distribution. We assume the worst-case scenario where an eavesdropper induces a fast-fading process, where she chooses the instantaneous transmissivity while the remote parties may only detect the mean statistical effect. We analyze coherent-state protocols in various configurations, including the one-way switching protocol in reverse reconciliation, the measurement-device-independent protocol in the symmetric configuration, and its extension to a three-party network. We show that, regardless of the advantage given to the eavesdropper (control of the fading), these protocols can still achieve high rates under realistic attacks, within reasonable values for the variance of the probability distribution associated with the fading process.

  17. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  18. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  19. Self-organizing behavior in a lattice model for co-evolution of virus and immune systems

    NASA Astrophysics Data System (ADS)

    Izmailian, N. Sh.; Papoyan, Vl. V.; Priezzhev, V. B.; Hu, Chin-Kun

    2007-04-01

    We propose a lattice model for the co-evolution of a virus population and an adaptive immune system. We show that, under some natural assumptions, both probability distribution of the virus population and the distribution of activity of the immune system tend during the evolution to a self-organized critical state.

  20. A superstatistical model of metastasis and cancer survival

    NASA Astrophysics Data System (ADS)

    Leon Chen, L.; Beck, Christian

    2008-05-01

    We introduce a superstatistical model for the progression statistics of malignant cancer cells. The metastatic cascade is modeled as a complex nonequilibrium system with several macroscopic pathways and inverse-chi-square distributed parameters of the underlying Poisson processes. The predictions of the model are in excellent agreement with observed survival-time probability distributions of breast cancer patients.

  1. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
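
    A sketch of the maximum entropy method of moments on a grid: the Lagrange multipliers are found by minimizing the convex dual of the entropy problem. With the first two moments of a standard normal as hypothetical constraints, the recovered density is approximately Gaussian.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# maximum-entropy density on a grid, constrained to match the first two moments
x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
f = np.vstack([x, x**2])            # constraint functions f_k(x)
m = np.array([0.0, 1.0])            # hypothetical target moments: mean 0, second moment 1

def dual(lam):
    # convex dual of the constrained entropy problem: log Z(lambda) + lambda . m
    return logsumexp(-lam @ f) + np.log(dx) + lam @ m

res = minimize(dual, x0=np.zeros(2), method="BFGS")
p = np.exp(-res.x @ f)
p /= p.sum() * dx                   # normalized density on the grid
print("recovered moments:", round((x * p).sum() * dx, 3), round((x**2 * p).sum() * dx, 3))
```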

  2. On the distribution of interspecies correlation for Markov models of character evolution on Yule trees.

    PubMed

    Mulder, Willem H; Crawford, Forrest W

    2015-01-07

    Efforts to reconstruct phylogenetic trees and understand evolutionary processes depend fundamentally on stochastic models of speciation and mutation. The simplest continuous-time model for speciation in phylogenetic trees is the Yule process, in which new species are "born" from existing lineages at a constant rate. Recent work has illuminated some of the structural properties of Yule trees, but it remains mostly unknown how these properties affect sequence and trait patterns observed at the tips of the phylogenetic tree. Understanding the interplay between speciation and mutation under simple models of evolution is essential for deriving valid phylogenetic inference methods and gives insight into the optimal design of phylogenetic studies. In this work, we derive the probability distribution of interspecies covariance under Brownian motion and Ornstein-Uhlenbeck models of phenotypic change on a Yule tree. We compute the probability distribution of the number of mutations shared between two randomly chosen taxa in a Yule tree under discrete Markov mutation models. Our results suggest summary measures of phylogenetic information content, illuminate the correlation between site patterns in sequences or traits of related organisms, and provide heuristics for experimental design and reconstruction of phylogenetic trees. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Financial derivative pricing under probability operator via Esscher transfomation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achi, Godswill U., E-mail: achigods@yahoo.com

    2014-10-24

    The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. It was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. So, in this paper, we aim at using distortion operators by the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using the distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, a price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponent φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.

  4. A Statistical Framework for Microbial Source Attribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P; Allen, J E; Cunningham, C T

    2009-04-28

    This report presents a general approach to inferring transmission and source relationships among microbial isolates from their genetic sequences. The outbreak transmission graph (also called the transmission tree or transmission network) is the fundamental structure which determines the statistical distributions relevant to source attribution. The nodes of this graph are infected individuals or aggregated sub-populations of individuals in which transmitted bacteria or viruses undergo clonal expansion, leading to a genetically heterogeneous population. Each edge of the graph represents a transmission event in which one or a small number of bacteria or virions infects another node, thus increasing the size of the transmission network. Recombination and re-assortment events originate in nodes which are common to two distinct networks. In order to calculate the probability that one node was infected by another, given the observed genetic sequences of microbial isolates sampled from them, we require two fundamental probability distributions. The first is the probability of obtaining the observed mutational differences between two isolates given that they are separated by M steps in a transmission network. The second is the probability that two nodes sampled randomly from an outbreak transmission network are separated by M transmission events. We show how these distributions can be obtained from the genetic sequences of isolates obtained by sampling from past outbreaks combined with data from contact tracing studies. Realistic examples are drawn from the SARS outbreak of 2003, the FMDV outbreak in Great Britain in 2001, and HIV transmission cases. The likelihood estimators derived in this report, and the underlying probability distribution functions required to calculate them, possess certain compelling general properties in the context of microbial forensics. These include the ability to quantify the significance of a sequence 'match' or 'mismatch' between two isolates; the ability to capture non-intuitive effects of network structure on inferential power, including the 'small world' effect; the insensitivity of inferences to uncertainties in the underlying distributions; and the concept of rescaling, i.e. the ability to collapse sub-networks into single nodes and examine transmission inferences on the rescaled network.

  5. Introduction and Application of non-stationary Standardized Precipitation Index Considering Probability Distribution Function and Return Period

    NASA Astrophysics Data System (ADS)

    Park, J.; Lim, Y. J.; Sung, J. H.; Kang, H. S.

    2017-12-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process has been proposed. The results are evaluated for two severe drought cases during the last 10 years in South Korea. SPIs based on the non-stationary hypothesis indicated lower drought severity than the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the shape of the probability distribution function wider than before. This understanding implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
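
    For context, the sketch below shows a minimal stationary SPI calculation: precipitation accumulations are fitted with a gamma distribution and mapped through the standard normal quantile function. The data and the gamma fit are hypothetical, and the non-stationary variant described in the record, in which the distribution parameters (and hence the return period) change over time, is not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical monthly precipitation accumulations (mm); a real SPI uses
# observed totals over a chosen time scale (e.g., 3 or 12 months).
precip = rng.gamma(shape=2.0, scale=40.0, size=360)

# Stationary SPI: fit one gamma distribution to the whole record, then map
# each accumulation through the fitted CDF and the standard normal quantile.
a, loc, scale = stats.gamma.fit(precip, floc=0)
spi = stats.norm.ppf(stats.gamma.cdf(precip, a, loc=loc, scale=scale))
print(spi.mean(), spi.std())  # roughly 0 and 1 by construction

# A non-stationary SPI would instead let the gamma parameters vary with time
# (e.g., via regression on a trend), so the same rainfall can map to a
# different SPI value and return period in different years.
```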

  6. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    NASA Astrophysics Data System (ADS)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process was proposed. The results were evaluated for two severe drought cases during the last 10 years in South Korea. SPIs based on the non-stationary hypothesis indicated lower drought severity than the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the probability distribution wider than before. This implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.

  7. Effect of noise on defect chaos in a reaction-diffusion model.

    PubMed

    Wang, Hongli; Ouyang, Qi

    2005-06-01

    The influence of noise on defect chaos due to breakup of spiral waves through Doppler and Eckhaus instabilities is investigated numerically with a modified Fitzhugh-Nagumo model. By numerical simulations we show that the noise can drastically enhance the creation and annihilation rates of topological defects. The noise-free probability distribution function for defects in this model is found not to fit with the previously reported squared-Poisson distribution. Under the influence of noise, the distributions are flattened, and can fit with the squared-Poisson or the modified-Poisson distribution. The defect lifetime and diffusive property of defects under the influence of noise are also checked in this model.

  8. Probabilistic Integrated Assessment of "Dangerous" Climate Change

    NASA Astrophysics Data System (ADS)

    Mastrandrea, Michael D.; Schneider, Stephen H.

    2004-04-01

    Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for "dangerous anthropogenic interference with the climate system." We mapped a metric for this concept, based on the Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated, optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.

  9. A short note on the maximal point-biserial correlation under non-normality.

    PubMed

    Cheng, Ying; Liu, Haiyan

    2016-11-01

    The aim of this paper is to derive the maximal point-biserial correlation under non-normality. Several widely used non-normal distributions are considered, namely the uniform distribution, t-distribution, exponential distribution, and a mixture of two normal distributions. Results show that the maximal point-biserial correlation, depending on the non-normal continuous variable underlying the binary manifest variable, may not be a function of p (the probability that the dichotomous variable takes the value 1), can be symmetric or non-symmetric around p = .5, and may still lie in the range from -1.0 to 1.0. Therefore researchers should exercise caution when they interpret their sample point-biserial correlation coefficients based on popular beliefs that the maximal point-biserial correlation is always smaller than 1, and that the size of the correlation is always further restricted as p deviates from .5. © 2016 The British Psychological Society.

  10. Bayesian network models for error detection in radiotherapy plans

    NASA Astrophysics Data System (ADS)

    Kalet, Alan M.; Gennari, John H.; Ford, Eric C.; Phillips, Mark H.

    2015-04-01

    The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network’s conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology-based clinical information database system. These data represent 4990 unique prescription cases over a 5-year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts' performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures.
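
    The flagging logic described above can be illustrated with a toy conditional-probability query on a small joint table; this is not the authors' Hugin-trained network, and the variables, counts, and threshold below are invented.

```python
import numpy as np

# Toy joint distribution P(site, dose) learned from hypothetical plan counts.
sites = ["lung", "brain", "breast"]
doses = ["low", "mid", "high"]
counts = np.array([[120.0, 300.0, 30.0],
                   [ 40.0, 200.0, 10.0],
                   [ 60.0, 250.0, 20.0]])
joint = counts / counts.sum()                 # P(site, dose)

def p_dose_given_site(site, dose):
    i, j = sites.index(site), doses.index(dose)
    return joint[i, j] / joint[i, :].sum()    # P(dose | site)

# Flag a plan parameter whose conditional probability is suspiciously low.
threshold = 0.05
p = p_dose_given_site("brain", "high")
print(f"P(high dose | brain) = {p:.3f}",
      "-> flag for review" if p < threshold else "-> plausible")
```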

  11. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

    The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate, and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, the conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions had prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.

  12. Scaling properties and universality of first-passage-time probabilities in financial markets

    NASA Astrophysics Data System (ADS)

    Perelló, Josep; Gutiérrez-Roig, Mario; Masoliver, Jaume

    2011-12-01

    Financial markets provide an ideal frame for the study of crossing or first-passage time events of non-Gaussian correlated dynamics, mainly because large data sets are available. Tick-by-tick data of six futures markets are herein considered, resulting in fat-tailed first-passage time probabilities. The scaling of the return with its standard deviation collapses the probabilities of all markets examined (and also for different time horizons) into single curves, suggesting that first-passage statistics are market independent (at least for high-frequency data). On the other hand, a very closely related quantity, the survival probability, shows, away from the center and tails of the distribution, a hyperbolic t^(-1/2) decay typical of a Markovian dynamics, despite the existence of memory in markets. Modifications of the Weibull and Student distributions are good candidates for the phenomenological description of first-passage time properties under certain regimes. The scaling strategies shown may be useful for risk control and algorithmic trading.
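
    The survival-probability behaviour noted above can be checked with a minimal simulation of a memoryless process: for a plain Gaussian random walk, the probability that a barrier has not yet been crossed decays roughly as t^(-1/2). The walk parameters are arbitrary, and the example is not a model of the futures data in the record.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, barrier = 20000, 2000, 5.0

# Zero-mean Gaussian walk as a stand-in for de-trended, volatility-scaled returns.
paths = np.cumsum(rng.normal(0.0, 1.0, size=(n_paths, n_steps)), axis=1)

# First time each path reaches the barrier (n_steps if it never does).
crossed = paths >= barrier
fpt = np.where(crossed.any(axis=1), crossed.argmax(axis=1) + 1, n_steps)

# Empirical survival probability P(T > t); for a Markovian walk it decays
# roughly as t^(-1/2), so quadrupling t should roughly halve the survival.
for t in (50, 200, 800):
    print(t, (fpt > t).mean())
```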

  13. Disparity between online and offline tests in accelerated aging tests of LED lamps under electric stress.

    PubMed

    Wang, Yao; Jing, Lei; Ke, Hong-Liang; Hao, Jian; Gao, Qun; Wang, Xiao-Xun; Sun, Qiang; Xu, Zhi-Jun

    2016-09-20

    Accelerated aging tests under electric stress are conducted for one type of LED lamp, and the differences between online and offline tests of the degradation of luminous flux are studied in this paper. Switching between the two test modes is achieved with an adjustable AC voltage-stabilized power source. Experimental results show that the exponential fitting of the luminous flux degradation in online tests yields a higher goodness of fit for most lamps, and the degradation rate of the luminous flux by online tests is always lower than that by offline tests. Bayes estimation and the Weibull distribution are used to calculate the failure probabilities under the accelerated voltages, and then the reliability of the lamps under the rated voltage of 220 V is estimated by use of the inverse power law model. Results show that the relative error of the lifetime estimation by offline tests increases as the failure probability decreases, and it cannot be neglected when the failure probability is less than 1%. The relative errors of lifetime estimation are 7.9%, 5.8%, 4.2%, and 3.5%, at the failure probabilities of 0.1%, 1%, 5%, and 10%, respectively.
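
    A hedged sketch of this kind of reliability extrapolation is shown below: a Weibull distribution is fitted to hypothetical accelerated-test failure times and the characteristic life is rescaled to the rated voltage with an inverse power law. The failure times, voltages, and exponent are assumed values, not the paper's data, and the Bayes estimation step is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical failure times (hours) from an accelerated test at 260 V.
t_fail = rng.weibull(2.2, size=40) * 9000.0

# Weibull fit with the location fixed at zero; `scale` is the characteristic life.
shape, loc, scale = stats.weibull_min.fit(t_fail, floc=0)

# Inverse power law: life ~ V^(-n). The exponent n would normally be estimated
# from tests at several accelerated voltages; 5.0 is an assumed value here.
n = 5.0
scale_rated = scale * (260.0 / 220.0) ** n

# Time at which a given failure probability is reached at the rated 220 V.
for p in (0.001, 0.01, 0.10):
    print(p, stats.weibull_min.ppf(p, shape, loc=0, scale=scale_rated))
```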

  14. Long-distance quantum key distribution with imperfect devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo Piparo, Nicoló; Razavi, Mohsen

    2014-12-04

    Quantum key distribution over probabilistic quantum repeaters is addressed. We compare, under practical assumptions, two such schemes in terms of their secure key generation rate per memory, R_QKD. The two schemes under investigation are the one proposed by Duan et al. in [Nat. 414, 413 (2001)] and that of Sangouard et al. proposed in [Phys. Rev. A 76, 050301 (2007)]. We consider various sources of imperfections in the latter protocol, such as a nonzero double-photon probability for the source, dark count per pulse, channel loss and inefficiencies in photodetectors and memories, to find the rate for different nesting levels. We determine the maximum value of the double-photon probability beyond which it is not possible to share a secret key anymore. We find the crossover distance for up to three nesting levels. We finally compare the two protocols.

  15. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations do not provide a complete characterization of the predicted concentration. This is a serious drawback especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. Our contribution to fill this gap of knowledge is a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that under the hypothesis of statistical stationarity leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, and these are the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model with the spatial moments replacing the statistical moments can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds. Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and, for the first time, shows the superiority of the Beta model to both Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
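
    Because the Beta model is fully characterized by the first two concentration moments, it can be parameterized directly from a mean and a variance; the sketch below shows this moment matching with made-up numbers and is not the authors' transport model.

```python
from scipy import stats

def beta_from_moments(mean, var):
    """Beta(alpha, beta) matching the mean and variance of a concentration
    normalized to [0, 1]; requires var < mean * (1 - mean)."""
    k = mean * (1.0 - mean) / var - 1.0
    return mean * k, (1.0 - mean) * k

# Hypothetical normalized concentration moments from a transport calculation.
mean, var = 0.15, 0.01
a, b = beta_from_moments(mean, var)

# Probability that the normalized concentration exceeds a threshold, e.g. 0.4.
print(stats.beta.sf(0.4, a, b))
```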

  16. On the emergence of a generalised Gamma distribution. Application to traded volume in financial markets

    NASA Astrophysics Data System (ADS)

    Duarte Queirós, S. M.

    2005-08-01

    This letter reports on a stochastic dynamical scenario whose associated stationary probability density function is exactly a generalised form, with a power law instead of exponential decay, of the ubiquitous Gamma distribution. This generalisation, also known as the F-distribution, was first empirically proposed to fit high-frequency stock traded volume distributions in financial markets and verified in experiments with granular material. The dynamical assumption presented herein is based on local temporal fluctuations of the average value of the observable under study. This proposal is related to superstatistics and thus to the current nonextensive statistical mechanics framework. For the specific case of stock traded volume, we connect the local fluctuations in the mean stock traded volume with the typical herding behaviour presented by financial traders. Last of all, NASDAQ 1- and 2-minute stock traded volume sequences and probability density functions are numerically reproduced.

  17. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
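
    The gamma-Poisson construction at the heart of the NB process can be checked numerically: drawing a rate from a gamma distribution and then a Poisson count reproduces the negative binomial marginal. The parameter values below are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
r, p = 3.0, 0.4   # NB dispersion and probability parameters (illustrative)

# Gamma-Poisson mixture: draw a rate from a gamma, then a Poisson count.
rates = rng.gamma(shape=r, scale=p / (1.0 - p), size=200000)
counts = rng.poisson(rates)

# The marginal of the counts is negative binomial with parameters (r, p);
# SciPy's nbinom uses the complementary probability convention.
k = np.arange(6)
empirical = np.array([(counts == ki).mean() for ki in k])
exact = stats.nbinom.pmf(k, r, 1.0 - p)
print(np.round(empirical, 4))
print(np.round(exact, 4))
```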

  18. Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district

    NASA Astrophysics Data System (ADS)

    Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang

    2017-09-01

    Rainfall and reference crop evapotranspiration are random but mutually affected variables in the irrigation district, and their encounter situation can determine water shortage risks under the contexts of natural water supply and demand. However, in reality, the rainfall and reference crop evapotranspiration may have different marginal distributions and their relations are nonlinear. In this study, based on the annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration are developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, and the water shortage risk associated with meteorological drought (i.e. rainfall variability) is more prone to appear. Compared with other states, there are higher conditional joint probability and lower conditional return period in either low rainfall or high reference crop evapotranspiration. For a specifically high reference crop evapotranspiration with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration is increased with the decrease in frequency. For a specifically low rainfall with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration is decreased with the decrease in frequency. When either the high reference crop evapotranspiration exceeds a certain frequency or low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of various combinations likely cause a water shortage, but the water shortage is not severe.
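
    A minimal sketch of the copula-based encounter probability is shown below: given marginal non-exceedance probabilities for rainfall and reference crop evapotranspiration and a Frank copula parameter, the probability of the risky low-rainfall/high-evapotranspiration combination follows directly. The marginal probabilities and the dependence parameter are invented, not fitted to the Luhun data.

```python
import numpy as np

def frank_copula(u, v, theta):
    """Frank copula C(u, v) for theta != 0."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

# Marginal non-exceedance probabilities (hypothetical values for illustration).
u = 0.25      # P(rainfall <= r), i.e. a fairly dry year
v = 0.80      # P(ET0 <= e)
theta = -2.0  # negative dependence between rainfall and ET0 (assumed)

c = frank_copula(u, v, theta)
# Encounter probability of low rainfall AND high evapotranspiration:
# P(R <= r, E > e) = u - C(u, v); under independence it would be u * (1 - v).
print(u - c, u * (1.0 - v))
```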

  19. Technical Reports Prepared Under Contract N00014-76-C-0475.

    DTIC Science & Technology

    1987-05-29

    Technical Report No. / Title / Author / Date: 264, "Approximations to Densities in Geometric Probability," H. Solomon and M.A. Stephens, 10/27/78; 265, "Sequential ...," ...; "... Certain Multivariate Normal Probabilities," S. Iyengar, 8/12/82; 323, "EDF Statistics for Testing for the Gamma Distribution with ...," M.A. Stephens, 8/13/82; "... Nets," ..., ...-20-85; 360, "Random Sequential Coding By Hamming Distance," Yoshiaki Itoh and Herbert Solomon, 07-11-85; 361, "Transforming Censored Samples And Testing Fit"

  20. Probability elicitation to inform early health economic evaluations of new medical technologies: a case study in heart failure disease management.

    PubMed

    Cao, Qi; Postmus, Douwe; Hillege, Hans L; Buskens, Erik

    2013-06-01

    Early estimates of the commercial headroom available to a new medical device can assist producers of health technology in making appropriate product investment decisions. The purpose of this study was to illustrate how this quantity can be captured probabilistically by combining probability elicitation with early health economic modeling. The technology considered was a novel point-of-care testing device in heart failure disease management. First, we developed a continuous-time Markov model to represent the patients' disease progression under the current care setting. Next, we identified the model parameters that are likely to change after the introduction of the new device and interviewed three cardiologists to capture the probability distributions of these parameters. Finally, we obtained the probability distribution of the commercial headroom available per measurement by propagating the uncertainty in the model inputs to uncertainty in modeled outcomes. For a willingness-to-pay value of €10,000 per life-year, the median headroom available per measurement was €1.64 (interquartile range €0.05-€3.16) when the measurement frequency was assumed to be daily. In the subsequently conducted sensitivity analysis, this median value increased to a maximum of €57.70 for different combinations of the willingness-to-pay threshold and the measurement frequency. Probability elicitation can successfully be combined with early health economic modeling to obtain the probability distribution of the headroom available to a new medical technology. Subsequently feeding this distribution into a product investment evaluation method enables stakeholders to make more informed decisions regarding to which markets a currently available product prototype should be targeted. Copyright © 2013. Published by Elsevier Inc.

  1. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 Kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional. © 1991.
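
    The triangular-distribution summary statistics and the complete-independence case of the aggregation can be reproduced in a few lines; the sketch below is a Python illustration of the arithmetic, not the TRIAGG Turbo Pascal system, and the province estimates are invented.

```python
import numpy as np
from scipy import stats

def triangular_summary(minimum, mode, maximum):
    """Mean, standard deviation, and selected fractiles of a triangular
    distribution specified by (minimum, most likely, maximum)."""
    a, c, b = minimum, mode, maximum
    mean = (a + b + c) / 3.0
    var = (a * a + b * b + c * c - a * b - a * c - b * c) / 18.0
    dist = stats.triang((c - a) / (b - a), loc=a, scale=b - a)
    # F95 is the value exceeded with probability 0.95, i.e. the 5th percentile.
    fractiles = {f"F{round(100 * (1 - q))}": dist.ppf(q)
                 for q in (0.05, 0.25, 0.5, 0.75, 0.95)}
    return mean, np.sqrt(var), fractiles

# Two hypothetical provinces: (minimum, most likely, maximum) resource estimates.
provinces = [(0.0, 2.0, 10.0), (1.0, 3.0, 6.0)]
summaries = [triangular_summary(*p) for p in provinces]
print(summaries[0][2])  # fractiles of the first province

# Aggregation under complete independence: means add and variances add.
# (Under perfect positive correlation the standard deviations would add instead.)
agg_mean = sum(m for m, s, f in summaries)
agg_sd = np.sqrt(sum(s * s for m, s, f in summaries))
print(agg_mean, agg_sd)
```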

  2. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of an important geotechnical parameter, the compression modulus Es, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. Spatial prior multivariate probability density functions (PDFs) and likelihood PDFs of the CPT positions and of the potential value at the prediction point were built from borehole experiments; then, by numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated in the Bayesian reverse interpolation framework. The results were compared between Gaussian Sequential Stochastic Simulation and the Bayesian method. The differences between single CPT samplings under a normal distribution and the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.

  3. Ensemble Kalman filtering in presence of inequality constraints

    NASA Astrophysics Data System (ADS)

    van Leeuwen, P. J.

    2009-04-01

    Kalman filtering in the presence of constraints is an active area of research. Based on the Gaussian assumption for the probability-density functions, it is hard to bring extra constraints into the formalism. On the other hand, in geophysical systems we often encounter constraints related to e.g. the underlying physics or chemistry, which are violated by the Gaussian assumption. For instance, concentrations are always non-negative, model layers have non-negative thickness, and sea-ice concentration is between 0 and 1. Several methods to bring inequality constraints into the Kalman-filter formalism have been proposed. One of them is probability density function (pdf) truncation, in which the Gaussian mass from the non-allowed part of the variables is just equally distributed over the pdf where the variables are allowed, as proposed by Shimada et al. 1998. However, a problem with this method is that the probability that e.g. the sea-ice concentration is zero, is zero! The new method proposed here does not have this drawback. It assumes that the probability-density function is a truncated Gaussian, but the truncated mass is not distributed equally over all allowed values of the variables; it is put into a delta distribution at the truncation point. This delta distribution can easily be handled within Bayes' theorem, leading to posterior probability density functions that are also truncated Gaussians with delta distributions at the truncation location. In this way a much better representation of the system is obtained, while still keeping most of the benefits of the Kalman-filter formalism. The full Kalman-filter formalism is prohibitively expensive in large-scale systems, but efficient implementation is possible in ensemble variants of the Kalman filter. Applications to low-dimensional systems and large-scale systems will be discussed.
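
    The key idea, a truncated Gaussian plus a delta distribution carrying the truncated mass, can be illustrated for a single bounded variable. The sketch below uses hypothetical filter moments and is only a schematic of that analysis step, not the ensemble implementation.

```python
import numpy as np
from scipy import stats

# One non-negative variable (e.g. sea-ice concentration bounded below by 0)
# whose unconstrained Kalman analysis is Gaussian with these (assumed) moments.
mu, sigma = -0.3, 0.5

# Mass that the Gaussian places on the forbidden region x < 0 ...
mass_below = stats.norm.cdf(0.0, loc=mu, scale=sigma)

# ... is put into a delta distribution at the bound, so P(x == 0) is finite
# instead of zero, which is the drawback of plain pdf truncation.
p_exactly_zero = mass_below
mean_positive = stats.truncnorm.mean((0.0 - mu) / sigma, np.inf, loc=mu, scale=sigma)

# Mean of the constrained posterior: delta at 0 plus the truncated-Gaussian part.
posterior_mean = p_exactly_zero * 0.0 + (1.0 - p_exactly_zero) * mean_positive
print(p_exactly_zero, posterior_mean)
```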

  4. Probability density of aperture-averaged irradiance fluctuations for long range free space optical communication links.

    PubMed

    Lyke, Stephen D; Voelz, David G; Roggemann, Michael C

    2009-11-20

    The probability density function (PDF) of aperture-averaged irradiance fluctuations is calculated from wave-optics simulations of a laser after propagating through atmospheric turbulence to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes from weak to strong. Results show that under weak scintillation conditions both the gamma-gamma and lognormal PDF models provide a good fit to the simulation data for all aperture sizes studied. Our results indicate that in moderate scintillation the gamma-gamma PDF provides a better fit to the simulation data than the lognormal PDF for all aperture sizes studied. In the strong scintillation regime, the simulation data distribution is gamma-gamma for aperture sizes much smaller than the coherence radius ρ0 and lognormal for aperture sizes on the order of ρ0 and larger. Examples of how these results affect the bit-error rate of an on-off keyed free space optical communication link are presented.

  5. Performance evaluation and bias correction of DBS measurements for a 1290-MHz boundary layer profiler.

    PubMed

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2018-02-01

    Recently, the government installed a boundary layer profiler (BLP), operated in the Doppler beam swinging mode, in a coastal area of China to acquire useful wind field information in the atmospheric boundary layer for several purposes. The performance of the BLP is evaluated under strong wind conditions. It is found that, even though the quality-controlled BLP data show good agreement with the balloon observations, a systematic bias can always be found in the BLP data. For low wind velocities, the BLP data tend to overestimate the atmospheric wind; however, as the wind velocity increases, the BLP data show a tendency toward underestimation. In order to remove the effect of poor-quality data on the bias correction, the probability distribution function of the differences between the two instruments is discussed, and it is found that the t location-scale distribution is the most suitable probability model compared to other probability models. After the outliers with a large discrepancy, which lie outside the 95% confidence interval of the t location-scale distribution, are discarded, the systematic bias can be successfully corrected using a first-order polynomial correction function. The bias-correction methodology used in this study can not only serve as a reference for correcting other wind profiling radars, but also lays a solid basis for further analysis of the wind profiles.

  6. Performance evaluation and bias correction of DBS measurements for a 1290-MHz boundary layer profiler

    NASA Astrophysics Data System (ADS)

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2018-02-01

    Recently, the government installed a boundary layer profiler (BLP), operated in the Doppler beam swinging mode, in a coastal area of China to acquire useful wind field information in the atmospheric boundary layer for several purposes. The performance of the BLP is evaluated under strong wind conditions. It is found that, even though the quality-controlled BLP data show good agreement with the balloon observations, a systematic bias can always be found in the BLP data. For low wind velocities, the BLP data tend to overestimate the atmospheric wind; however, as the wind velocity increases, the BLP data show a tendency toward underestimation. In order to remove the effect of poor-quality data on the bias correction, the probability distribution function of the differences between the two instruments is discussed, and it is found that the t location-scale distribution is the most suitable probability model compared to other probability models. After the outliers with a large discrepancy, which lie outside the 95% confidence interval of the t location-scale distribution, are discarded, the systematic bias can be successfully corrected using a first-order polynomial correction function. The bias-correction methodology used in this study can not only serve as a reference for correcting other wind profiling radars, but also lays a solid basis for further analysis of the wind profiles.

  7. Zipf's law and the effect of ranking on probability distributions

    NASA Astrophysics Data System (ADS)

    Günther, R.; Levitin, L.; Schapiro, B.; Wagner, P.

    1996-02-01

    Ranking procedures are widely used in the description of many different types of complex systems. Zipf's law is one of the most remarkable frequency-rank relationships and has been observed independently in physics, linguistics, biology, demography, etc. We show that ranking plays a crucial role in making it possible to detect empirical relationships in systems that exist in one realization only, even when the statistical ensemble to which the systems belong has a very broad probability distribution. Analytical results and numerical simulations are presented which clarify the relations between the probability distributions and the behavior of expected values for unranked and ranked random variables. This analysis is performed, in particular, for the evolutionary model presented in our previous papers which leads to Zipf's law and reveals the underlying mechanism of this phenomenon in terms of a system with interdependent and interacting components as opposed to the “ideal gas” models suggested by previous researchers. The ranking procedure applied to this model leads to a new, unexpected phenomenon: a characteristic “staircase” behavior of the mean values of the ranked variables (ranked occupation numbers). This result is due to the broadness of the probability distributions for the occupation numbers and does not follow from the “ideal gas” model. Thus, it provides an opportunity, by comparison with empirical data, to obtain evidence as to which model relates to reality.

  8. Distribution of apparent activation energy counterparts during thermo - And thermo-oxidative degradation of Aronia melanocarpa (black chokeberry).

    PubMed

    Janković, Bojan; Marinović-Cincović, Milena; Janković, Marija

    2017-09-01

    The kinetics of degradation of Aronia melanocarpa fresh fruits in argon and air atmospheres were investigated. The investigation was based on probability distributions of apparent activation energy counterparts (εa). Isoconversional analysis results indicated that the degradation process in an inert atmosphere was governed by decomposition reactions of esterified compounds. Also, based on the same kinetics approach, it was assumed that in an air atmosphere, the primary compound in degradation pathways could be anthocyanins, which undergo rapid chemical reactions. A new model of reactivity demonstrated that, under inert atmospheres, expectation values for εa occurred at levels of statistical probability. These values corresponded to decomposition processes in which polyphenolic compounds might be involved. The εa values obeyed the binomial distribution. It was established that, for thermo-oxidative degradation, the Poisson distribution represented a very successful approximation for εa values where there was additional mechanistic complexity and the binomial distribution was no longer valid. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Population dynamical behavior of Lotka-Volterra system under regime switching

    NASA Astrophysics Data System (ADS)

    Li, Xiaoyue; Jiang, Daqing; Mao, Xuerong

    2009-10-01

    In this paper, we investigate a stochastic Lotka-Volterra system under regime switching, driven by a standard Brownian motion B(t). The aim here is to find out what happens under regime switching. We first obtain the sufficient conditions for the existence of global positive solutions, stochastic permanence and extinction. We find out that both stochastic permanence and extinction have close relationships with the stationary probability distribution of the Markov chain. The limit of the average in time of the sample path of the solution is then estimated by two constants related to the stationary distribution and the coefficients. Finally, the main results are illustrated by several examples.

  10. Combining Probability Distributions of Wind Waves and Sea Level Variations to Assess Return Periods of Coastal Floods

    NASA Astrophysics Data System (ADS)

    Leijala, U.; Bjorkqvist, J. V.; Pellikka, H.; Johansson, M. M.; Kahma, K. K.

    2017-12-01

    Predicting the behaviour of the joint effect of sea level and wind waves is of great significance due to the major impact of flooding events in densely populated coastal regions. As mean sea level rises, the effect of sea level variations accompanied by the waves will be even more harmful in the future. The main challenge when evaluating the effect of waves and sea level variations is that long time series of both variables rarely exist. Wave statistics are also highly location-dependent, thus requiring wave buoy measurements and/or high-resolution wave modelling. As an initial approximation of the joint effect, the variables may be treated as independent random variables, to obtain the probability distribution of their sum. We present results of a case study based on three probability distributions: 1) wave run-up constructed from individual wave buoy measurements, 2) short-term sea level variability based on tide gauge data, and 3) mean sea level projections based on up-to-date regional scenarios. The wave measurements were conducted during 2012-2014 on the coast of the city of Helsinki, located in the Gulf of Finland in the Baltic Sea. The short-term sea level distribution contains the last 30 years (1986-2015) of hourly data from the Helsinki tide gauge, and the mean sea level projections are scenarios adjusted for the Gulf of Finland. Additionally, we present a sensitivity test based on six different theoretical wave height distributions representing different wave behaviour in relation to sea level variations. As these wave distributions are merged with one common sea level distribution, we can study how the different shapes of the wave height distribution affect the distribution of the sum, and which of the components dominates under different wave conditions. As an outcome of the method, we obtain a probability distribution of the maximum elevation of the continuous water mass, which provides a flexible tool for evaluating different risk levels in the current and future climate.
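
    Treating the components as independent, the distribution of their sum is the convolution of the individual densities; the sketch below convolves two made-up discretized densities and reads off an exceedance probability. The shapes, grid, and threshold are illustrative only and do not correspond to the Helsinki data.

```python
import numpy as np

# Discretized densities of short-term sea level (cm) and wave run-up (cm)
# on a common grid; both shapes are purely illustrative.
dx = 1.0
x = np.arange(0, 301, dx)
sea_level = np.exp(-0.5 * ((x - 80) / 30.0) ** 2)        # roughly Gaussian
wave_runup = np.where(x < 150, np.exp(-x / 40.0), 0.0)   # skewed, bounded
sea_level /= np.trapz(sea_level, x)
wave_runup /= np.trapz(wave_runup, x)

# For independent components, the density of the sum is the convolution.
total = np.convolve(sea_level, wave_runup) * dx
x_total = np.arange(len(total)) * dx

# Exceedance probability of a given total water level, e.g. 200 cm.
mask = x_total >= 200
print(np.trapz(total[mask], x_total[mask]))
```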

  11. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    NASA Astrophysics Data System (ADS)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-07-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. Regardless of the method employed, comparisons between each method revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of -9.0%, -21%, -8.6%, 17.8%, 3.6%, and -2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites, where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.

  12. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    USGS Publications Warehouse

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. Regardless of the method employed, comparisons between each method revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of −9.0%, −21%, −8.6%, 17.8%, 3.6%, and −2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites, where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.

  13. Valid statistical inference methods for a case-control study with missing data.

    PubMed

    Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun

    2018-04-01

    The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independency under the two existing sampling distributions could be completely different (even contradictory) from the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.

  14. Sampling--how big a sample?

    PubMed

    Aitken, C G

    1999-07-01

    It is thought that, in a consignment of discrete units, a certain proportion of the units contain illegal material. A sample of the consignment is to be inspected. Various methods for the determination of the sample size are compared. The consignment will be considered as a random sample from some super-population of units, a certain proportion of which contain drugs. For large consignments, a probability distribution, known as the beta distribution, for the proportion of the consignment which contains illegal material is obtained. This distribution is based on prior beliefs about the proportion. Under certain specific conditions the beta distribution gives the same numerical results as an approach based on the binomial distribution. The binomial distribution provides a probability for the number of units in a sample which contain illegal material, conditional on knowing the proportion of the consignment which contains illegal material. This is in contrast to the beta distribution which provides probabilities for the proportion of a consignment which contains illegal material, conditional on knowing the number of units in the sample which contain illegal material. The interpretation when the beta distribution is used is much more intuitively satisfactory. It is also much more flexible in its ability to cater for prior beliefs which may vary given the different circumstances of different crimes. For small consignments, a distribution, known as the beta-binomial distribution, for the number of units in the consignment which are found to contain illegal material, is obtained, based on prior beliefs about the number of units in the consignment which are thought to contain illegal material. As with the beta and binomial distributions for large samples, it is shown that, in certain specific conditions, the beta-binomial and hypergeometric distributions give the same numerical results. However, the beta-binomial distribution, as with the beta distribution, has a more intuitively satisfactory interpretation and greater flexibility. The beta and the beta-binomial distributions provide methods for the determination of the minimum sample size to be taken from a consignment in order to satisfy a certain criterion. The criterion requires the specification of a proportion and a probability.
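
    The sample-size criterion described above can be evaluated directly from the beta posterior; the sketch below finds the smallest sample such that, if every inspected unit contains illegal material, the posterior probability that more than a proportion p0 of the consignment contains illegal material reaches a chosen level. The uniform Beta(1, 1) prior and the numbers are illustrative choices.

```python
from scipy import stats

def min_sample_size(p0, target, a=1.0, b=1.0, n_max=1000):
    """Smallest n such that, if all n inspected units contain illegal material,
    the posterior probability that the consignment proportion exceeds p0 is at
    least `target`, starting from a Beta(a, b) prior."""
    for n in range(1, n_max + 1):
        # Posterior after n positives and 0 negatives: Beta(a + n, b).
        if stats.beta.sf(p0, a + n, b) >= target:
            return n
    raise ValueError("no n <= n_max satisfies the criterion")

# Be 95% sure that more than half the consignment contains illegal material.
print(min_sample_size(p0=0.5, target=0.95))   # -> 4 with a uniform prior
```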

  15. Tree mortality estimates and species distribution probabilities in southeastern United States forests

    Treesearch

    Martin A. Spetich; Zhaofei Fan; Zhen Sui; Michael Crosby; Hong S. He; Stephen R. Shifley; Theodor D. Leininger; W. Keith Moser

    2017-01-01

    Stresses to trees under a changing climate can lead to changes in forest tree survival, mortality and distribution. For instance, a study examining the effects of human-induced climate change on forest biodiversity by Hansen and others (2001) predicted a 32% reduction in loblolly–shortleaf pine habitat across the eastern United States. However, they also...

  16. TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  17. TU-AB-BRB-03: Coverage-Based Treatment Planning to Accommodate Organ Deformable Motions and Contouring Uncertainties for Prostate Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  18. TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unkelbach, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  19. TU-AB-BRB-00: New Methods to Ensure Target Coverage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  20. Distributed Optimal Dispatch of Distributed Energy Resources Over Lossy Communication Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Junfeng; Yang, Tao; Wu, Di

    In this paper, we consider the economic dispatch problem (EDP), where a cost function that is assumed to be strictly convex is assigned to each of distributed energy resources (DERs), over packet dropping networks. The goal of a standard EDP is to minimize the total generation cost while meeting total demand and satisfying individual generator output limit. We propose a distributed algorithm for solving the EDP over networks. The proposed algorithm is resilient against packet drops over communication links. Under the assumption that the underlying communication network is strongly connected with a positive probability and the packet drops are independentmore » and identically distributed (i.i.d.), we show that the proposed algorithm is able to solve the EDP. Numerical simulation results are used to validate and illustrate the main results of the paper.« less
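
    For context, the optimality condition that such dispatch algorithms drive toward is that all unconstrained DERs operate at a common incremental cost. The sketch below (quadratic costs and all coefficients are made up; it solves the EDP centrally by bisection, not with the packet-drop-resilient distributed algorithm of the record) illustrates that condition.

      import numpy as np

      # Quadratic generation costs C_i(p) = a_i * p**2 + b_i * p (coefficients are hypothetical).
      a = np.array([0.04, 0.03, 0.05])
      b = np.array([2.0, 3.0, 1.5])
      p_min = np.array([10.0, 10.0, 5.0])
      p_max = np.array([100.0, 80.0, 60.0])
      demand = 180.0

      def dispatch(lam):
          """Output of each DER at incremental cost lam, clipped to its generation limits."""
          return np.clip((lam - b) / (2.0 * a), p_min, p_max)

      lo, hi = 0.0, 50.0
      for _ in range(100):                      # bisection on the common incremental cost
          lam = 0.5 * (lo + hi)
          if dispatch(lam).sum() < demand:
              lo = lam
          else:
              hi = lam

      p = dispatch(lam)
      print("dispatch:", p.round(2), "total:", round(float(p.sum()), 2))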

  1. Minimum relative entropy distributions with a large mean are Gaussian

    NASA Astrophysics Data System (ADS)

    Smerlak, Matteo

    2016-12-01

    Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q , find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that mean (p ) is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H -type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
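
    For reference, the standard form of the solution to this constrained problem is an exponential tilting of the prior; the record's result concerns the shape of this tilted density when the constrained mean is large:

      \[
        p^{*}(x) \;=\; \frac{q(x)\, e^{\lambda x}}{Z(\lambda)},
        \qquad
        Z(\lambda) = \int q(x)\, e^{\lambda x}\, dx,
        \qquad
        \frac{d}{d\lambda} \ln Z(\lambda) = \mathrm{mean}(p^{*}).
      \]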

  2. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  3. Neural Mechanisms for Integrating Prior Knowledge and Likelihood in Value-Based Probabilistic Inference

    PubMed Central

    Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.

    2015-01-01

    In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
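
    A standard textbook instance of this prior-likelihood weighting (not taken from the study; shown only to make the sample-size effect concrete) uses a Beta(α, β) prior on the reward probability θ and k rewards observed in n trials:

      \[
        \mathbb{E}[\theta \mid k, n]
        = \frac{\alpha + k}{\alpha + \beta + n}
        = w \cdot \frac{\alpha}{\alpha + \beta} + (1 - w) \cdot \frac{k}{n},
        \qquad
        w = \frac{\alpha + \beta}{\alpha + \beta + n},
      \]

    so the weight on the likelihood (the sample proportion k/n) grows as the sample size n increases, consistent with the behavioral result described above.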

  4. Modeling nonbreeding distributions of shorebirds and waterfowl in response to climate change

    USGS Publications Warehouse

    Reese, Gordon; Skagen, Susan K.

    2017-01-01

    To identify areas on the landscape that may contribute to a robust network of conservation areas, we modeled the probabilities of occurrence of several en route migratory shorebirds and wintering waterfowl in the southern Great Plains of North America, including responses to changing climate. We predominantly used data from the eBird citizen-science project to model probabilities of occurrence relative to land-use patterns, spatial distribution of wetlands, and climate. We projected models to potential future climate conditions using five representative general circulation models of the Coupled Model Intercomparison Project 5 (CMIP5). We used Random Forests to model probabilities of occurrence and compared the time periods 1981–2010 (hindcast) and 2041–2070 (forecast) in “model space.” Projected changes in shorebird probabilities of occurrence varied with species-specific general distribution pattern, migration distance, and spatial extent. Species using the western and northern portion of the study area exhibited the greatest likelihoods of decline, whereas species with more easterly occurrences, mostly long-distance migrants, had the greatest projected increases in probability of occurrence. At an ecoregional extent, differences in probabilities of shorebird occurrence ranged from −0.015 to 0.045 when averaged across climate models, with the largest increases occurring early in migration. Spatial shifts are predicted for several shorebird species. Probabilities of occurrence of wintering Mallards and Northern Pintail are predicted to increase by 0.046 and 0.061, respectively, with northward shifts projected for both species. When incorporated into partner land management decision tools, results at ecoregional extents can be used to identify wetland complexes with the greatest potential to support birds in the nonbreeding season under a wide range of future climate scenarios.
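
    A minimal sketch of the probability-of-occurrence modeling step (synthetic covariates stand in for the land-use, wetland, and climate predictors; the covariate shift and all numbers are assumptions, not the study's eBird data):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(1)

      # Hypothetical predictor matrix: columns stand in for land use, wetland density, climate covariates.
      X = rng.normal(size=(2000, 4))
      # Synthetic presence/absence labels; in the study these would come from eBird checklists.
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

      rf = RandomForestClassifier(n_estimators=500, random_state=0)
      rf.fit(X, y)

      # Change in probability of occurrence under a "future climate" covariate shift (purely illustrative).
      X_future = X.copy()
      X_future[:, 3] += 1.0
      delta = rf.predict_proba(X_future)[:, 1] - rf.predict_proba(X)[:, 1]
      print("mean change in probability of occurrence:", round(float(delta.mean()), 3))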

  5. Precipitation Cluster Distributions: Current Climate Storm Statistics and Projected Changes Under Global Warming

    NASA Astrophysics Data System (ADS)

    Quinn, Kevin Martin

    The total amount of precipitation integrated across a precipitation cluster (contiguous precipitating grid cells exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance, expressed as the rate of water mass lost or latent heat released, i.e. the power of the disturbance. Probability distributions of cluster power are examined during boreal summer (May-September) and winter (January-March) using satellite-retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) 3B42 and Special Sensor Microwave Imager and Sounder (SSM/I and SSMIS) programs, model output from the High Resolution Atmospheric Model (HIRAM, roughly 0.25-0.5° resolution), seven 1-2° resolution members of the Coupled Model Intercomparison Project Phase 5 (CMIP5) experiment, and National Center for Atmospheric Research Large Ensemble (NCAR LENS). Spatial distributions of precipitation-weighted centroids are also investigated in observations (TRMM-3B42) and climate models during winter as a metric for changes in mid-latitude storm tracks. Observed probability distributions for both seasons are scale-free from the smallest clusters up to a cutoff scale at high cluster power, after which the probability density drops rapidly. When low rain rates are excluded by choosing a minimum rain rate threshold in defining clusters, the models accurately reproduce observed cluster power statistics and winter storm tracks. Changes in behavior in the tail of the distribution, above the cutoff, are important for impacts since these quantify the frequency of the most powerful storms. End-of-century cluster power distributions and storm track locations are investigated in these models under a "business as usual" global warming scenario. The probability of high cluster power events increases by end-of-century across all models, by up to an order of magnitude for the highest-power events for which statistics can be computed. For the three models in the suite with continuous time series of high resolution output, there is substantial variability on when these probability increases for the most powerful precipitation clusters become detectable, ranging from detectable within the observational period to statistically significant trends emerging only after 2050. A similar analysis of National Centers for Environmental Prediction (NCEP) Reanalysis 2 and SSM/I-SSMIS rain rate retrievals in the recent observational record does not yield reliable evidence of trends in high-power cluster probabilities at this time. Large impacts to mid-latitude storm tracks are projected over the West Coast and eastern North America, with no less than 8 of the 9 models examined showing large increases by end-of-century in the probability density of the most powerful storms, ranging up to a factor of 6.5 in the highest range bin for which historical statistics are computed. However, within these regional domains, there is considerable variation among models in pinpointing exactly where the largest increases will occur.
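
    A minimal sketch of the cluster-power calculation (a synthetic rain-rate field stands in for the satellite and model fields; the threshold and grid are assumptions): contiguous grid cells above the minimum rain rate are labeled, and the rain rate is summed over each cluster.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(2)

      # Synthetic rain-rate field (mm/h); real inputs would be TRMM-3B42 or SSM/I-SSMIS retrievals.
      rain = rng.gamma(shape=0.15, scale=8.0, size=(200, 200))
      threshold = 0.5                      # minimum rain rate defining a cluster (assumed)

      labels, n_clusters = ndimage.label(rain > threshold)
      # "Cluster power": rain rate integrated over each contiguous cluster.
      cluster_power = ndimage.sum(rain, labels, index=np.arange(1, n_clusters + 1))

      # Empirical survival function of cluster power; on log-log axes this exposes the
      # scale-free range and the cutoff at high cluster power.
      power_sorted = np.sort(cluster_power)[::-1]
      survival = np.arange(1, power_sorted.size + 1) / power_sorted.size
      print(n_clusters, power_sorted[:3])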

  6. Changes in tropical precipitation cluster size distributions under global warming

    NASA Astrophysics Data System (ADS)

    Neelin, J. D.; Quinn, K. M.

    2016-12-01

    The total amount of precipitation integrated across a tropical storm or other precipitation feature (contiguous clusters of precipitation exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance. To establish baseline behavior in current climate, the probability distribution of cluster sizes from multiple satellite retrievals and National Center for Environmental Prediction (NCEP) reanalysis is compared to those from Coupled Model Intercomparison Project (CMIP5) models and the Geophysical Fluid Dynamics Laboratory high-resolution atmospheric model (HIRAM-360 and -180). With the caveat that a minimum rain rate threshold is important in the models (which tend to overproduce low rain rates), the models agree well with observations in leading properties. In particular, scale-free power law ranges in which the probability drops slowly with increasing cluster size are well modeled, followed by a rapid drop in probability of the largest clusters above a cutoff scale. Under the RCP 8.5 global warming scenario, the models indicate substantial increases in probability (up to an order of magnitude) of the largest clusters by the end of century. For models with continuous time series of high resolution output, there is substantial spread on when these probability increases for the largest precipitation clusters should be detectable, ranging from detectable within the observational period to statistically significant trends emerging only in the second half of the century. Examination of NCEP reanalysis and SSMI/SSMIS series of satellite retrievals from 1979 to present does not yield reliable evidence of trends at this time. The results suggest improvements in inter-satellite calibration of the SSMI/SSMIS retrievals could aid future detection.

  7. Reliability estimation of a N- M-cold-standby redundancy system in a multicomponent stress-strength model with generalized half-logistic distribution

    NASA Astrophysics Data System (ADS)

    Liu, Yiming; Shi, Yimin; Bai, Xuchao; Zhan, Pei

    2018-01-01

    In this paper, we study the estimation of the reliability of a multicomponent system, named the N-M-cold-standby redundancy system, based on a progressively Type-II censored sample. In the system, there are N subsystems consisting of M statistically independently distributed strength components, and only one of these subsystems works under the impact of stresses at a time while the others remain as standbys. Whenever the working subsystem fails, one from the standbys takes its place. The system fails when all of the subsystems have failed. It is supposed that the underlying distributions of random strength and stress both belong to the generalized half-logistic distribution with different shape parameters. The reliability of the system is estimated by using both classical and Bayesian statistical inference. The uniformly minimum variance unbiased estimator and the maximum likelihood estimator for the reliability of the system are derived. Under a squared error loss function, the exact expression of the Bayes estimator for the reliability of the system is developed by using the Gauss hypergeometric function. The asymptotic confidence interval and corresponding coverage probabilities are derived based on both the Fisher and the observed information matrices. The approximate highest probability density credible interval is constructed by using a Monte Carlo method. Monte Carlo simulations are performed to compare the performances of the proposed reliability estimators. A real data set is also analyzed for an illustration of the findings.

  8. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

    We propose a probabilistic face recognition algorithm for access control systems (ACSs). Compared with existing ACSs that use low-cost IC cards, face recognition has usability and security advantages: it does not require people to hold cards over scanners, and it does not accept impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC cards. However, price competition matters in markets for low-cost ACSs, which limits the quality of the available cameras and image control. ACSs using face recognition must therefore handle much lower-quality images, such as defocused and poorly gain-controlled images, than high-security systems such as immigration control. To tackle such image-quality problems, we developed a face recognition algorithm based on a probabilistic model that combines a variety of image-difference features trained by Real AdaBoost with their prior probability distributions. This makes it possible to evaluate and use only the reliable features among the trained ones during each authentication, and to achieve high recognition rates. A field evaluation using a pseudo access control system installed in our office shows that the proposed system achieves a consistently high recognition rate independent of face image quality, with an EER (equal error rate) about four times lower under a variety of image conditions than that of a system without prior probability distributions. In contrast, image-difference features used without prior probabilities are sensitive to image quality. We also evaluated PCA, which has worse but constant performance because it is optimized generally over the whole dataset. Compared with PCA, Real AdaBoost without prior distributions performs twice as well under good image conditions, but degrades to PCA-level performance under poor image conditions.

  9. Software Supportability Risk Assessment in OT&E (Operational Test and Evaluation): Literature Review, Current Research Review, and Data Base Assemblage.

    DTIC Science & Technology

    1984-09-28

    variables before simulation of model - Search for reality checks - Express uncertainty as a probability density distribution. ... probability that the software contains errors. This prior is updated as test failure data are accumulated. Only a p of 1 (software known to contain... discussed; both parametric and nonparametric versions are presented. It is shown by the author that the bootstrap underlies the jackknife method and

  10. Kolmogorov-Smirnov test for spatially correlated data

    USGS Publications Warehouse

    Olea, R.A.; Pawlowsky-Glahn, V.

    2009-01-01

    The Kolmogorov-Smirnov test is a convenient method for investigating whether two underlying univariate probability distributions can be regarded as undistinguishable from each other or whether an underlying probability distribution differs from a hypothesized distribution. Application of the test requires that the sample be unbiased and the outcomes be independent and identically distributed, conditions that are violated in several degrees by spatially continuous attributes, such as topographical elevation. A generalized form of the bootstrap method is used here for the purpose of modeling the distribution of the statistic D of the Kolmogorov-Smirnov test. The innovation is in the resampling, which in the traditional formulation of bootstrap is done by drawing from the empirical sample with replacement presuming independence. The generalization consists of preparing resamplings with the same spatial correlation as the empirical sample. This is accomplished by reading the value of unconditional stochastic realizations at the sampling locations, realizations that are generated by simulated annealing. The new approach was tested by two empirical samples taken from an exhaustive sample closely following a lognormal distribution. One sample was a regular, unbiased sample while the other one was a clustered, preferential sample that had to be preprocessed. Our results show that the p-value for the spatially correlated case is always larger that the p-value of the statistic in the absence of spatial correlation, which is in agreement with the fact that the information content of an uncorrelated sample is larger than the one for a spatially correlated sample of the same size. ?? Springer-Verlag 2008.

  11. A missing dimension in measures of vaccination impacts

    USGS Publications Warehouse

    Gomes, M. Gabriela M.; Lipsitch, Marc; Wargo, Andrew R.; Kurath, Gael; Rebelo, Carlota; Medley, Graham F.; Coutinho, Antonio

    2013-01-01

    Immunological protection, acquired from either natural infection or vaccination, varies among hosts, reflecting underlying biological variation and affecting population-level protection. Owing to the nature of resistance mechanisms, distributions of susceptibility and protection entangle with pathogen dose in a way that can be decoupled by adequately representing the dose dimension. Any infectious processes must depend in some fashion on dose, and empirical evidence exists for an effect of exposure dose on the probability of transmission to mumps-vaccinated hosts [1], the case-fatality ratio of measles [2], and the probability of infection and, given infection, of symptoms in cholera [3]. Extreme distributions of vaccine protection have been termed leaky (partially protects all hosts) and all-or-nothing (totally protects a proportion of hosts) [4]. These distributions can be distinguished in vaccine field trials from the time dependence of infections [5]. Frailty mixing models have also been proposed to estimate the distribution of protection from time to event data [6], [7], although the results are not comparable across regions unless there is explicit control for baseline transmission [8]. Distributions of host susceptibility and acquired protection can be estimated from dose-response data generated under controlled experimental conditions [9]–[11] and natural settings [12], [13]. These distributions can guide research on mechanisms of protection, as well as enable model validity across the entire range of transmission intensities. We argue for a shift to a dose-dimension paradigm in infectious disease science and community health.

  12. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data for the Bohai Sea are hindcast and sampled for a case study. Four kinds of distributions, namely, the Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely, the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and of the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters close to the actual sea state for ocean platform design.
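
    A minimal sketch of the copula construction for the Gumbel-Hougaard case (the gamma marginals, dependence parameter, design levels, and the annual-maxima interpretation of the return period are assumptions, not the Bohai Sea fits):

      import numpy as np
      from scipy import stats

      def gumbel_copula_cdf(u, v, theta):
          """Bivariate Gumbel-Hougaard copula C(u, v); theta >= 1 controls the dependence."""
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      # Hypothetical marginals, approximated here by gamma distributions for illustration.
      hs = stats.gamma(a=2.0, scale=1.2)       # significant wave height (m), assumed marginal
      ws = stats.gamma(a=3.0, scale=4.0)       # wind speed (m/s), assumed marginal

      h, w = 4.0, 20.0
      joint_nonexceed = gumbel_copula_cdf(hs.cdf(h), ws.cdf(w), theta=1.8)
      # Joint return period (years, assuming annual maxima) under the "both exceeded" definition:
      T_and = 1.0 / (1.0 - hs.cdf(h) - ws.cdf(w) + joint_nonexceed)
      print(round(float(joint_nonexceed), 3), round(float(T_and), 1))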

  13. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were obtained. Their one-dimensional counterparts are widely used in probability theory. Generating functions of these multidimensional distributions were also derived.
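
    For independent discrete random variables, the multidimensional (joint) probability-generating function factorizes into the one-dimensional generating functions, which is the standard relationship behind such constructions:

      \[
        G_{\mathbf{X}}(z_1, \dots, z_n)
        \;=\; \mathbb{E}\Bigl[\, \prod_{i=1}^{n} z_i^{X_i} \Bigr]
        \;=\; \prod_{i=1}^{n} G_{X_i}(z_i),
        \qquad X_1, \dots, X_n \ \text{independent}.
      \]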

  14. Evaluation of carotid plaque echogenicity based on the integral of the cumulative probability distribution using gray-scale ultrasound images.

    PubMed

    Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili

    2017-01-01

    Carotid plaque echogenicity is associated with the risk of cardiovascular events. Gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for three types of plaques is 78.4% (kappa value, κ = 0.673), when the AUCPDC is used for classifier training, whereas GSM is 64.8% (κ = 0.460). The receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluation of plaque echogenicity and predicting cardiovascular events in patients with plaques.
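
    A minimal sketch of the AUCPDC computation as described (the ROI pixels and bin count are placeholders; the real regions of interest come from the gray-scale ultrasound images):

      import numpy as np

      def aucpdc(gray_pixels, n_levels=256):
          """Area under the cumulative probability distribution of ROI gray levels.

          Echolucent plaques concentrate probability at low gray levels, so their cumulative
          curve rises early and the integral is larger than for echo-rich plaques."""
          hist, _ = np.histogram(gray_pixels, bins=n_levels, range=(0, n_levels))
          cdf = np.cumsum(hist) / hist.sum()
          # The mean of the cumulative curve over the normalized gray-level axis [0, 1]
          # approximates the integral of the curve.
          return float(cdf.mean())

      roi = np.random.default_rng(3).integers(0, 256, size=5000)   # stand-in for a plaque ROI
      print(round(aucpdc(roi), 3))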

  15. Historiography and History of Information Science (SIG HFIS)

    ERIC Educational Resources Information Center

    Breitenstein, Mikel

    2000-01-01

    Presents abstracts of papers for a planned session dealing with the historiography and history of information science. Highlights include probability distributions underlying the use of library materials, particularly scientific journals; the temporal and historical orientation of the rhetoric of information science; and concepts of information…

  16. The Geothermal Probabilistic Cost Model with an Application to a Geothermal Reservoir at Heber, California

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.

    1981-01-01

    A financial accounting model that incorporates physical and institutional uncertainties was developed for geothermal projects. Among the uncertainties it can handle are well depth, flow rate, fluid temperature, and permit and construction times. The outputs of the model are cumulative probability distributions of financial measures such as capital cost, levelized cost, and profit. These outputs are well suited for use in an investment decision incorporating risk. The model has the powerful feature that conditional probability distribution can be used to account for correlations among any of the input variables. The model has been applied to a geothermal reservoir at Heber, California, for a 45-MW binary electric plant. Under the assumptions made, the reservoir appears to be economically viable.

  17. The emergence of different tail exponents in the distributions of firm size variables

    NASA Astrophysics Data System (ADS)

    Ishikawa, Atushi; Fujimoto, Shouji; Watanabe, Tsutomu; Mizuno, Takayuki

    2013-05-01

    We discuss a mechanism through which inversion symmetry (i.e., invariance of a joint probability density function under the exchange of variables) and Gibrat’s law generate power-law distributions with different tail exponents. Using a dataset of firm size variables, that is, tangible fixed assets K, the number of workers L, and sales Y, we confirm that these variables have power-law tails with different exponents, and that inversion symmetry and Gibrat’s law hold. Based on these findings, we argue that there exists a plane in the three dimensional space (logK,logL,logY), with respect to which the joint probability density function for the three variables is invariant under the exchange of variables. We provide empirical evidence suggesting that this plane fits the data well, and argue that the plane can be interpreted as the Cobb-Douglas production function, which has been extensively used in various areas of economics since it was first introduced almost a century ago.
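
    The plane referred to above corresponds, in the original variables, to a Cobb-Douglas form:

      \[
        \log Y \;=\; \alpha \log K + \beta \log L + \text{const}
        \qquad \Longleftrightarrow \qquad
        Y \;=\; A\, K^{\alpha} L^{\beta}.
      \]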

  18. Nonparametric Density Estimation Based on Self-Organizing Incremental Neural Network for Large Noisy Data.

    PubMed

    Nakamura, Yoshihiro; Hasegawa, Osamu

    2017-01-01

    With the ongoing development and expansion of communication networks and sensors, massive amounts of data are continuously generated in real time from real environments. It is difficult to predict in advance the distribution underlying such data; furthermore, the data include substantial amounts of noise. These factors make it difficult to estimate probability densities. To handle these issues and massive amounts of data, we propose a nonparametric density estimator that rapidly learns data online and has high robustness. Our approach is an extension of both kernel density estimation (KDE) and a self-organizing incremental neural network (SOINN); therefore, we call our approach KDESOINN. An SOINN provides a clustering method that represents the given data as a network of prototypes; more specifically, an SOINN can learn the distribution underlying the given data. Using this information, KDESOINN estimates the probability density function. The results of our experiments show that KDESOINN outperforms or achieves performance comparable to the current state-of-the-art approaches in terms of robustness, learning time, and accuracy.

  19. How to Assess the Existence of Competing Strategies in Cognitive Tasks: A Primer on the Fixed-Point Property

    PubMed Central

    van Maanen, Leendert; de Jong, Ritske; van Rijn, Hedderik

    2014-01-01

    When multiple strategies can be used to solve a type of problem, the observed response time distributions are often mixtures of multiple underlying base distributions each representing one of these strategies. For the case of two possible strategies, the observed response time distributions obey the fixed-point property. That is, there exists one reaction time that has the same probability of being observed irrespective of the actual mixture proportion of each strategy. In this paper we discuss how to compute this fixed-point, and how to statistically assess the probability that indeed the observed response times are generated by two competing strategies. Accompanying this paper is a free R package that can be used to compute and test the presence or absence of the fixed-point property in response time data, allowing for easy to use tests of strategic behavior. PMID:25170893
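
    A small numerical illustration of the fixed-point property (the two Gaussian "strategy" distributions and the mixture proportions are made up; the cited R package, not this sketch, provides the actual statistical test):

      import numpy as np
      from scipy import stats

      # Two hypothetical strategy-specific response-time distributions (the base distributions).
      fast = stats.norm(loc=0.45, scale=0.05)     # seconds
      slow = stats.norm(loc=0.80, scale=0.10)

      t = np.linspace(0.2, 1.2, 2001)
      mixtures = {p: p * fast.pdf(t) + (1 - p) * slow.pdf(t) for p in (0.2, 0.5, 0.8)}

      # All mixture densities coincide where the two base densities are equal: the fixed point.
      mask = (t > fast.mean()) & (t < slow.mean())
      idx = np.argmin(np.abs(fast.pdf(t[mask]) - slow.pdf(t[mask])))
      t_fix = t[mask][idx]
      print(round(float(t_fix), 3),
            {p: round(float(m[mask][idx]), 3) for p, m in mixtures.items()})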

  20. The potential effects of climate change on the distribution and productivity of Cunninghamia lanceolata in China.

    PubMed

    Liu, Yupeng; Yu, Deyong; Xun, Bin; Sun, Yun; Hao, Ruifang

    2014-01-01

    Climate changes may have immediate implications for forest productivity and may produce dramatic shifts in tree species distributions in the future. Quantifying these implications is significant for both scientists and managers. Cunninghamia lanceolata is an important coniferous timber species due to its fast growth and wide distribution in China. This paper proposes a methodology aiming at enhancing the distribution and productivity of C. lanceolata against a background of climate change. First, we simulated the potential distributions and establishment probabilities of C. lanceolata based on a species distribution model. Second, a process-based model, the PnET-II model, was calibrated and its parameterization of water balance improved. Finally, the improved PnET-II model was used to simulate the net primary productivity (NPP) of C. lanceolata. The simulated NPP and potential distribution were combined to produce an integrated indicator, the estimated total NPP, which serves to comprehensively characterize the productivity of the forest under climate change. The results of the analysis showed that (1) the distribution of C. lanceolata will increase in central China, but the mean probability of establishment will decrease in the 2050s; (2) the PnET-II model was improved, calibrated, and successfully validated for the simulation of the NPP of C. lanceolata in China; and (3) all scenarios predicted a reduction in total NPP in the 2050s, with a markedly lower reduction under the a2 scenario than under the b2 scenario. The changes in NPP suggested that forest productivity will show a large decrease in southern China and a mild increase in central China. All of these findings could improve our understanding of the impact of climate change on forest ecosystem structure and function and could provide a basis for policy-makers to apply adaptive measures and overcome the unfavorable influences of climate change.

  1. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    PubMed

    Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.
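
    The modeling assumption described above can be written compactly: at each time t the observed share vector is treated as a stationary distribution of the row-stochastic transition matrix to be estimated,

      \[
        \mathbf{s}_t^{\top} P_t = \mathbf{s}_t^{\top},
        \qquad
        P_t \mathbf{1} = \mathbf{1},
        \qquad
        (P_t)_{ij} \ge 0 .
      \]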

  2. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis

    PubMed Central

    Chiba, Tomoaki; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group’s sales beat GM’s sales, which is a reasonable scenario. PMID:28076383

  3. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    PubMed

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with a statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without a statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
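
    A minimal sketch of the distribution-fitting step (synthetic friction values stand in for the 100 measurements per surface/condition; note that using fitted parameters inside a Kolmogorov-Smirnov test makes the resulting p-values approximate):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # Stand-in for 100 ACOF measurements on one floor/contaminant combination.
      acof = rng.lognormal(mean=np.log(0.35), sigma=0.15, size=100)

      candidates = {
          "normal": stats.norm,
          "lognormal": stats.lognorm,
          "weibull": stats.weibull_min,
      }
      for name, dist in candidates.items():
          params = dist.fit(acof)                          # maximum-likelihood fit
          d, p = stats.kstest(acof, dist.cdf, args=params) # goodness-of-fit statistic and p-value
          print(f"{name:10s} D={d:.3f} p={p:.3f}")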

  4. Statistical description of non-Gaussian samples in the F2 layer of the ionosphere during heliogeophysical disturbances

    NASA Astrophysics Data System (ADS)

    Sergeenko, N. P.

    2017-11-01

    An adequate statistical method is needed to predict probabilistically the range of ionospheric parameters; this problem is addressed in the present paper. Time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At sufficiently small probability levels, there are arbitrarily large deviations from the normal-process model. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson-based model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is converted into a nonholomorphic, asymmetric probability density function with positive excess kurtosis. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the resulting model distribution function. According to Kolmogorov's criterion, the probabilities that the a posteriori distributions coincide with the theoretical ones are P ≈ 0.7-0.9. The analysis supports the applicability of a model based on the Poisson random process for the statistical description and probabilistic estimation of the variations {δfoF2} during heliogeophysical disturbances.

  5. Distributed Synchronization in Networks of Agent Systems With Nonlinearities and Random Switchings.

    PubMed

    Tang, Yang; Gao, Huijun; Zou, Wei; Kurths, Jürgen

    2013-02-01

    In this paper, the distributed synchronization problem of networks of agent systems with controllers and nonlinearities subject to Bernoulli switchings is investigated. Controllers and adaptive updating laws injected in each vertex of networks depend on the state information of its neighborhood. Three sets of Bernoulli stochastic variables are introduced to describe the occurrence probabilities of distributed adaptive controllers, updating laws and nonlinearities, respectively. By the Lyapunov functions method, we show that the distributed synchronization of networks composed of agent systems with multiple randomly occurring nonlinearities, multiple randomly occurring controllers, and multiple randomly occurring updating laws can be achieved in mean square under certain criteria. The conditions derived in this paper can be solved by semi-definite programming. Moreover, by mathematical analysis, we find that the coupling strength, the probabilities of the Bernoulli stochastic variables, and the form of nonlinearities have great impacts on the convergence speed and the terminal control strength. The synchronization criteria and the observed phenomena are demonstrated by several numerical simulation examples. In addition, the advantage of distributed adaptive controllers over conventional adaptive controllers is illustrated.

  6. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
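
    A minimal sketch of the "plain fact" these tests build on (the hypothesized density, the observed draw, and the Monte Carlo size are made up): estimate the probability, under the specified density, of drawing a point whose density is at most that of the observed draw.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      specified = stats.norm(loc=0.0, scale=1.0)        # hypothesized density f (assumed)

      x_obs = 3.4                                        # a single observed draw (made up)
      f_obs = specified.pdf(x_obs)

      # Monte Carlo estimate of P( f(X) <= f(x_obs) ) when X really is drawn from f:
      sample = specified.rvs(size=200_000, random_state=rng)
      p_value = np.mean(specified.pdf(sample) <= f_obs)
      print(f"density-based p-value: {p_value:.4f}")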

  7. Alternative Derivations of the Statistical Mechanical Distribution Laws

    PubMed Central

    Wall, Frederick T.

    1971-01-01

    A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems. PMID:16578712

  8. Alternative derivations of the statistical mechanical distribution laws.

    PubMed

    Wall, F T

    1971-08-01

    A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
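
    For reference (these are the standard forms, stated here for orientation rather than taken from the record), the three distribution laws obtained by such derivations give the mean occupation of a level with energy ε_i as

      \[
        \bar{n}_i \;=\; \frac{1}{e^{(\varepsilon_i - \mu)/kT} + \kappa},
        \qquad
        \kappa =
        \begin{cases}
          0 & \text{Boltzmann},\\
          +1 & \text{Fermi--Dirac},\\
          -1 & \text{Bose--Einstein},
        \end{cases}
      \]

    where μ is the chemical potential whose special significance the paper discusses.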

  9. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is; the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n) it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
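
    A sketch of the Newton approach NEWTPOIS describes, written here with SciPy rather than the original C program (the starting value, tolerance, and example numbers are assumptions). It relies on the identity d/dλ P(N ≤ n; λ) = -p(n; λ), where p is the Poisson probability mass function.

      from scipy import stats

      def poisson_lambda(n, cum_prob, lam0=1.0, eps=1e-10, max_iter=100):
          """Newton iteration for the Poisson mean lam such that P(N <= n; lam) = cum_prob."""
          lam = lam0
          for _ in range(max_iter):
              f = stats.poisson.cdf(n, lam) - cum_prob
              fprime = -stats.poisson.pmf(n, lam)    # derivative of the cumulative probability in lam
              step = f / fprime
              lam = max(lam - step, 1e-12)           # keep the iterate in the valid range lam > 0
              if abs(step) < eps:
                  break
          return lam

      # Example: the mean event count consistent with a 90% chance of observing 3 or fewer events.
      print(round(poisson_lambda(3, 0.90), 4))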

  10. CUMPOIS- CUMULATIVE POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The Cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or less events (ie. cumulative) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was developed in 1988.

  11. Superconducting surface impedance under radiofrequency field

    DOE PAGES

    Xiao, Binping P.; Reece, Charles E.; Kelley, Michael J.

    2013-04-26

    Based on BCS theory with moving Cooper pairs, the distribution of electron states at 0 K and the probability of electron occupation at finite temperature have been derived and applied to anomalous-skin-effect theory to obtain the surface impedance of a superconductor under a radiofrequency (RF) field. We present numerical results for Nb and compare them with representative RF field-dependent effective surface resistance measurements from a 1.5 GHz resonant structure.

  12. Scaling and clustering effects of extreme precipitation distributions

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Zhou, Yu; Singh, Vijay P.; Li, Jianfeng

    2012-08-01

    One of the impacts of climate change and human activities on the hydrological cycle is the change in the precipitation structure. Closely related to the precipitation structure are two characteristics: the volume (m) of wet periods (WPs) and the time interval between WPs, or waiting time (t). Using daily precipitation data for the period 1960-2005 from 590 rain gauge stations in China, these two characteristics are analyzed, involving scaling and clustering of precipitation episodes. Our findings indicate that m and t follow similar probability distribution curves, implying that precipitation processes are controlled by similar underlying thermodynamics. Analysis of conditional probability distributions shows a significant dependence of m and t on their previous values of similar volumes, and the dependence tends to be stronger when m is larger or t is longer. This indicates that high-intensity precipitation is more likely to be followed by precipitation episodes of similar intensity, and that a long waiting time between WPs tends to be followed by a waiting time of similar duration. This result indicates the clustering of extreme precipitation episodes, and that severe droughts or floods are apt to occur in groups.

  13. Generative adversarial networks for brain lesion detection

    NASA Astrophysics Data System (ADS)

    Alex, Varghese; Safwan, K. P. Mohammed; Chennamsetty, Sai Saketh; Krishnamurthi, Ganapathy

    2017-02-01

    Manual segmentation of brain lesions from Magnetic Resonance Images (MRI) is cumbersome and introduces errors due to inter-rater variability. This paper introduces a semi-supervised technique for detection of brain lesions from MRI using Generative Adversarial Networks (GANs). A GAN comprises a Generator network and a Discriminator network that are trained simultaneously, each with the objective of bettering the other. The networks were trained using non-lesion patches (n=13,000) from 4 different MR sequences. The network was trained on the BraTS dataset, and patches were extracted from regions excluding the tumor. The Generator network generates data by modeling the underlying probability distribution of the training data (P_Data). The Discriminator learns the posterior probability P(Label | Data) by classifying training data and generated data as "Real" or "Fake", respectively. The Generator, upon learning the joint distribution, produces images/patches such that the performance of the Discriminator on them is random, i.e., P(Label | Data = GeneratedData) = 0.5. During testing, the Discriminator assigns posterior probability values close to 0.5 for patches from non-lesion regions, while patches centered on a lesion arise from a different distribution (P_Lesion) and hence are assigned a lower posterior probability value by the Discriminator. On the test set (n=14), the proposed technique achieves a whole tumor dice score of 0.69, sensitivity of 91% and specificity of 59%. Additionally, the Generator network was capable of generating non-lesion patches from various MR sequences.

  14. Biomechanical Tolerance of Calcaneal Fractures

    PubMed Central

    Yoganandan, Narayan; Pintar, Frank A.; Gennarelli, Thomas A.; Seipel, Robert; Marks, Richard

    1999-01-01

    Biomechanical studies have been conducted in the past to understand the mechanisms of injury to the foot-ankle complex. However, statistically based tolerance criteria for calcaneal complex injuries are lacking. Consequently, this research was designed to derive a probability distribution that represents human calcaneal tolerance under impact loading such as that encountered in vehicular collisions. Information for deriving the distribution was obtained by experiments on unembalmed human cadaver lower extremities. Briefly, the protocol included the following. The knee joint was disarticulated such that the entire lower extremity distal to the knee joint remained intact. The proximal tibia was fixed in polymethylmethacrylate. The specimens were aligned and impact loading was applied using mini-sled pendulum equipment. The pendulum impactor dynamically loaded the plantar aspect of the foot once. Following the test, specimens were palpated and radiographs in multiple planes were obtained. Injuries were classified into no fracture, and extra- and intra-articular fractures of the calcaneus. There were 14 cases of no injury and 12 cases of calcaneal fracture. The fracture forces (mean: 7802 N) were significantly different (p<0.01) from the forces in the no-injury (mean: 4144 N) group. The probability of calcaneal fracture, determined using logistic regression, indicated that a force of 6.2 kN corresponds to a 50 percent probability of calcaneal fracture. The derived probability distribution is useful in the design of dummies and vehicular surfaces.
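
    As a hedged illustration of the kind of logistic risk curve described here, the sketch below anchors the 50% fracture probability at the reported 6.2 kN; the slope parameter is hypothetical, since the fitted regression coefficients are not given in the abstract.

        import math

        F50_KN = 6.2   # reported force at 50% fracture probability
        SLOPE = 1.5    # hypothetical scale parameter (kN) controlling steepness

        def p_fracture(force_kn):
            return 1.0 / (1.0 + math.exp(-(force_kn - F50_KN) / SLOPE))

        # near the no-injury mean, the 50% point, and the fracture mean
        for f in (4.1, 6.2, 7.8):
            print(f"{f:.1f} kN -> {p_fracture(f):.2f}")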

  15. A Multivariate and Probabilistic Assessment of Drought in the Pacific Northwest under Observed and Future Climate.

    NASA Astrophysics Data System (ADS)

    Mortuza, M. R.; Demissie, Y. K.

    2015-12-01

    In light of the recent and anticipated more severe and frequent drought incidences in the Yakima River Basin (YRB), a reliable and comprehensive drought assessment is deemed necessary to avoid major crop production losses and to better manage water rights issues in the region during years of low precipitation and/or snow accumulation. In this study, we have conducted a frequency analysis of hydrological droughts and quantified the associated uncertainty in the YRB under both historical and changing climate. The streamflow drought index (SDI) was employed to identify mutually correlated drought characteristics (e.g., severity, duration and peak). The historical and future characteristics of drought were estimated by applying a trivariate copula probability distribution, which effectively describes the joint distribution and dependence of drought severity, duration, and peak. The associated prediction uncertainty, related to the parameters of the joint probability and to the climate projections, was evaluated using the Bayesian approach with bootstrap resampling. For the climate change scenarios, two representative concentration pathways (RCP4.5 and RCP8.5) from the University of Idaho's Multivariate Adaptive Constructed Analogs (MACA) database were considered. The results from the study are expected to provide useful information towards drought risk management in the YRB under anticipated climate changes.

  16. Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.

    PubMed

    Trepel, Christopher; Fox, Craig R; Poldrack, Russell A

    2005-04-01

    Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
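
    The two components named here can be written down compactly; the sketch below uses the functional forms and median parameter estimates from Tversky and Kahneman (1992), included purely for illustration.

        def value(x, alpha=0.88, beta=0.88, lam=2.25):
            """Value function: concave for gains, convex and steeper for losses (lam > 1)."""
            return x**alpha if x >= 0 else -lam * (-x)**beta

        def weight(p, gamma=0.61):
            """Inverse-S probability weighting: overweights small p, underweights large p."""
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        # Subjective value of a 10% chance to win 100 versus a sure gain of 10:
        print(weight(0.10) * value(100), value(10))   # the gamble looks more attractive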

  17. Urban Fire Simulation. Version 2

    DTIC Science & Technology

    1993-02-01

    of the building. In this case the distribution of windows in the tract per floor ( DWPF (FLOORHT)) is calculated under the assumption that the number of...given urban area. The probability that no room on the subject floor will flash over is calculated at label (V) from PNRFOF DWPF (FLOORHT) (1 - FFORF

  18. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    USGS Publications Warehouse

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).

  19. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

    To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which gives the voltage that permits a zero breakdown probability. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10-14 and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6-8.5 and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage was increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias in the vacuum interrupters' voltage sharing is taken into account.
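
    A minimal sketch of the three-parameter Weibull form used here, with a location parameter V0 below which the breakdown probability is zero; all numbers are illustrative, though the shape value is in the 10-14 range reported for the vacuum gap.

        import math

        def breakdown_probability(v, v0, scale, shape):
            """Cumulative breakdown probability at voltage v (three-parameter Weibull)."""
            if v <= v0:
                return 0.0
            return 1.0 - math.exp(-((v - v0) / scale) ** shape)

        # e.g. location 50 kV, scale 30 kV, shape 12 (all hypothetical values)
        for v in (50, 70, 80, 90):
            print(v, round(breakdown_probability(v, v0=50, scale=30, shape=12), 4))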

  20. Significance of stress transfer in time-dependent earthquake probability calculations

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts; so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus, revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Xin-Ping, E-mail: xuxp@mail.ihep.ac.cn; Ide, Yusuke

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.

  2. Finding Bounded Rational Equilibria. Part 1; Iterative Focusing

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information-theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  3. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probabilities of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability, given the number of active and inactive days, distributed as a Betabinomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β), where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
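
    A sketch of this estimator in Python, assuming the parameters of the Beta(α, β) distribution of the daily activity probability have already been fit by maximum likelihood; the values below are made up for illustration.

        from scipy.stats import betabinom

        alpha_hat, beta_hat = 2.0, 3.0   # hypothetical maximum likelihood estimates

        def p_meets_guidelines(active, assessed, n_days=7):
            """Predictive probability of being active on all n_days of a week,
            given `active` active days out of `assessed` accelerometer wear days."""
            inactive = assessed - active
            return betabinom.pmf(n_days, n_days, alpha_hat + active, beta_hat + inactive)

        print(p_meets_guidelines(active=4, assessed=4))  # all observed days active
        print(p_meets_guidelines(active=2, assessed=4))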

  4. Evolution of ion emission yield of alloys with the nature of the solute. 2: Interpretation

    NASA Technical Reports Server (NTRS)

    Blaise, G.; Slodzian, G.

    1977-01-01

    Solid solutions of transition elements in copper, nickel, cobalt, iron, and aluminum matrices were analyzed by observing secondary ion emissions under bombardment with 6.2-keV argon ions. Enhancement of the production of solute-element ions was observed. An ion emission model is proposed according to which the ion yield is governed by the probability of an atom leaving the metal in a preionized state. The energy distribution of the valence electrons of the solute atoms is the basis of the probability calculation.

  5. The influence of coarse-scale environmental features on current and predicted future distributions of narrow-range endemic crayfish populations

    USGS Publications Warehouse

    Dyer, Joseph J.; Brewer, Shannon K.; Worthington, Thomas A.; Bergey, Elizabeth A.

    2013-01-01

    1. A major limitation to effective management of narrow-range crayfish populations is the paucity of information on the spatial distribution of crayfish species and a general understanding of the interacting environmental variables that drive current and future potential distributional patterns. 2. Maximum Entropy Species Distribution Modeling Software (MaxEnt) was used to predict the current and future potential distributions of four endemic crayfish species in the Ouachita Mountains. Current distributions were modelled using climate, geology, soils, land use, landform and flow variables thought to be important to lotic crayfish. Potential changes in the distribution were forecast by using models trained on current conditions and projecting onto the landscape predicted under climate-change scenarios. 3. The modelled distribution of the four species closely resembled the perceived distribution of each species but also predicted populations in streams and catchments where they had not previously been collected. Soils, elevation and winter precipitation and temperature were most strongly related to current distributions and represented 65-87% of the predictive power of the models. Model accuracy was high for all models, and model predictions of new populations were verified through additional field sampling. 4. Current models created using two spatial resolutions (1 and 4.5 km2) showed that fine-resolution data more accurately represented current distributions. For three of the four species, the 1-km2 resolution models resulted in more conservative predictions. However, the modelled distributional extent of Orconectes leptogonopodus was similar regardless of data resolution. Field validations indicated 1-km2 resolution models were more accurate than 4.5-km2 resolution models. 5. Future projected (4.5-km2 resolution) model distributions indicated three of the four endemic species would have truncated ranges with low occurrence probabilities under the low-emission scenario, whereas two of four species would be severely restricted in range under moderate-high emissions. Discrepancies in the two emission scenarios probably relate to the exclusion of behavioural adaptations from species-distribution models. 6. These model predictions illustrate possible impacts of climate change on narrow-range endemic crayfish populations. The predictions do not account for biotic interactions, migration, local habitat conditions or species adaptation. However, we identified the constraining landscape features acting on these populations that provide a framework for addressing habitat needs at a fine scale and developing targeted and systematic monitoring programmes.

  6. A framework for sensitivity analysis of decision trees.

    PubMed

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.

  7. Robust LOD scores for variance component-based linkage analysis.

    PubMed

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.

  8. Directional data analysis under the general projected normal distribution

    PubMed Central

    Wang, Fangpo; Gelfand, Alan E.

    2013-01-01

    The projected normal distribution is an under-utilized model for explaining directional data. In particular, the general version provides flexibility, e.g., asymmetry and possible bimodality along with convenient regression specification. Here, we clarify the properties of this general class. We also develop fully Bayesian hierarchical models for analyzing circular data using this class. We show how they can be fit using MCMC methods with suitable latent variables. We show how posterior inference for distributional features such as the angular mean direction and concentration can be implemented as well as how prediction within the regression setting can be handled. With regard to model comparison, we argue for an out-of-sample approach using both a predictive likelihood scoring loss criterion and a cumulative rank probability score criterion. PMID:24046539
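
    A minimal sketch of the construction behind this model class: a direction is obtained by radially projecting a bivariate normal draw onto the unit circle, with asymmetry and possible bimodality coming from the choice of mean and covariance (the values below are arbitrary).

        import numpy as np

        rng = np.random.default_rng(0)
        mu = np.array([1.0, 0.5])
        sigma = np.array([[1.0, 0.6],
                          [0.6, 2.0]])

        x = rng.multivariate_normal(mu, sigma, size=10_000)
        theta = np.arctan2(x[:, 1], x[:, 0])   # projected angles in (-pi, pi]

        # circular mean direction of the sample
        mean_direction = np.arctan2(np.sin(theta).mean(), np.cos(theta).mean())
        print(mean_direction)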

  9. On the renewal risk model under a threshold strategy

    NASA Astrophysics Data System (ADS)

    Dong, Yinghui; Wang, Guojing; Yuen, Kam C.

    2009-08-01

    In this paper, we consider the renewal risk process under a threshold dividend payment strategy. For this model, the expected discounted dividend payments and the Gerber-Shiu expected discounted penalty function are investigated. Integral equations, integro-differential equations and some closed form expressions for them are derived. When the claims are exponentially distributed, it is verified that the expected penalty of the deficit at ruin is proportional to the ruin probability.

  10. Parkinson Disease Detection from Speech Articulation Neuromechanics.

    PubMed

    Gómez-Vilda, Pedro; Mekyska, Jiri; Ferrández, José M; Palacios-Alonso, Daniel; Gómez-Rodellar, Andrés; Rodellar-Biarge, Victoria; Galaz, Zoltan; Smekal, Zdenek; Eliasova, Ilona; Kostalova, Milena; Rektorova, Irena

    2017-01-01

    Aim: The research described is intended to give a description of articulation dynamics as a correlate of the kinematic behavior of the jaw-tongue biomechanical system, encoded as a probability distribution of an absolute joint velocity. This distribution may be used in detecting and grading speech from patients affected by neurodegenerative illnesses such as Parkinson Disease. Hypothesis: The working hypothesis is that the probability density function of the absolute joint velocity includes information on the stability of phonation when applied to sustained vowels, as well as on fluency if applied to connected speech. Methods: A dataset of sustained vowels recorded from Parkinson Disease patients is contrasted with similar recordings from normative subjects. The probability distribution of the absolute kinematic velocity of the jaw-tongue system is extracted from each utterance. A Random Least Squares Feed-Forward Network (RLSFN) has been used as a binary classifier working on the pathological and normative datasets in a leave-one-out strategy. Monte Carlo simulations have been conducted to estimate the influence of the stochastic nature of the classifier. Two datasets, one per gender, were tested (males and females), including 26 normative and 53 pathological subjects in the male set, and 25 normative and 38 pathological in the female set. Results: Male and female data subsets were tested in single runs, yielding equal error rates under 0.6% (Accuracy over 99.4%). Due to the stochastic nature of each experiment, Monte Carlo runs were conducted to test the reliability of the methodology. The average detection results after 200 Monte Carlo runs of a 200-hyperplane hidden layer RLSFN are given in terms of Sensitivity (males: 0.9946, females: 0.9942), Specificity (males: 0.9944, females: 0.9941) and Accuracy (males: 0.9945, females: 0.9942). The area under the ROC curve is 0.9947 (males) and 0.9945 (females). The equal error rate is 0.0054 (males) and 0.0057 (females). Conclusions: The proposed methodology shows that the use of highly normalized descriptors, such as the probability distribution of kinematic variables of vowel articulation stability, which has some interesting properties in terms of information theory, boosts the potential of simple yet powerful classifiers to produce quite acceptable detection results in Parkinson Disease.

  11. Density distribution function of a self-gravitating isothermal compressible turbulent fluid in the context of molecular clouds ensembles

    NASA Astrophysics Data System (ADS)

    Donkov, Sava; Stefanov, Ivan Z.

    2018-03-01

    We have set ourselves the task of obtaining the probability distribution function of the mass density of a self-gravitating isothermal compressible turbulent fluid from its physics. We have done this in the context of a new notion: the molecular clouds ensemble. We have applied a new approach that takes into account the fractal nature of the fluid. Using the medium equations, under the assumption of steady state, we show that the total energy per unit mass is an invariant with respect to the fractal scales. As a next step we obtain a non-linear integral equation for the dimensionless scale Q which is the third root of the integral of the probability distribution function. It is solved approximately up to the leading-order term in the series expansion. We obtain two solutions. They are power-law distributions with different slopes: the first one is -1.5 at low densities, corresponding to an equilibrium between all energies at a given scale, and the second one is -2 at high densities, corresponding to a free fall at small scales.

  12. Derived distribution of floods based on the concept of partial area coverage with a climatic appeal

    NASA Astrophysics Data System (ADS)

    Iacobellis, Vito; Fiorentino, Mauro

    2000-02-01

    A new rationale for deriving the probability distribution of floods, and for helping to understand the physical processes underlying the distribution itself, is presented. On this basis, a model that introduces a number of new assumptions is developed. The basic ideas are as follows: (1) The peak direct streamflow Q can always be expressed as the product of two random variates, namely, the average runoff per unit area ua and the peak contributing area a; (2) the distribution of ua conditional on a can be related to that of the rainfall depth occurring in a duration equal to a characteristic response time τa of the contributing part of the basin; and (3) τa is assumed to vary with a according to a power law. Consequently, the probability density function of Q can be found as the integral, over the total basin area A, of the density function of a times the density function of ua given a. It is suggested that ua can be expressed as a fraction of the excess rainfall and that the annual flood distribution can be related to that of Q by the hypothesis that the flood occurrence process is Poissonian. In the proposed model it is assumed, as an exploratory attempt, that a and ua are gamma and Weibull distributed, respectively. The model was applied to the annual flood series of eight gauged basins in Basilicata (southern Italy) with catchment areas ranging from 40 to 1600 km2. The results showed strong physical consistency, as the parameters tended to assume values in good agreement with well-consolidated geomorphologic knowledge and suggested a new key to understanding the climatic control of the probability distribution of floods.
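
    A Monte Carlo sketch of this model structure: the contributing area a is drawn from a gamma distribution, the unit runoff ua given a is drawn from a Weibull distribution whose scale decays with a (standing in for the power-law response time), Q = ua * a, and a Poisson number of events occurs per year. All parameter values are illustrative, not the fitted Basilicata values.

        import numpy as np

        rng = np.random.default_rng(1)

        def annual_max_floods(n_years, basin_area=500.0, mean_events=5.0):
            peaks = []
            for _ in range(n_years):
                k = rng.poisson(mean_events)                      # flood events this year
                if k == 0:
                    peaks.append(0.0)
                    continue
                a = np.minimum(rng.gamma(shape=2.0, scale=basin_area / 4, size=k),
                               basin_area)                        # peak contributing areas
                u_a = rng.weibull(a=1.2, size=k) * 5.0 * a**-0.3  # unit runoff given a
                peaks.append(np.max(u_a * a))                     # Q = u_a * a, annual max
            return np.array(peaks)

        q = annual_max_floods(10_000)
        print(np.percentile(q, [50, 90, 99]))                     # sample flood quantiles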

  13. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  14. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  15. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  16. Dual assimilation of satellite soil moisture to improve flood prediction in ungauged catchments

    USDA-ARS?s Scientific Manuscript database

    This paper explores the use of active and passive satellite soil moisture products for improving stream flow prediction within 4 large (>5,000km2) semi-arid catchments. We use the probability distributed model (PDM) under a data-scarce scenario and aim at correcting two key controlling factors in th...

  17. 14 CFR 25.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... certification or under operating rules and that requires a power supply is an “essential load” on the power supply. The power sources and the system must be able to supply the following power loads in probable... source of power is required, after any failure or malfunction in any one power supply system...

  18. 14 CFR 25.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... certification or under operating rules and that requires a power supply is an “essential load” on the power supply. The power sources and the system must be able to supply the following power loads in probable... source of power is required, after any failure or malfunction in any one power supply system...

  19. Differential Effects of Insular and Ventromedial Prefrontal Cortex Lesions on Risky Decision-Making

    ERIC Educational Resources Information Center

    Clark, L.; Bechara, A.; Damasio, H.; Aitken, M. R. F.; Sahakian, B. J.; Robbins, T. W.

    2008-01-01

    The ventromedial prefrontal cortex (vmPFC) and insular cortex are implicated in distributed neural circuitry that supports emotional decision-making. Previous studies of patients with vmPFC lesions have focused primarily on decision-making under uncertainty, when outcome probabilities are ambiguous (e.g. the Iowa Gambling Task). It remains unclear…

  20. Optimized lower leg injury probability curves from postmortem human subject tests under axial impacts.

    PubMed

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko

    2014-01-01

    Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subject (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was selected based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-olds at the 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because the procedures used in the present survival analysis are accepted by international automotive communities, the current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines.

  1. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    USGS Publications Warehouse

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).

  2. Markov Chain Monte Carlo estimation of species distributions: a case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, Glen A.; Sovada, Marsha A.; Slivinski, Christiane C.; Johnson, Douglas H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997–1999, we searched 355 townships (ca. 93 km2) 1–3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.

  3. Markov chain Monte Carlo estimation of species distributions: A case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, G.A.; Sovada, M.A.; Slivinski, C.C.; Johnson, D.H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997-1999, we searched 355 townships (ca. 93 km2) 1-3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.

  4. Aggregate and individual replication probability within an explicit model of the research process.

    PubMed

    Miller, Jeff; Schwarz, Wolf

    2011-09-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by obtaining either a statistically significant result in the same direction or any effect in that direction. We analyze both the probability of successfully replicating a particular experimental effect (i.e., the individual replication probability) and the average probability of successful replication across different studies within some research context (i.e., the aggregate replication probability), and we identify the conditions under which the latter can be approximated using the formulas of Killeen (2005a, 2007). We show how both of these probabilities depend on parameters of the research context that would rarely be known in practice. In addition, we show that the statistical uncertainty associated with the size of an initial observed effect would often prevent accurate estimation of the desired individual replication probability even if these research context parameters were known exactly. We conclude that accurate estimates of replication probability are generally unattainable.
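
    A simulation sketch of this setup: the study-to-study true effect, the replication jitter, and the measurement error of each observed effect size are all normal, and the aggregate replication probability is the chance that an initially significant result is followed by a significant effect in the same direction. All standard deviations, the mean true effect, and the group size below are made-up research-context parameters.

        import numpy as np

        rng = np.random.default_rng(2)
        n_studies, n_per_group = 100_000, 20
        sd_true, sd_jitter = 0.3, 0.2                 # research-context parameters
        se = np.sqrt(2.0 / n_per_group)               # s.e. of a standardized mean difference

        true = rng.normal(0.4, sd_true, n_studies)    # true effect sizes across studies
        d1 = true + rng.normal(0, se, n_studies)      # initial observed effects
        d2 = (true + rng.normal(0, sd_jitter, n_studies)   # replication: jittered true effect
              + rng.normal(0, se, n_studies))              # plus fresh measurement error

        crit = 1.96 * se                              # two-sided 5% significance threshold
        initial_sig = d1 > crit
        replicated = (d2 > crit) & initial_sig
        print(replicated.sum() / initial_sig.sum())   # aggregate replication probability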

  5. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  6. Activated recombinative desorption: A potential component in mechanisms of spacecraft glow

    NASA Technical Reports Server (NTRS)

    Cross, J. B.

    1985-01-01

    The concept of activated recombination of atomic species on surfaces can explain the production of vibrationally and translationally excited desorbed molecular species. Equilibrium statistical mechanics predicts that the molecular quantum state distribution of desorbing molecules is a function of surface temperature only when the adsorption probability is unity and independent of initial collision conditions. In most cases, the adsorption probability is dependent upon initial conditions such as collision energy or the internal quantum state distribution of impinging molecules. From detailed balance, such dynamical behavior is reflected in the internal quantum state distribution of the desorbing molecule. This concept, activated recombinative desorption, may offer a common thread in proposed mechanisms of spacecraft glow. Using molecular beam techniques and equipment available at Los Alamos, which include a high-translational-energy O-atom beam source, mass spectrometric detection of desorbed species, chemiluminescence/laser-induced fluorescence detection of electronic and vibrationally excited reaction products, and Auger detection of surface-adsorbed reaction products, a fundamental study of the gas-surface chemistry underlying the glow process is proposed.

  7. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar.

    PubMed

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-09-09

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it will suffer cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem, since the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are directly used for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fused result of each radar's estimation is fed to the extended Kalman filter (EKF) to complete the first filtering stage. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering stage, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy is improved dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.

  8. Dynamics in atomic signaling games.

    PubMed

    Fox, Michael J; Touri, Behrouz; Shamma, Jeff S

    2015-07-07

    We study an atomic signaling game under stochastic evolutionary dynamics. There are a finite number of players who repeatedly update from a finite number of available languages/signaling strategies. Players imitate the most fit agents with high probability or mutate with low probability. We analyze the long-run distribution of states and show that, for sufficiently small mutation probability, its support is limited to efficient communication systems. We find that this behavior is insensitive to the particular choice of evolutionary dynamic, a property that is due to the game having a potential structure with a potential function corresponding to average fitness. Consequently, the model supports conclusions similar to those found in the literature on language competition. That is, we show that efficient languages eventually predominate the society while reproducing the empirical phenomenon of linguistic drift. The emergence of efficiency in the atomic case can be contrasted with results for non-atomic signaling games that establish the non-negligible possibility of convergence, under replicator dynamics, to states of unbounded efficiency loss. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Dynamic properties of molecular motors in burnt-bridge models

    NASA Astrophysics Data System (ADS)

    Artyomov, Maxim N.; Morozov, Alexander Yu; Pronina, Ekaterina; Kolomeisky, Anatoly B.

    2007-08-01

    Dynamic properties of molecular motors that fuel their motion by actively interacting with underlying molecular tracks are studied theoretically via discrete-state stochastic 'burnt-bridge' models. The transport of the particles is viewed as an effective diffusion along one-dimensional lattices with periodically distributed weak links. When an unbiased random walker passes the weak link it can be destroyed ('burned') with probability p, providing a bias in the motion of the molecular motor. We present a theoretical approach that allows one to calculate exactly all dynamic properties of motor proteins, such as velocity and dispersion, under general conditions. It is found that dispersion is a decreasing function of the concentration of bridges, while the dependence of dispersion on the burning probability is more complex. Our calculations also show a gap in dispersion for very low concentrations of weak links or for very low burning probabilities which indicates a dynamic phase transition between unbiased and biased diffusion regimes. Theoretical findings are supported by Monte Carlo computer simulations.
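
    A Monte Carlo sketch of the discrete-state picture described here: an unbiased nearest-neighbour walker on a 1D lattice with a weak link every N sites, where each forward crossing of an intact weak link burns it with probability p and a burnt link can no longer be crossed; the drift velocity is estimated from the displacement. This implements one common variant of the model (burning on forward crossings only), with arbitrary N, p, and run length.

        import random

        def burnt_bridge_velocity(p=0.1, spacing=10, steps=500_000, seed=3):
            rng = random.Random(seed)
            x, burnt = 0, set()
            for _ in range(steps):
                step = 1 if rng.random() < 0.5 else -1
                link = x if step == 1 else x - 1     # link between x and x + step
                if link % spacing == 0:              # periodically placed weak link
                    if link in burnt:
                        continue                     # burnt bridge: crossing blocked
                    if step == 1 and rng.random() < p:
                        burnt.add(link)              # bridge burns behind the walker
                x += step
            return x / steps                         # mean velocity (sites per step)

        print(burnt_bridge_velocity())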

  10. MaxEnt alternatives to pearson family distributions

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie J.

    2012-05-01

    In a previous MaxEnt conference [11] a method of obtaining MaxEnt univariate distributions under a variety of constraints was presented. The Mathematica function Interpolation[], normally used with numerical data, can also process "semi-symbolic" data, and Lagrange Multiplier equations were solved for a set of symbolic ordinates describing the required MaxEnt probability density function. We apply a more developed version of this approach to finding MaxEnt distributions having prescribed β1 and β2 values, and compare the entropy of the MaxEnt distribution to that of the Pearson family distribution having the same β1 and β2. These MaxEnt distributions do have, in general, greater entropy than the related Pearson distribution. In accordance with Jaynes' Maximum Entropy Principle, these MaxEnt distributions are thus to be preferred to the corresponding Pearson distributions as priors in Bayes' Theorem.

  11. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
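
    The two-step recipe summarized here, colour a white Gaussian field to the target spectrum and then push it through a memoryless transform to obtain the target amplitude statistics, can be sketched in a few lines. In the sketch below (Python/NumPy/SciPy), the Gaussian-shaped power spectrum and the gamma target marginal are arbitrary illustrative choices; as the abstract notes, the final transform perturbs the spectrum somewhat, so the result is approximate.

        import numpy as np
        from scipy import stats

        def correlated_field(n, corr_len, rng):
            """White Gaussian noise coloured in the Fourier domain by sqrt(PSD)."""
            kx = np.fft.fftfreq(n)[:, None]
            ky = np.fft.fftfreq(n)[None, :]
            psd = np.exp(-(kx**2 + ky**2) * (2 * np.pi * corr_len) ** 2)  # assumed PSD shape
            white = rng.standard_normal((n, n))
            field = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
            return (field - field.mean()) / field.std()

        rng = np.random.default_rng(1)
        g = correlated_field(256, corr_len=5.0, rng=rng)

        # Memoryless transform: Gaussian marginal -> desired (here gamma) marginal
        u = stats.norm.cdf(g)                   # approximately uniform marginal, correlation kept
        target = stats.gamma(a=2.0, scale=1.0)  # assumed target amplitude distribution
        z = target.ppf(u)                       # field with (approximately) gamma one-point statistics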

  12. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Linlin; Wang, Hongrui; Wang, Cheng

    Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ 10 mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.
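
    Conditional probabilities of this kind follow from the basic copula identity P(X ≤ x, Y ≤ y) = C(F_X(x), F_Y(y)). As a rough illustration (not the study's fitted model), the sketch below evaluates P(SPEI ≤ s | R ≤ r) = C(F_R(r), F_SPEI(s)) / F_R(r) with a Frank copula; the dependence parameter theta and both marginals are placeholders.

        import numpy as np
        from scipy import stats

        def frank_copula(u, v, theta):
            """Frank copula C(u, v; theta), theta != 0."""
            num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
            return -np.log1p(num / (np.exp(-theta) - 1.0)) / theta

        # Placeholder marginals: gamma for monthly precipitation R, standard normal for SPEI
        F_R = stats.gamma(a=1.5, scale=30.0).cdf
        F_S = stats.norm(0.0, 1.0).cdf
        theta = 4.0        # assumed positive dependence: low precipitation accompanies low SPEI

        r, s = 10.0, -2.0  # e.g. P(SPEI <= -2 | R <= 10 mm)
        p_cond = frank_copula(F_R(r), F_S(s), theta) / F_R(r)
        print(round(p_cond, 3))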

  13. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE PAGES

    Fan, Linlin; Wang, Hongrui; Wang, Cheng; ...

    2017-05-16

    Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ 10 mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.

  14. The global impact distribution of Near-Earth objects

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.

    2016-02-01

    Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases with objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and, thus, represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.

  15. Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  16. Spatial estimation from remotely sensed data via empirical Bayes models

    NASA Technical Reports Server (NTRS)

    Hill, J. R.; Hinkley, D. V.; Kostal, H.; Morris, C. N.

    1984-01-01

    Multichannel satellite image data, available as LANDSAT imagery, are recorded as a multivariate time series (four channels, multiple passovers) in two spatial dimensions. The application of parametric empirical Bayes theory to classification of, and estimating the probability of, each crop type at each of a large number of pixels is considered. This theory involves both the probability distribution of imagery data, conditional on crop types, and the prior spatial distribution of crop types. For the latter, Markov models indexed by estimable parameters are used. A broad outline of the general theory reveals several questions for further research. Some detailed results are given for the special case of two crop types when only a line transect is analyzed. Finally, the estimation of an underlying continuous process on the lattice is discussed, which would be applicable to such quantities as crop yield.

  17. A global logrank test for adaptive treatment strategies based on observational studies.

    PubMed

    Li, Zhiguo; Valenstein, Marcia; Pfeiffer, Paul; Ganoczy, Dara

    2014-02-28

    In studying adaptive treatment strategies, a natural question that is of paramount interest is whether there is any significant difference among all possible treatment strategies. When the outcome variable of interest is time-to-event, we propose an inverse probability weighted logrank test for testing the equivalence of a fixed set of pre-specified adaptive treatment strategies based on data from an observational study. The weights take into account both the possible selection bias in an observational study and the fact that the same subject may be consistent with more than one treatment strategy. The asymptotic distribution of the weighted logrank statistic under the null hypothesis is obtained. We show that, in an observational study where the treatment selection probabilities need to be estimated, the estimation of these probabilities does not have an effect on the asymptotic distribution of the weighted logrank statistic, as long as the estimation of the parameters in the models for these probabilities is n-consistent. Finite sample performance of the test is assessed via a simulation study. We also show in the simulation that the test can be pretty robust to misspecification of the models for the probabilities of treatment selection. The method is applied to analyze data on antidepressant adherence time from an observational database maintained at the Department of Veterans Affairs' Serious Mental Illness Treatment Research and Evaluation Center. Copyright © 2013 John Wiley & Sons, Ltd.

  18. Multilevel sequential Monte Carlo samplers

    DOE PAGES

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; ...

    2016-08-24

    Here, we study the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods and leading to a discretisation bias, with step-size h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretisation levels ∞ > h_0 > h_1 > ... > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context.
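
    The telescoping identity at the heart of MLMC can be illustrated independently of the PDE and SMC machinery. In the toy sketch below (Python/NumPy), the "level-l model" is just a biased estimate of E[X^2] for a standard normal X, with a discretisation bias proportional to h_l = 2^(-l); the per-level sample sizes are arbitrary. The estimator combines E[g_0] with coupled level corrections E[g_l - g_{l-1}], using far fewer samples on the expensive fine levels.

        import numpy as np
        rng = np.random.default_rng(2)

        def g(x, level):
            """Toy 'level-l approximation' of x**2 with an O(h_l) discretisation bias."""
            h = 2.0 ** (-level)
            return x ** 2 + h

        L = 5
        n_per_level = [10000 // (2 ** l) for l in range(L + 1)]  # fewer samples on finer levels

        # Coarsest-level estimate of E[g_0]
        x0 = rng.standard_normal(n_per_level[0])
        estimate = g(x0, 0).mean()

        # Telescoping corrections E[g_l - g_{l-1}], computed on coupled (shared) samples
        for l in range(1, L + 1):
            x = rng.standard_normal(n_per_level[l])
            estimate += (g(x, l) - g(x, l - 1)).mean()

        print("MLMC estimate of E[X^2]:", round(estimate, 3))  # true value 1.0, plus residual bias h_L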

  19. Iterative updating of model error for Bayesian inversion

    NASA Astrophysics Data System (ADS)

    Calvetti, Daniela; Dunlop, Matthew; Somersalo, Erkki; Stuart, Andrew

    2018-02-01

    In computational inverse problems, it is common that a detailed and accurate forward model is approximated by a computationally less challenging substitute. The model reduction may be necessary to meet constraints in computing time when optimization algorithms are used to find a single estimate, or to speed up Markov chain Monte Carlo (MCMC) calculations in the Bayesian framework. The use of an approximate model introduces a discrepancy, or modeling error, that may have a detrimental effect on the solution of the ill-posed inverse problem, or it may severely distort the estimate of the posterior distribution. In the Bayesian paradigm, the modeling error can be considered as a random variable, and by using an estimate of the probability distribution of the unknown, one may estimate the probability distribution of the modeling error and incorporate it into the inversion. We introduce an algorithm which iterates this idea to update the distribution of the model error, leading to a sequence of posterior distributions that are demonstrated empirically to capture the underlying truth with increasing accuracy. Since the algorithm is not based on rejections, it requires only limited full model evaluations. We show analytically that, in the linear Gaussian case, the algorithm converges geometrically fast with respect to the number of iterations when the data is finite dimensional. For more general models, we introduce particle approximations of the iteratively generated sequence of distributions; we also prove that each element of the sequence converges in the large particle limit under a simplifying assumption. We show numerically that, as in the linear case, rapid convergence occurs with respect to the number of iterations. Additionally, we show through computed examples that point estimates obtained from this iterative algorithm are superior to those obtained by neglecting the model error.
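
    In the linear Gaussian setting mentioned above, the iteration can be written out explicitly: draw samples from the current posterior approximation, evaluate the discrepancy between the accurate and reduced forward maps at those samples, fit a Gaussian to the discrepancy, and fold its mean and covariance into the likelihood before recomputing the posterior. The sketch below (Python/NumPy) follows that loop; the two forward matrices, the prior, and the noise level are made-up toy quantities, not the paper's examples.

        import numpy as np
        rng = np.random.default_rng(3)

        n = 4
        A_fine = rng.standard_normal((n, n))                  # "accurate" forward model (toy)
        A_red = A_fine + 0.3 * rng.standard_normal((n, n))    # cheap approximate model
        Gamma = 0.05 ** 2 * np.eye(n)                         # observation noise covariance
        C0 = np.eye(n)                                        # Gaussian prior N(0, C0)

        u_true = rng.standard_normal(n)
        y = A_fine @ u_true + 0.05 * rng.standard_normal(n)

        m_err, C_err = np.zeros(n), np.zeros((n, n))          # current model-error statistics
        for it in range(5):
            # Posterior for the reduced model with inflated noise Gamma + C_err and shifted data
            noise = Gamma + C_err
            K = C0 @ A_red.T @ np.linalg.inv(A_red @ C0 @ A_red.T + noise)
            m_post = K @ (y - m_err)
            C_post = C0 - K @ A_red @ C0
            C_post = 0.5 * (C_post + C_post.T)                # enforce symmetry numerically
            # Re-estimate the model error e = (A_fine - A_red) u from posterior samples
            samples = rng.multivariate_normal(m_post, C_post, size=500)
            errs = samples @ (A_fine - A_red).T
            m_err, C_err = errs.mean(axis=0), np.cov(errs, rowvar=False)

        print("posterior mean after iteration:", np.round(m_post, 2))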

  20. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
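
    A stripped-down version of such a Gibbs sampler, with two normal components of known equal variance, unknown component means and mixing proportion, and no genetic or permanent-environment random effects, is sketched below (Python/NumPy/SciPy). It alternates between sampling the latent "healthy/diseased" indicators and the parameters, and records a posterior probability of putative mastitis for each observation; the simulated data, priors and starting values are all illustrative.

        import numpy as np
        from scipy import stats
        rng = np.random.default_rng(4)

        # Simulated somatic-cell-score-like data: mixture of "healthy" and "diseased"
        y = np.concatenate([rng.normal(2.0, 1.0, 300), rng.normal(5.0, 1.0, 100)])
        n, sigma2 = len(y), 1.0                  # residual variance assumed known and equal

        mu = np.array([1.0, 6.0])                # initial component means
        pm = 0.5                                 # initial P(diseased)
        keep = np.zeros(n)

        n_iter, burn = 2000, 500
        for it in range(n_iter):
            # 1) latent indicators z_i | parameters
            like0 = (1 - pm) * stats.norm.pdf(y, mu[0], np.sqrt(sigma2))
            like1 = pm * stats.norm.pdf(y, mu[1], np.sqrt(sigma2))
            prob1 = like1 / (like0 + like1)
            z = rng.random(n) < prob1
            # 2) mixing proportion | z  (Beta(1, 1) prior)
            pm = rng.beta(1 + z.sum(), 1 + (~z).sum())
            # 3) component means | z  (flat prior, known variance)
            for k, idx in enumerate([~z, z]):
                nk = idx.sum()
                if nk:
                    mu[k] = rng.normal(y[idx].mean(), np.sqrt(sigma2 / nk))
            if it >= burn:
                keep += prob1

        post_prob_diseased = keep / (n_iter - burn)   # posterior membership probability per record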

  1. Modelling the spatial distribution of Fasciola hepatica in dairy cattle in Europe.

    PubMed

    Ducheyne, Els; Charlier, Johannes; Vercruysse, Jozef; Rinaldi, Laura; Biggeri, Annibale; Demeler, Janina; Brandt, Christina; De Waal, Theo; Selemetas, Nikolaos; Höglund, Johan; Kaba, Jaroslaw; Kowalczyk, Slawomir J; Hendrickx, Guy

    2015-03-26

    A harmonized sampling approach in combination with spatial modelling is required to update current knowledge of fasciolosis in dairy cattle in Europe. Within the scope of the EU project GLOWORM, samples from 3,359 randomly selected farms in 849 municipalities in Belgium, Germany, Ireland, Poland and Sweden were collected and their infection status assessed using an indirect bulk tank milk (BTM) enzyme-linked immunosorbent assay (ELISA). Dairy farms were considered exposed when the optical density ratio (ODR) exceeded the 0.3 cut-off. Two ensemble-modelling techniques, Random Forests (RF) and Boosted Regression Trees (BRT), were used to obtain the spatial distribution of the probability of exposure to Fasciola hepatica using remotely sensed environmental variables (1-km spatial resolution) and interpolated values from meteorological stations as predictors. The median ODRs amounted to 0.31, 0.12, 0.54, 0.25 and 0.44 for Belgium, Germany, Ireland, Poland and southern Sweden, respectively. Using the 0.3 threshold, 571 municipalities were categorized as positive and 429 as negative. RF was seen as capable of predicting the spatial distribution of exposure with an area under the receiver operating characteristic (ROC) curve (AUC) of 0.83 (0.96 for BRT). Both models identified rainfall and temperature as the most important factors for probability of exposure. Areas of high and low exposure were identified by both models, with BRT better at discriminating between low-probability and high-probability exposure; this model may therefore be more useful in practice. Given a harmonized sampling strategy, it should be possible to generate robust spatial models for fasciolosis in dairy cattle in Europe to be used as input for temporal models and for the detection of deviations in baseline probability. Further research is required for model output in areas outside the eco-climatic range investigated.

  2. Temperate Mountain Forest Biodiversity under Climate Change: Compensating Negative Effects by Increasing Structural Complexity

    PubMed Central

    Braunisch, Veronika; Coppes, Joy; Arlettaz, Raphaël; Suchant, Rudi; Zellweger, Florian; Bollmann, Kurt

    2014-01-01

    Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions under climate change. Yet, the populations of many endothermic species may not be primarily affected by physiological constraints, but indirectly by climate-induced changes of habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aiming at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km2 grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species’ occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC-scenario A1B, and changes in species’ occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distribution. These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change. PMID:24823495

  3. Temperate mountain forest biodiversity under climate change: compensating negative effects by increasing structural complexity.

    PubMed

    Braunisch, Veronika; Coppes, Joy; Arlettaz, Raphaël; Suchant, Rudi; Zellweger, Florian; Bollmann, Kurt

    2014-01-01

    Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions under climate change. Yet, the populations of many endothermic species may not be primarily affected by physiological constraints, but indirectly by climate-induced changes of habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aiming at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km2 grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species' occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC-scenario A1B, and changes in species' occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distribution. These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change.

  4. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    USGS Publications Warehouse

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promotes ecological advancements and conservation and management decisions that are better informed.
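
    The sampling-effort statement at the end of this abstract follows from the usual occupancy-style calculation: if a single seine haul detects a present species with probability p, the chance of at least one detection in k independent hauls is 1 - (1 - p)^k, and the effort needed to reach a target confidence is obtained by inverting that expression. A minimal sketch (Python; the per-haul detection probabilities below are invented, not the paper's estimates):

        import math

        def hauls_needed(p_detect, confidence=0.95):
            """Seine hauls required so that P(detect at least once | present) >= confidence."""
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_detect))

        for species, p in [("common cyprinid", 0.45), ("rare shiner", 0.25)]:  # hypothetical values
            print(species, "->", hauls_needed(p), "hauls for 95% confidence")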

  5. Quasi-Bell inequalities from symmetrized products of noncommuting qubit observables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamel, Omar E.; Fleming, Graham R.

    Noncommuting observables cannot be simultaneously measured; however, under local hidden variable models, they must simultaneously hold premeasurement values, implying the existence of a joint probability distribution. We study the joint distributions of noncommuting observables on qubits, with possible criteria of positivity and the Fréchet bounds limiting the joint probabilities, concluding that the latter may be negative. We use symmetrization, justified heuristically and then more carefully via the Moyal characteristic function, to find the quantum operator corresponding to the product of noncommuting observables. This is then used to construct Quasi-Bell inequalities, Bell inequalities containing products of noncommuting observables, on two qubits. These inequalities place limits on the local hidden variable models that define joint probabilities for noncommuting observables. We also found that the Quasi-Bell inequalities have a quantum to classical violation as high as 3/2 on two qubits, higher than for conventional Bell inequalities. Our result demonstrates the theoretical importance of noncommutativity in the nonlocality of quantum mechanics and provides an insightful generalization of Bell inequalities.

  6. Quasi-Bell inequalities from symmetrized products of noncommuting qubit observables

    DOE PAGES

    Gamel, Omar E.; Fleming, Graham R.

    2017-05-01

    Noncommuting observables cannot be simultaneously measured; however, under local hidden variable models, they must simultaneously hold premeasurement values, implying the existence of a joint probability distribution. We study the joint distributions of noncommuting observables on qubits, with possible criteria of positivity and the Fréchet bounds limiting the joint probabilities, concluding that the latter may be negative. We use symmetrization, justified heuristically and then more carefully via the Moyal characteristic function, to find the quantum operator corresponding to the product of noncommuting observables. This is then used to construct Quasi-Bell inequalities, Bell inequalities containing products of noncommuting observables, on two qubits. These inequalities place limits on the local hidden variable models that define joint probabilities for noncommuting observables. We also found that the Quasi-Bell inequalities have a quantum to classical violation as high as 3/2 on two qubits, higher than for conventional Bell inequalities. Our result demonstrates the theoretical importance of noncommutativity in the nonlocality of quantum mechanics and provides an insightful generalization of Bell inequalities.

  7. Isotropic probability measures in infinite-dimensional spaces

    NASA Technical Reports Server (NTRS)

    Backus, George

    1987-01-01

    Let R be the real numbers, R(n) the linear space of all real n-tuples, and R(infinity) the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n : R(infinity) → R(n) be the projection operator with P_n(x) = (x_1, ..., x_n). Let p(infinity) be a probability measure on the smallest sigma-ring of subsets of R(infinity) which includes all of the cylinder sets P_n^(-1)(B_n), where B_n is an arbitrary Borel subset of R(n). Let p_n be the marginal distribution of p(infinity) on R(n), so p_n(B_n) = p(infinity)(P_n^(-1)(B_n)) for each B_n. A measure on R(n) is isotropic if it is invariant under all orthogonal transformations of R(n). All members of the set of all isotropic probability distributions on R(n) are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.

  8. Testing typicality in multiverse cosmology

    NASA Astrophysics Data System (ADS)

    Azhar, Feraz

    2015-05-01

    In extracting predictions from theories that describe a multiverse, we face the difficulty that we must assess probability distributions over possible observations prescribed not just by an underlying theory, but by a theory together with a conditionalization scheme that allows for (anthropic) selection effects. This means we usually need to compare distributions that are consistent with a broad range of possible observations with actual experimental data. One controversial means of making this comparison is by invoking the "principle of mediocrity": that is, the principle that we are typical of the reference class implicit in the conjunction of the theory and the conditionalization scheme. In this paper, we quantitatively assess the principle of mediocrity in a range of cosmological settings, employing "xerographic distributions" to impose a variety of assumptions regarding typicality. We find that for a fixed theory, the assumption that we are typical gives rise to higher likelihoods for our observations. If, however, one allows both the underlying theory and the assumption of typicality to vary, then the assumption of typicality does not always provide the highest likelihoods. Interpreted from a Bayesian perspective, these results support the claim that when one has the freedom to consider different combinations of theories and xerographic distributions (or different "frameworks"), one should favor the framework that has the highest posterior probability; and then from this framework one can infer, in particular, how typical we are. In this way, the invocation of the principle of mediocrity is more questionable than has been recently claimed.

  9. Evaluation of Gas Phase Dispersion in Flotation under Predetermined Hydrodynamic Conditions

    NASA Astrophysics Data System (ADS)

    Młynarczykowska, Anna; Oleksik, Konrad; Tupek-Murowany, Klaudia

    2018-03-01

    Results of various investigations show the relationship between the flotation parameters and gas distribution in a flotation cell. The size of gas bubbles is a random variable with a specific distribution, and the analysis of this distribution is useful for the mathematical description of the flotation process. The flotation process depends on many variable factors, mainly events such as the collision of a single particle with a gas bubble, the adhesion of the particle to the bubble surface, and the detachment process. These factors are characterized by randomness. Because of that, it is only possible to talk about the probability of occurrence of one of these events, which directly affects the speed of the process, and thus the flotation rate constant. The probability of a bubble-particle collision in a flotation chamber with mechanical pulp agitation depends on the surface tension of the solution, air consumption, degree of pulp aeration, energy dissipation and average feed particle size. Appropriate identification and description of the parameters of the dispersion of gas bubbles helps to complete the analysis of the flotation process under specific physicochemical and hydrodynamic conditions for any raw material. The article presents the results of measurements and analysis of the gas phase dispersion through the size distribution of air bubbles in a flotation chamber under fixed hydrodynamic conditions. The tests were carried out in the Laboratory of Instrumental Methods in the Department of Environmental Engineering and Mineral Processing, Faculty of Mining and Geoengineering, AGH University of Science and Technology in Krakow.

  10. Identifying the rooted species tree from the distribution of unrooted gene trees under the coalescent.

    PubMed

    Allman, Elizabeth S; Degnan, James H; Rhodes, John A

    2011-06-01

    Gene trees are evolutionary trees representing the ancestry of genes sampled from multiple populations. Species trees represent populations of individuals-each with many genes-splitting into new populations or species. The coalescent process, which models ancestry of gene copies within populations, is often used to model the probability distribution of gene trees given a fixed species tree. This multispecies coalescent model provides a framework for phylogeneticists to infer species trees from gene trees using maximum likelihood or Bayesian approaches. Because the coalescent models a branching process over time, all trees are typically assumed to be rooted in this setting. Often, however, gene trees inferred by traditional phylogenetic methods are unrooted. We investigate probabilities of unrooted gene trees under the multispecies coalescent model. We show that when there are four species with one gene sampled per species, the distribution of unrooted gene tree topologies identifies the unrooted species tree topology and some, but not all, information in the species tree edges (branch lengths). The location of the root on the species tree is not identifiable in this situation. However, for 5 or more species with one gene sampled per species, we show that the distribution of unrooted gene tree topologies identifies the rooted species tree topology and all its internal branch lengths. The length of any pendant branch leading to a leaf of the species tree is also identifiable for any species from which more than one gene is sampled.
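
    For the four-species, one-gene-per-species case discussed here, the multispecies coalescent gives a closed-form distribution over the three unrooted quartet topologies: the topology concordant with the species tree has probability 1 - (2/3)e^(-t) and each discordant topology has probability (1/3)e^(-t), where t is the species tree's internal branch length in coalescent units. The sketch below simply evaluates these textbook probabilities for illustration (Python); it is not code from the paper.

        import math

        def unrooted_quartet_probs(t):
            """Probabilities of the 3 unrooted gene tree topologies for 4 species,
            given the species tree's internal branch length t in coalescent units."""
            discordant = math.exp(-t) / 3.0
            concordant = 1.0 - 2.0 * discordant
            return concordant, discordant, discordant

        for t in (0.1, 1.0, 3.0):
            c, d1, d2 = unrooted_quartet_probs(t)
            assert abs(c + d1 + d2 - 1.0) < 1e-12
            print(f"t = {t}: concordant {c:.3f}, each discordant {d1:.3f}")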

  11. Probability theory versus simulation of petroleum potential in play analysis

    USGS Publications Warehouse

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.

  12. Regional analysis and derivation of copula-based drought Severity-Area-Frequency curve in Lake Urmia basin, Iran.

    PubMed

    Amirataee, Babak; Montaseri, Majid; Rezaie, Hossein

    2018-01-15

    Droughts are extreme events characterized by temporal duration and spatial large-scale effects. In general, regional droughts are affected by the general circulation of the atmosphere (at large scale) and by regional natural factors, including the topography, natural lakes, and the position relative to the center and path of the ocean currents (at small scale), and they do not produce exactly the same effects over a wide area. Therefore, investigation of the drought Severity-Area-Frequency (S-A-F) curve is an essential task for developing decision-making rules for regional drought management. This study developed the copula-based joint probability distribution of drought severity and percent of area under drought across the Lake Urmia basin, Iran. To this end, one-month Standardized Precipitation Index (SPI) values during 1971-2013 were applied across 24 rainfall stations in the study area. Then, seven copula functions of various families, including Clayton, Gumbel, Frank, Joe, Galambos, Plackett and Normal copulas, were used to model the joint probability distribution of drought severity and drought area. Using AIC, BIC and RMSE criteria, the Frank copula was selected as the most appropriate copula to develop the joint probability distribution of severity-percent of area under drought across the study area. Based on the Frank copula, the drought S-A-F curve for the study area was derived. The results indicated that severe/extreme drought and non-drought (wet) behaviors have affected the majority of the study area (Lake Urmia basin). However, the area covered by specific semi-drought effects is limited and has been subject to significant variations. Copyright © 2017 Elsevier Ltd. All rights reserved.
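
    Once a copula C and the two marginals are chosen, a severity-area-frequency point follows from the joint exceedance probability P(S > s, A > a) = 1 - F_S(s) - F_A(a) + C(F_S(s), F_A(a)) and the associated return period. A rough sketch with a Frank copula and placeholder marginals (Python/SciPy; the parameter theta and the marginal fits below are assumptions, not the study's fitted values):

        import numpy as np
        from scipy import stats

        def frank(u, v, theta):
            """Frank copula CDF C(u, v; theta), theta != 0."""
            num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
            return -np.log1p(num / (np.exp(-theta) - 1.0)) / theta

        # Placeholder marginals for drought severity S and fraction of area under drought A
        F_S = stats.gamma(a=2.0, scale=1.5).cdf
        F_A = stats.beta(2.0, 3.0).cdf
        theta = 5.0                                   # assumed positive dependence

        s, a = 4.0, 0.6                               # a severity/area pair of interest
        u, v = F_S(s), F_A(a)
        p_exceed = 1.0 - u - v + frank(u, v, theta)   # P(S > s, A > a)
        return_period = 1.0 / p_exceed                # in units of the time step (here months)
        print(round(p_exceed, 4), round(return_period, 1))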

  13. Landscape-scale distribution and density of raptor populations wintering in anthropogenic-dominated desert landscapes

    USGS Publications Warehouse

    Duerr, Adam E.; Miller, Tricia A.; Cornell Duerr, Kerri L; Lanzone, Michael J.; Fesnock, Amy; Katzner, Todd E.

    2015-01-01

    Anthropogenic development has great potential to affect fragile desert environments. Large-scale development of renewable energy infrastructure is planned for many desert ecosystems. Development plans should account for anthropogenic effects on the distributions and abundance of rare or sensitive wildlife; however, baseline data on the abundance and distribution of such wildlife are often lacking. We surveyed for predatory birds in the Sonoran and Mojave Deserts of southern California, USA, in an area designated for protection under the “Desert Renewable Energy Conservation Plan”, to determine how these birds are distributed across the landscape and how this distribution is affected by existing development. First, we developed species-specific models of resight probability to adjust estimates of abundance and density for each common species. Second, we developed combined-species models of resight probability for common and rare species so that we could make use of sparse data on the latter. We determined that many common species, such as red-tailed hawks, loggerhead shrikes, and especially common ravens, are associated with human development and likely subsidized by human activity. Species-specific and combined-species models of resight probability performed similarly, although the former model type provided higher quality information. Comparing abundance estimates with past surveys in the Mojave Desert suggests that numbers of predatory birds associated with human development have increased while other sensitive species not associated with development have decreased. This approach gave us information beyond what we would have collected by focusing on either common or rare species alone, thus it provides a low-cost framework for others conducting surveys in similar desert environments outside of California.

  14. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for the probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.
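
    The "remarkably simple Monte Carlo realization" mentioned here amounts to simulating a branching process: start a chain with one neutron, and let each neutron either leak (and possibly be detected) or induce a fission that adds a random number of new neutrons, until the chain dies out. The sketch below (Python/NumPy) tallies fissions and leaked neutrons per chain and their low-order moments; the fission probability, multiplicity distribution and detection efficiency are invented toy numbers, not the paper's.

        import numpy as np
        rng = np.random.default_rng(5)

        # Toy parameters: per-neutron fission probability, induced-fission multiplicity
        # distribution P(nu = 0..5), and leaked-neutron detection efficiency
        p_fission = 0.3
        mult_pmf = np.array([0.03, 0.16, 0.33, 0.30, 0.14, 0.04])
        eff = 0.2

        def one_chain():
            neutrons, fissions, leaked = 1, 0, 0
            while neutrons:
                neutrons -= 1
                if rng.random() < p_fission:
                    fissions += 1
                    neutrons += rng.choice(len(mult_pmf), p=mult_pmf)
                else:
                    leaked += 1
            return fissions, leaked

        counts = np.array([one_chain() for _ in range(20000)])
        detected = rng.binomial(counts[:, 1], eff)       # thinned leaked neutrons
        print("mean fissions per chain:", counts[:, 0].mean())
        print("detected-count mean and variance:", detected.mean(), detected.var())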

  15. Estimating the probability of mountain pine beetle red-attack damage

    Treesearch

    Michael A Wulder; J. C. White; Barbara J Bentz; M. F. Alvarez; N. C. Coops

    2006-01-01

    Accurate spatial information on the location and extent of mountain pine beetle infestation is critical for the planning of mitigation and treatment activities. Areas of mixed forest and variable terrain present unique challenges for the detection and mapping of mountain pine beetle red-attack damage, as red-attack has a more heterogeneous distribution under these...

  16. ProbOnto: ontology and knowledge base of probability distributions.

    PubMed

    Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala

    2016-09-01

    Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. http://probonto.org mjswat@ebi.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  17. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  18. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
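
    The model-selection step described here, fitting candidate distributions to elapsed (inter-event) times and keeping the one the Kolmogorov-Smirnov test favours, can be reproduced with SciPy. In the sketch below the inter-event times are synthetic placeholders; scipy.stats.weibull_min stands in for the two- and three-parameter Weibull and scipy.stats.invweibull for the Fréchet distribution.

        import numpy as np
        from scipy import stats
        rng = np.random.default_rng(6)

        # Placeholder inter-event times (years) for M >= 6.0 earthquakes
        times = stats.weibull_min.rvs(1.4, scale=8.0, size=60, random_state=rng)

        candidates = {
            "Weibull (2-par)": (stats.weibull_min, dict(floc=0)),  # location fixed at 0
            "Weibull (3-par)": (stats.weibull_min, {}),            # location free
            "Frechet": (stats.invweibull, dict(floc=0)),
        }

        for name, (dist, fit_kw) in candidates.items():
            params = dist.fit(times, **fit_kw)
            ks_stat, p_value = stats.kstest(times, dist.name, args=params)
            print(f"{name}: K-S statistic = {ks_stat:.3f}, p = {p_value:.3f}")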

  19. Interference experiment with asymmetric double slit by using 1.2-MV field emission transmission electron microscope.

    PubMed

    Harada, Ken; Akashi, Tetsuya; Niitsu, Kodai; Shimada, Keiko; Ono, Yoshimasa A; Shindo, Daisuke; Shinada, Hiroyuki; Mori, Shigeo

    2018-01-17

    Advanced electron microscopy technologies have made it possible to perform precise double-slit interference experiments. We used a 1.2-MV field emission electron microscope providing coherent electron waves and a direct detection camera system enabling single-electron detections at a sub-second exposure time. We developed a method to perform the interference experiment by using an asymmetric double-slit fabricated by a focused ion beam instrument and by operating the microscope under a "pre-Fraunhofer" condition, different from the Fraunhofer condition of conventional double-slit experiments. Here, pre-Fraunhofer condition means that each single-slit observation was performed under the Fraunhofer condition, while the double-slit observations were performed under the Fresnel condition. The interference experiments with each single slit and with the asymmetric double slit were carried out under two different electron dose conditions: high-dose for calculation of electron probability distribution and low-dose for each single electron distribution. Finally, we exemplified the distribution of single electrons by color-coding according to the above three types of experiments as a composite image.

  20. Lost in search: (Mal-)adaptation to probabilistic decision environments in children and adults.

    PubMed

    Betsch, Tilmann; Lehmann, Anne; Lindow, Stefanie; Lang, Anna; Schoemann, Martin

    2016-02-01

    Adaptive decision making in probabilistic environments requires individuals to use probabilities as weights in predecisional information searches and/or when making subsequent choices. Within a child-friendly computerized environment (Mousekids), we tracked 205 children's (105 children 5-6 years of age and 100 children 9-10 years of age) and 103 adults' (age range: 21-22 years) search behaviors and decisions under different probability dispersions (.17; .33, .83 vs. .50, .67, .83) and constraint conditions (instructions to limit search: yes vs. no). All age groups limited their depth of search when instructed to do so and when probability dispersion was high (range: .17-.83). Unlike adults, children failed to use probabilities as weights for their searches, which were largely not systematic. When examining choices, however, elementary school children (unlike preschoolers) systematically used probabilities as weights in their decisions. This suggests that an intuitive understanding of probabilities and the capacity to use them as weights during integration is not a sufficient condition for applying simple selective search strategies that place one's focus on weight distributions. PsycINFO Database Record (c) 2016 APA, all rights reserved.

  1. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    PubMed

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
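
    The "simple model" sketched in this abstract combines the distribution of lifetime numbers of sex partners with a per-partnership transmission probability: P(ever infected) = sum over n of f(n) * [1 - (1 - p)^n]. The snippet below evaluates that expression for an invented partner-number distribution and per-partnership probability; it does not reproduce the published parameter estimates.

        # Hypothetical distribution of lifetime number of opposite-sex partners (among those with >= 1)
        partner_pmf = {1: 0.15, 2: 0.10, 3: 0.10, 5: 0.20, 10: 0.25, 20: 0.20}
        p_per_partner = 0.4   # assumed per-partnership probability of acquiring HPV

        lifetime_prob = sum(f * (1.0 - (1.0 - p_per_partner) ** n)
                            for n, f in partner_pmf.items())
        print(f"lifetime probability of acquiring HPV: {lifetime_prob:.1%}")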

  2. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for the probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.

  3. AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA

    NASA Technical Reports Server (NTRS)

    Cheeseman, P. C.

    1994-01-01

    The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5.4, VAX/Ultrix v4.1, and MIPS/Ultrix v4, rev. 179; and on the Macintosh personal computer. The minimum Macintosh required is the IIci. This program will not run under CMU Common Lisp or VAX/VMS DEC Common Lisp. A minimum of 8Mb of RAM is required for Macintosh platforms and 16Mb for workstations. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 3.5 inch diskette in Macintosh format. An electronic copy of the documentation is included on the distribution medium. AUTOCLASS was developed between March 1988 and March 1992. It was initially released in May 1991. Sun is a trademark of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation. Macintosh is a trademark of Apple Computer, Inc. Allegro CL is a registered trademark of Franz, Inc.

  4. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
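
    The effect described, a symmetric fit overestimating the mode of a right-skewed roughness distribution, can be illustrated with a quick fit comparison. The lognormal is used here only as one example of an asymmetric candidate, and the data are synthetic; the paper's specific asymmetric distribution and measurements are not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical RMS surface roughness measurements (nm); right-skewed toy data.
rng = np.random.default_rng(1)
roughness = rng.lognormal(mean=1.0, sigma=0.4, size=500)

# Symmetric fit: the Gaussian mode equals its mean.
mu, sigma = stats.norm.fit(roughness)
gaussian_mode = mu

# Asymmetric fit: for a lognormal, mode = exp(mu - sigma^2), which lies below the mean.
shape, loc, scale = stats.lognorm.fit(roughness, floc=0)
lognormal_mode = scale * np.exp(-shape**2)

print(f"Gaussian mode (== mean): {gaussian_mode:.2f}")
print(f"Lognormal mode:          {lognormal_mode:.2f}")
```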

  5. Three statistical models for estimating length of stay.

    PubMed Central

    Selvin, S

    1977-01-01

    The probability density functions implied by three methods of collecting data on the length of stay in an institution are derived. The expected values associated with these density functions are used to calculate unbiased estimates of the expected length of stay. Two of the methods require an assumption about the form of the underlying distribution of length of stay; the third method does not. The three methods are illustrated with hypothetical data exhibiting the Poisson distribution, and the third (distribution-independent) method is used to estimate the length of stay in a skilled nursing facility and in an intermediate care facility for patients enrolled in California's MediCal program. PMID:914532

  6. Three statistical models for estimating length of stay.

    PubMed

    Selvin, S

    1977-01-01

    The probability density functions implied by three methods of collecting data on the length of stay in an institution are derived. The expected values associated with these density functions are used to calculate unbiased estimates of the expected length of stay. Two of the methods require an assumption about the form of the underlying distribution of length of stay; the third method does not. The three methods are illustrated with hypothetical data exhibiting the Poisson distribution, and the third (distribution-independent) method is used to estimate the length of stay in a skilled nursing facility and in an intermediate care facility for patients enrolled in California's MediCal program.

  7. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  8. Crime and punishment: the economic burden of impunity

    NASA Astrophysics Data System (ADS)

    Gordon, M. B.; Iglesias, J. R.; Semeshenko, V.; Nadal, J. P.

    2009-03-01

    Crime is an economically relevant activity. It may represent a mechanism of wealth distribution but also a social and economic burden because of the interference with regular legal activities and the cost of the law enforcement system. Sometimes it may be less costly for the society to allow for some level of criminality. However, a drawback of such a policy is that it may lead to a high increase of criminal activity, which may become hard to reduce later on. Here we investigate the level of law enforcement required to keep crime within acceptable limits. A sharp phase transition is observed as a function of the probability of punishment. We also analyze other consequences of criminality, such as the growth of the economy, the inequality in the wealth distribution (the Gini coefficient) and other relevant quantities under different scenarios of criminal activity and probabilities of apprehension.

  9. Model microswimmers in channels with varying cross section

    NASA Astrophysics Data System (ADS)

    Malgaretti, Paolo; Stark, Holger

    2017-05-01

    We study different types of microswimmers moving in channels with varying cross section and thereby interacting hydrodynamically with the channel walls. Starting from the Smoluchowski equation for a dilute suspension, for which interactions among swimmers can be neglected, we derive analytic expressions for the lateral probability distribution between plane channel walls. For weakly corrugated channels, we extend the Fick-Jacobs approach to microswimmers and thereby derive an effective equation for the probability distribution along the channel axis. Two regimes arise dominated either by entropic forces due to the geometrical confinement or by the active motion. In particular, our results show that the accumulation of microswimmers at channel walls is sensitive to both the underlying swimming mechanism and the geometry of the channels. Finally, for asymmetric channel corrugation, our model predicts a rectification of microswimmers along the channel, the strength and direction of which strongly depends on the swimmer type.

  10. Performance of multi-hop parallel free-space optical communication over gamma-gamma fading channel with pointing errors.

    PubMed

    Gao, Zhengguang; Liu, Hongzhan; Ma, Xiaoping; Lu, Wei

    2016-11-10

    Multi-hop parallel relaying is considered in a free-space optical (FSO) communication system deploying binary phase-shift keying (BPSK) modulation under the combined effects of a gamma-gamma (GG) distribution and misalignment fading. Based on the best path selection criterion, the cumulative distribution function (CDF) of this cooperative random variable is derived. Then the performance of this optical mesh network is analyzed in detail. A Monte Carlo simulation is also conducted to demonstrate the effectiveness of the results for the average bit error rate (ABER) and outage probability. The numerical results show that a smaller average transmitted optical power is needed to achieve the same ABER and outage probability when using the multi-hop parallel network in FSO links. Furthermore, using more hops and cooperative paths can improve the quality of the communication.

  11. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.

  12. Hantavirus reservoir Oligoryzomys longicaudatus spatial distribution sensitivity to climate change scenarios in Argentine Patagonia

    PubMed Central

    Carbajo, Aníbal E; Vera, Carolina; González, Paula LM

    2009-01-01

    Background Oligoryzomys longicaudatus (colilargo) is the rodent responsible for hantavirus pulmonary syndrome (HPS) in Argentine Patagonia. In past decades (1967–1998), trends of precipitation reduction and surface air temperature increase have been observed in western Patagonia. We explore how the potential distribution of the hantavirus reservoir would change under different climate change scenarios based on the observed trends. Methods Four scenarios of potential climate change were constructed using temperature and precipitation changes observed in Argentine Patagonia between 1967 and 1998: Scenario 1 assumed no change in precipitation but a temperature trend as observed; Scenario 2 assumed no changes in temperature but a precipitation trend as observed; Scenario 3 included changes in both temperature and precipitation trends as observed; Scenario 4 assumed changes in both temperature and precipitation trends as observed but doubled. We used a validated spatial distribution model of O. longicaudatus as a function of temperature and precipitation. From the model, the probability of rodent presence was calculated for each scenario. Results If changes in precipitation follow previous trends, the probability of colilargo presence would decline in the HPS transmission zone of northern Patagonia. If temperature and precipitation trends remain at current levels for 60 years or double in the next 30 years, the probability of rodent presence and the associated total area of potential distribution would diminish throughout Patagonia; the areas of potential distribution for colilargos would shift eastwards. These results suggest that future changes in the Patagonian climate may lower transmission risk through a reduction in the potential distribution of the rodent reservoir. Conclusion According to our model, the rates of temperature and precipitation changes observed between 1967 and 1998 may produce significant changes in the rodent distribution in an equivalent period of time only in certain areas. If these changes are maintained for 60 years or doubled in 30 years, the hantavirus reservoir Oligoryzomys longicaudatus may contract its distribution in Argentine Patagonia extensively. PMID:19607707

  13. Singular solution of the Feller diffusion equation via a spectral decomposition.

    PubMed

    Gan, Xinjun; Waxman, David

    2015-01-01

    Feller studied a branching process and found that the distribution for this process approximately obeys a diffusion equation [W. Feller, in Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability (University of California Press, Berkeley and Los Angeles, 1951), pp. 227-246]. This diffusion equation and its generalizations play an important role in many scientific problems, including, physics, biology, finance, and probability theory. We work under the assumption that the fundamental solution represents a probability density and should account for all of the probability in the problem. Thus, under the circumstances where the random process can be irreversibly absorbed at the boundary, this should lead to the presence of a Dirac delta function in the fundamental solution at the boundary. However, such a feature is not present in the standard approach (Laplace transformation). Here we require that the total integrated probability is conserved. This yields a fundamental solution which, when appropriate, contains a term proportional to a Dirac delta function at the boundary. We determine the fundamental solution directly from the diffusion equation via spectral decomposition. We obtain exact expressions for the eigenfunctions, and when the fundamental solution contains a Dirac delta function at the boundary, every eigenfunction of the forward diffusion operator contains a delta function. We show how these combine to produce a weight of the delta function at the boundary which ensures the total integrated probability is conserved. The solution we present covers cases where parameters are time dependent, thereby greatly extending its applicability.

  14. Singular solution of the Feller diffusion equation via a spectral decomposition

    NASA Astrophysics Data System (ADS)

    Gan, Xinjun; Waxman, David

    2015-01-01

    Feller studied a branching process and found that the distribution for this process approximately obeys a diffusion equation [W. Feller, in Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability (University of California Press, Berkeley and Los Angeles, 1951), pp. 227-246]. This diffusion equation and its generalizations play an important role in many scientific problems, including, physics, biology, finance, and probability theory. We work under the assumption that the fundamental solution represents a probability density and should account for all of the probability in the problem. Thus, under the circumstances where the random process can be irreversibly absorbed at the boundary, this should lead to the presence of a Dirac delta function in the fundamental solution at the boundary. However, such a feature is not present in the standard approach (Laplace transformation). Here we require that the total integrated probability is conserved. This yields a fundamental solution which, when appropriate, contains a term proportional to a Dirac delta function at the boundary. We determine the fundamental solution directly from the diffusion equation via spectral decomposition. We obtain exact expressions for the eigenfunctions, and when the fundamental solution contains a Dirac delta function at the boundary, every eigenfunction of the forward diffusion operator contains a delta function. We show how these combine to produce a weight of the delta function at the boundary which ensures the total integrated probability is conserved. The solution we present covers cases where parameters are time dependent, thereby greatly extending its applicability.

  15. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    USGS Publications Warehouse

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data could result in better informed management decisions and assist in guidance for more effective estuarine restoration projects.

  16. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
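
    The kind of biomedical Bayes' theorem application mentioned above is easy to make concrete. The worked example below uses invented prevalence, sensitivity, and specificity values purely to illustrate the base-rate effect; it is not drawn from the course materials.

```python
# Worked Bayes' theorem example with made-up numbers: probability of disease
# given a positive diagnostic test.
prevalence = 0.01      # P(disease)
sensitivity = 0.95     # P(positive | disease)
specificity = 0.90     # P(negative | no disease)

p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")  # ~0.088
```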

  17. A method of decision analysis quantifying the effects of age and comorbidities on the probability of deriving significant benefit from medical treatments

    PubMed Central

    Bean, Nigel G.; Ruberu, Ravi P.

    2017-01-01

    Background The external validity, or generalizability, of trials and guidelines has been considered poor in the context of multiple morbidity. How multiple morbidity might affect the magnitude of benefit of a given treatment, and thereby external validity, has had little study. Objective To provide a method of decision analysis to quantify the effects of age and comorbidity on the probability of deriving a given magnitude of treatment benefit. Design We developed a method to calculate probabilistically the effect of all of a patient’s comorbidities on their underlying utility, or well-being, at a future time point. From this, we derived a distribution of possible magnitudes of treatment benefit at that future time point. We then expressed this distribution as the probability of deriving at least a given magnitude of treatment benefit. To demonstrate the applicability of this method of decision analysis, we applied it to the treatment of hypercholesterolaemia in a geriatric population of 50 individuals. We highlighted the results of four of these individuals. Results This method of analysis provided individualized quantifications of the effect of age and comorbidity on the probability of treatment benefit. The average probability of deriving a benefit, of at least 50% of the magnitude of benefit available to an individual without comorbidity, was only 0.8%. Conclusion The effects of age and comorbidity on the probability of deriving significant treatment benefits can be quantified for any individual. Even without consideration of other factors affecting external validity, these effects may be sufficient to guide decision-making. PMID:29090189

  18. Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data

    NASA Astrophysics Data System (ADS)

    Nyaupane, Narayan; Parajuli, Ranjan; Kalra, Ajay

    2017-12-01

    Flooding is the most severe and costliest natural hazard in the US. The effects of climate change have intensified the scenario in recent years. Flood prevention practice, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one technique to quantify the severity of flooding. Carson City, an agricultural area in the Nevada desert, has experienced peak floods in recent years. To identify the underlying probability distribution for the area, the latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions. The best-fitted distribution was used to forecast the 100-year flood (design flood). Data from 1950-2099, derived from 31 models and a total of 97 projections, were used to predict future streamflow. The delta change method was adopted to quantify the magnitude of the future (2050-2099) flood. To determine the extent of flooding, three scenarios, (i) the historic design flood, (ii) the 500-year flood, and (iii) the future 100-year flood, were routed through a HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood. The future design flood could exceed the historic 500-year flood. At the same time, the extent of flooding could go beyond the historic flood of 0.2% annual probability. This study suggests an approach to quantify the future flood and floodplain using climate model projections. The study provides helpful information to facility managers, design engineers, and stakeholders.
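
    The core fitting step, choosing a best distribution for annual peak flows and reading off the 1% annual-exceedance (100-year) quantile, can be sketched as below. The candidate list is trimmed to three distributions, the data are synthetic, and AIC is used as a simple selection criterion; the study's 27 distributions, delta change adjustment, and HEC-RAS routing are not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak streamflow series (m^3/s); stands in for CMIP5-derived data.
rng = np.random.default_rng(2)
annual_peaks = stats.gumbel_r.rvs(loc=300, scale=80, size=60, random_state=rng)

candidates = {"gumbel_r": stats.gumbel_r,
              "lognorm": stats.lognorm,
              "genextreme": stats.genextreme}

best_name, best_dist, best_params, best_aic = None, None, None, np.inf
for name, dist in candidates.items():
    params = dist.fit(annual_peaks)
    loglik = np.sum(dist.logpdf(annual_peaks, *params))
    aic = 2 * len(params) - 2 * loglik           # simple goodness-of-fit criterion
    if aic < best_aic:
        best_name, best_dist, best_params, best_aic = name, dist, params, aic

# 100-year flood = quantile with a 1% annual exceedance probability.
q100 = best_dist.ppf(1 - 1 / 100, *best_params)
print(best_name, round(q100, 1))
```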

  19. A random walk rule for phase I clinical trials.

    PubMed

    Durham, S D; Flournoy, N; Rosenberger, W F

    1997-06-01

    We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
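
    One member of this family of rules can be simulated directly. The sketch below uses a biased-coin variant (step down after a toxicity, step up with probability b = Γ/(1−Γ) after a non-toxic response) and an invented two-parameter logistic dose-toxicity curve; it illustrates how assignments cluster around the target quantile rather than reproducing the specific rule analyzed in the paper.

```python
import random
import math

DOSES = [1, 2, 3, 4, 5, 6]
GAMMA = 0.3                      # target toxicity quantile
B = GAMMA / (1 - GAMMA)          # up-move probability after a non-toxic response

def p_tox(dose, a=-4.0, b=0.8):
    """Two-parameter logistic dose-toxicity curve (assumed true model)."""
    return 1 / (1 + math.exp(-(a + b * dose)))

def simulate_trial(n_patients=40, seed=0):
    random.seed(seed)
    level, assignments = 0, []
    for _ in range(n_patients):
        assignments.append(DOSES[level])
        toxic = random.random() < p_tox(DOSES[level])
        if toxic:
            level = max(level - 1, 0)                 # step down after toxicity
        elif random.random() < B:
            level = min(level + 1, len(DOSES) - 1)    # biased-coin step up
        # otherwise stay at the same dose
    return assignments

print(simulate_trial())
```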

  20. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
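
    A single-integral formulation of this kind, P(slip) = ∫ f_available(μ) · P(required > μ) dμ evaluated with the trapezoidal rule, can be sketched as follows. The lognormal/normal choice below is only an example pairing (the point of the paper is that any combination of distributions can be used), and the parameter values are invented.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Example distributions (assumed for illustration): available friction lognormal,
# required friction normal.
available = stats.lognorm(s=0.25, scale=0.45)   # coefficient of friction supplied by the floor
required = stats.norm(loc=0.22, scale=0.05)     # friction demanded by the gait

# Single-integral form evaluated with the trapezoidal rule.
mu = np.linspace(0.0, 1.5, 2001)
integrand = available.pdf(mu) * required.sf(mu)   # sf = 1 - CDF
p_slip = trapezoid(integrand, mu)
print(f"Probability of slip per step: {p_slip:.4e}")
```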

  1. Integrated-Circuit Pseudorandom-Number Generator

    NASA Technical Reports Server (NTRS)

    Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur

    1992-01-01

    Integrated circuit produces 8-bit pseudorandom numbers from specified probability distribution, at rate of 10 MHz. Use of Boolean logic, circuit implements pseudorandom-number-generating algorithm. Circuit includes eight 12-bit pseudorandom-number generators, outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying specified nonuniform probability distribution are generated by processing uniformly distributed outputs of eight 12-bit pseudorandom-number generators through "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
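
    A software analogue of this scheme is a table lookup: draw a uniform value from the underlying generators and map it through a cumulative table to obtain an 8-bit value with the specified nonuniform distribution. The sketch below uses an arbitrary triangular pmf and Python's built-in generator; it mirrors the inverse-CDF idea rather than the actual pipeline of flip-flops, comparators, and memories.

```python
import random
import bisect

# Arbitrary example pmf over 0..255 (a discretized triangular shape).
pmf = [max(0, 128 - abs(i - 128)) for i in range(256)]
total = sum(pmf)
cdf = []
acc = 0
for p in pmf:
    acc += p
    cdf.append(acc / total)

def next_byte(rng=random):
    """Return one 8-bit pseudorandom number following the specified pmf."""
    u = rng.random()                   # uniform source, analogous to the 12-bit generators
    return bisect.bisect_left(cdf, u)  # inverse-CDF lookup

samples = [next_byte() for _ in range(10_000)]
print(min(samples), max(samples), sum(samples) / len(samples))  # mean near 128
```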

  2. Distributions and fate of chlorinated pesticides, biomarkers and polycyclic aromatic hydrocarbons in sediments along a contamination gradient from a point-source in San Francisco Bay, California

    USGS Publications Warehouse

    Pereira, W.E.; Hostettler, F.D.; Rapp, J.B.

    1996-01-01

    The distribution and fate of chlorinated pesticides, biomarkers, and polycyclic aromatic hydrocarbons (PAHs) in surficial sediments along a contamination gradient in the Lauritzen Canal and Richmond Harbor in San Francisco Bay was investigated. Compounds were identified and quantified using gas chromatography-ion trap mass spectrometry. Biomarkers and PAHs were derived primarily from weathered petroleum. DDT was reductively dechlorinated under anoxic conditions to DDD and several minor degradation products, DDMU, DDMS, and DDNU. Under aerobic conditions, DDT was dehydrochlorinated to DDE and DBP. Aerobic degradation of DDT was diminished or inhibited in zones of high concentration, and increased significantly in zones of lower concentration: Other chlorinated pesticides identified in sediment included dieldrin and chlordane isomers. Multivariate analysis of the distributions of the DDTs suggested that there are probably two sources of DDD. In addition, DDE and DDMU are probably formed by similar mechanisms, i.e. dehydrochlorination. A steep concentration gradient existed from the Canal to the Outer Richmond Harbor, but higher levels of DDD than those found in the remainder of the Bay indicated that these contaminants are transported on particulates and colloidal organic matter from this source into San Francisco Bay. Chlorinated pesticides and PAHs may pose a potential problem to biota in San Francisco Bay.

  3. Optimized lower leg injury probability curves from post-mortem human subject tests under axial impacts

    PubMed Central

    Yoganandan, Narayan; Arun, Mike W.J.; Pintar, Frank A.; Szabo, Aniko

    2015-01-01

    Objective Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. Methods The study re-examined lower leg PMHS data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and non-injury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal and log-logistic distributions was based on the Akaike Information Criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the interval were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. Results The mean age, stature, and weight were 58.2 ± 15.1 years, 1.74 ± 0.08 m, and 74.9 ± 13.8 kg. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other two distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for the 25-, 45-, and 65-year-old age groups at 5%, 25%, and 50% risk levels for lower leg fracture. For 25, 45 and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 kN at 25% risk; and 10.4, 8.3, and 6.6 kN at 50% risk, respectively. Conclusions This study derived axial loading-induced injury risk curves based on survival analysis using peak force and specimen age; adopting different censoring schemes; considering overly influential samples in the analysis; and assessing the quality of the distribution at discrete probability levels. Because the procedures used in the present survival analysis are accepted by international automotive communities, the current optimum human injury probability distributions can be used at all risk levels with more confidence in future crashworthiness applications for automotive and other disciplines. PMID:25307381

  4. Synthesis and characterization of magnetic poly(divinyl benzene)/Fe3O4, C/Fe3O4/Fe, and C/Fe onionlike fullerene micrometer-sized particles with a narrow size distribution.

    PubMed

    Snovski, Ron; Grinblat, Judith; Margel, Shlomo

    2011-09-06

    Magnetic poly(divinyl benzene)/Fe(3)O(4) microspheres with a narrow size distribution were produced by entrapping the iron pentacarbonyl precursor within the pores of uniform porous poly(divinyl benzene) microspheres prepared in our laboratory, followed by the decomposition in a sealed cell of the entrapped Fe(CO)(5) particles at 300 °C under an inert atmosphere. Magnetic onionlike fullerene microspheres with a narrow size distribution were produced by annealing the obtained PDVB/Fe(3)O(4) particles at 500, 600, 800, and 1100 °C, respectively, under an inert atmosphere. The formation of carbon graphitic layers at low temperatures such as 500 °C is unique and probably obtained because of the presence of the magnetic iron nanoparticles. The annealing temperature allowed control of the composition, size, size distribution, crystallinity, porosity, and magnetic properties of the produced magnetic microspheres. © 2011 American Chemical Society

  5. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given: routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
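
    The same quantities can be computed today with SciPy as a stand-in for the program described: a rectangular probability by inclusion-exclusion on the bivariate CDF, and marginal and conditional probabilities reduced to univariate normals. All parameter values below are arbitrary examples.

```python
from scipy.stats import multivariate_normal, norm

# Standard bivariate normal with correlation rho (values are arbitrary examples).
mean = [0.0, 0.0]
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]
bvn = multivariate_normal(mean=mean, cov=cov)

# Rectangular probability P(a1 < X < b1, a2 < Y < b2) by inclusion-exclusion on the CDF.
a1, b1, a2, b2 = -1.0, 1.0, -0.5, 2.0
p_rect = (bvn.cdf([b1, b2]) - bvn.cdf([a1, b2])
          - bvn.cdf([b1, a2]) + bvn.cdf([a1, a2]))

# Marginal and conditional probabilities reduce to univariate normals:
p_marginal = norm.cdf(b1) - norm.cdf(a1)                      # P(a1 < X < b1)
cond_mean, cond_sd = rho * 0.5, (1 - rho**2) ** 0.5           # X | Y = 0.5
p_conditional = norm.cdf(b1, cond_mean, cond_sd) - norm.cdf(a1, cond_mean, cond_sd)

print(round(p_rect, 4), round(p_marginal, 4), round(p_conditional, 4))
```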

  6. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
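
    When sources are assumed to occur as independent Poisson processes, per-source rates aggregate by simple addition, which is the simplification mentioned above. The sketch below uses invented annual rates purely to show the arithmetic; it does not use the study's estimates.

```python
import math

# Hypothetical annual rates of tsunamigenic events exceeding some runup threshold,
# one entry per source zone (placeholders, not the study's values).
annual_rates = {
    "plate-boundary earthquakes (transoceanic)": 0.002,
    "local submarine landslides": 0.0005,
}

T = 50.0  # exposure time in years

# Under a Poisson process, independent sources aggregate by summing their rates.
total_rate = sum(annual_rates.values())
p_at_least_one = 1.0 - math.exp(-total_rate * T)

print(f"P(at least one event in {T:.0f} yr) = {p_at_least_one:.3f}")
```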

  7. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Note on Two-Phase Phenomena in Financial Markets

    NASA Astrophysics Data System (ADS)

    Jiang, Shi-Mei; Cai, Shi-Min; Zhou, Tao; Zhou, Pei-Ling

    2008-06-01

    The two-phase behaviour in financial markets actually means the bifurcation phenomenon, which represents the change of the conditional probability from an unimodal to a bimodal distribution. We investigate the bifurcation phenomenon in Hang-Seng index. It is observed that the bifurcation phenomenon in financial index is not universal, but specific under certain conditions. For Hang-Seng index and randomly generated time series, the phenomenon just emerges when the power-law exponent of absolute increment distribution is between 1 and 2 with appropriate period. Simulations on a randomly generated time series suggest the bifurcation phenomenon itself is subject to the statistics of absolute increment, thus it may not be able to reflect essential financial behaviours. However, even under the same distribution of absolute increment, the range where bifurcation phenomenon occurs is far different from real market to artificial data, which may reflect certain market information.

  8. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. ?? 2010 Springer Science+Business Media, LLC.

  9. Positive phase space distributions and uncertainty relations

    NASA Technical Reports Server (NTRS)

    Kruger, Jan

    1993-01-01

    In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.

  10. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    PubMed

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and different water demands, brings new challenges to the water transfer projects. Uncertainties exist for transferring water and local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the shortage degree under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation and the chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferring water and local surface water and sampling from the multivariate probability distribution, which are used as inputs for the optimization model. The approach reveals the distribution of water shortage and is able to emphasize the importance of improving and updating transferring water and local surface water management, and examine their combined influence on water shortage risk assessment. The possible available water and shortages can be calculated applying the UWSRAM, also with the corresponding allocation measures under different water availability levels and violating probabilities. The UWSRAM is valuable for mastering the overall multi-water resource and water shortage degree, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers and achieving sustainable development.
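
    The copula-based Monte Carlo step can be sketched as follows: sample correlated uniforms through a Gaussian copula, transform them to marginal distributions for transferred and local surface water, and estimate the shortage probability against a demand. The Gaussian copula, the gamma/lognormal marginals, and every numeric value here are illustrative assumptions; the paper's copula family, marginals, and the downstream chance-constrained optimization are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho = 0.5
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

z = rng.standard_normal((10_000, 2)) @ L.T      # correlated standard normals
u = stats.norm.cdf(z)                           # Gaussian copula: uniforms on [0, 1]^2

transfer_water = stats.gamma.ppf(u[:, 0], a=4.0, scale=25.0)    # million m^3, assumed marginal
local_surface  = stats.lognorm.ppf(u[:, 1], s=0.5, scale=60.0)  # million m^3, assumed marginal

total_supply = transfer_water + local_surface
demand = 200.0                                   # assumed demand level
shortage_prob = np.mean(total_supply < demand)
print(f"Estimated water-shortage probability: {shortage_prob:.3f}")
```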

  11. Space Station laboratory module power loading analysis

    NASA Astrophysics Data System (ADS)

    Fu, S. J.

    1994-07-01

    The electrical power system of Space Station Freedom is an isolated electrical power generation and distribution network designed to meet the demands of a large number of electrical loads. An algorithm is developed to determine the power bus loading status under normal operating conditions to ensure the supply meets demand. The probabilities of power availability for payload operations (experiments) are also derived.

  12. Jamming Dust: A Low-Power Distributed Jammer Network

    DTIC Science & Technology

    2010-12-01


  13. 76 FR 39278 - Modification of Treasury Regulations Pursuant to Section 939A of the Dodd-Frank Wall Street...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-06

    ... adjustment implicit in the yield curve used to discount the present value of the cash flows. This adjustment... valuation date, X determines a mid-market probability distribution of future cash flows under the derivatives and computes the present values of these cash flows. In computing these present values, X uses an...

  14. How likely are constituent quanta to initiate inflation?

    DOE PAGES

    Berezhiani, Lasha; Trodden, Mark

    2015-08-06

    In this study, we propose an intuitive framework for studying the problem of initial conditions in slow-roll inflation. In particular, we consider a universe at high, but sub-Planckian energy density and analyze the circumstances under which it is plausible for it to become dominated by inflated patches at late times, without appealing to the idea of self-reproduction. Our approach is based on defining a prior probability distribution for the constituent quanta of the pre-inflationary universe. To test the idea that inflation can begin under very generic circumstances, we make specific – yet quite general and well grounded – assumptions on the prior distribution. As a result, we are led to the conclusion that the probability for a given region to ignite inflation at sub-Planckian densities is extremely small. Furthermore, if one chooses to use the enormous volume factor that inflation yields as an appropriate measure, we find that the regions of the universe which started inflating at densities below the self-reproductive threshold nevertheless occupy a negligible physical volume in the present universe as compared to those domains that have never inflated.

  15. Conditional probability distribution function of "energy transfer rate" (PDF(ɛ|PVI)) as compared with its counterpart of temperature (PDF(T|PVI)) at the same condition of fluctuation

    NASA Astrophysics Data System (ADS)

    He, Jiansen; Wang, Yin; Pei, Zhongtian; Zhang, Lei; Tu, Chuanyi

    2017-04-01

    The energy transfer rate of turbulence is not uniform everywhere but is suggested to follow a certain distribution, e.g., a lognormal distribution (Kolmogorov 1962). The inhomogeneous transfer rate leads to the emergence of intermittency, which may be identified with some parameter, e.g., normalized partial variance increments (PVI) (Greco et al., 2009). Large PVIs of magnetic field fluctuations are found to have a temperature distribution with median and mean values higher than those for small PVI levels (Osman et al., 2012). However, there is a large proportion of overlap between the temperature distributions associated with the smaller and larger PVIs. So it is recognized that PVI alone cannot fully determine the temperature, since a one-to-one mapping does not exist. One may be curious about the reason for the considerable overlap of the conditional temperature distributions for different levels of PVI. Usually, hotter plasma is speculated to have been heated more by greater dissipation of turbulence energy, corresponding to a larger energy cascading rate, if the temperature fluctuation of the eigen wave mode is not taken into account. To explore the statistical relationship between turbulence cascading and the plasma thermal state, we aim to study and reveal, for the first time, the conditional probability function of the "energy transfer rate" under different levels of PVI (PDF(ɛ|PVI)), and compare it with the conditional probability function of temperature. The conditional probability distribution function, PDF(ɛ|PVI), is derived from PDF(PVI|ɛ)·PDF(ɛ)/PDF(PVI) according to Bayes' theorem. PDF(PVI) can be obtained directly from the data. PDF(ɛ) is derived from the conjugate-gradient inversion of PDF(PVI) by reasonably assuming that PDF(δB|σ) is a Gaussian distribution, where PVI = |δB|/σ and σ ∝ (ɛℓ)^(1/3). PDF(ɛ) can also be acquired by fitting PDF(δB) with the integral ∫PDF(δB|σ)·PDF(σ)dσ. As a result, PDF(ɛ|PVI) is found to shift to a higher median value of ɛ with increasing PVI, but with a significant overlap of the PDFs for different PVIs. Therefore, PDF(ɛ|PVI) is similar to PDF(T|PVI) in the sense of slow migration with increasing PVI. A detailed comparison between these two conditional PDFs is also performed.
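
    A discretized version of the Bayes step, PDF(ɛ|PVI) ∝ PDF(PVI|ɛ)·PDF(ɛ), can be sketched numerically. The sketch assumes a lognormal PDF(ɛ), Gaussian δB with σ(ɛ) = (ɛℓ)^(1/3), and PVI = |δB|/σ0 with a fixed normalization σ0; all parameter values are placeholders, and the conjugate-gradient inversion used in the abstract is replaced by this assumed prior.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

ell, sigma0 = 1.0, 1.0
eps = np.logspace(-3, 2, 500)
d_eps = np.gradient(eps)
prior = stats.lognorm.pdf(eps, s=1.0, scale=1.0)       # assumed PDF(eps)
prior /= trapezoid(prior, eps)

for pvi in (0.5, 2.0, 4.0):
    sigma = (eps * ell) ** (1.0 / 3.0)
    likelihood = stats.halfnorm.pdf(pvi, scale=sigma / sigma0)   # PDF(PVI | eps)
    posterior = likelihood * prior
    posterior /= trapezoid(posterior, eps)                       # PDF(eps | PVI)
    median = eps[np.searchsorted(np.cumsum(posterior * d_eps), 0.5)]
    print(f"PVI = {pvi}: posterior median of eps ~ {median:.3f}")
```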

  16. Spatial Probability Distribution of Strata's Lithofacies and its Impacts on Land Subsidence in Huairou Emergency Water Resources Region of Beijing

    NASA Astrophysics Data System (ADS)

    Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.

    2016-12-01

    Continuous over-exploitation of groundwater causes dramatic drawdown, and leads to regional land subsidence in the Huairou Emergency Water Resources region, which is located in the up-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of strata's lithofacies of the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of strata's lithofacies. The transition probability geostatistics approach provides potential for characterizing the distribution of heterogeneous lithofacies in the subsurface. Combined the thickness of clay layer extracted from the simulation, with deformation field acquired from PS-InSAR technology, the influence of strata's lithofacies on land subsidence can be analyzed quantitatively. The strata's lithofacies derived from borehole data were generalized into four categories and their probability distribution in the observe space was mined by using the transition probability geostatistics, of which clay was the predominant compressible material. Geologically plausible realizations of lithofacies distribution were produced, accounting for complex heterogeneity in alluvial plain. At a particular probability level of more than 40 percent, the volume of clay defined was 55 percent of the total volume of strata's lithofacies. This level, equaling nearly the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and uncompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Some similarities of patterns were indicated between the spatial distribution of deformation field and clay layer. In the area with roughly similar water table decline, locations in the subsurface having a higher probability for the existence of compressible material occur more than that in the location with a lower probability. Such estimate of spatial probability distribution is useful to analyze the uncertainty of land subsidence.

  17. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
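
    For small problems the null distribution of the rank product (the product of k independent uniform ranks on 1..n) can be obtained exactly by brute-force convolution, which is a useful check even though it is not the number-theoretic derivation of the paper. The sizes below are toy values.

```python
from collections import defaultdict

def rank_product_pmf(n, k):
    """Exact pmf of the rank product of k independent uniform ranks on 1..n
    (brute-force convolution; feasible only for small n and k)."""
    pmf = {1: 1.0}
    for _ in range(k):
        nxt = defaultdict(float)
        for value, p in pmf.items():
            for rank in range(1, n + 1):
                nxt[value * rank] += p / n
        pmf = dict(nxt)
    return pmf

# Exact tail probability P(rank product <= rho) for a toy experiment.
n, k, rho = 20, 3, 10
pmf = rank_product_pmf(n, k)
p_tail = sum(p for v, p in pmf.items() if v <= rho)
print(f"P(RP <= {rho}) with n={n}, k={k}: {p_tail:.5f}")
```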

  18. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  19. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  20. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, small frequency of failures, and cost reduction in a production process. However there are some points that have not been explored in depth about its joint application. First, most SPC is performed with the X-bar control chart which does not fully consider the variability of the production process. Second, many studies of design of control charts consider just the economic aspect while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates and reductions in the sampling frequency of units for testing under SPC. PMID:23527082

  1. Semiparametric Bayesian classification with longitudinal markers

    PubMed Central

    De la Cruz-Mesía, Rolando; Quintana, Fernando A.; Müller, Peter

    2013-01-01

    Summary We analyse data from a study involving 173 pregnant women. The data are observed values of the β human chorionic gonadotropin hormone measured during the first 80 days of gestational age, including from one up to six longitudinal responses for each woman. The main objective in this study is to predict normal versus abnormal pregnancy outcomes from data that are available at the early stages of pregnancy. We achieve the desired classification with a semiparametric hierarchical model. Specifically, we consider a Dirichlet process mixture prior for the distribution of the random effects in each group. The unknown random-effects distributions are allowed to vary across groups but are made dependent by using a design vector to select different features of a single underlying random probability measure. The resulting model is an extension of the dependent Dirichlet process model, with an additional probability model for group classification. The model is shown to perform better than an alternative model which is based on independent Dirichlet processes for the groups. Relevant posterior distributions are summarized by using Markov chain Monte Carlo methods. PMID:24368871

  2. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar

    PubMed Central

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-01-01

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it will suffer cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem since the large sparse array is undersampling. Consequently, the state estimation accuracy and track validity probability degrades when the ambiguous angles are directly used for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model is built. Secondly, the fusion result of each radar’s estimation is employed to the extended Kalman filter (EKF) to finish the first filtering. Thirdly, taking this result as prior knowledge, and associating with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, and then achieving a high accuracy and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy will be promoted dramatically and the position filtering accuracy will also improve. Finally, simulations illustrate the effectiveness of the proposed method. PMID:27618058

  3. Allowances for evolving coastal flood risk under uncertain local sea-level rise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buchanan, Maya K.; Kopp, Robert E.; Oppenheimer, Michael

    Estimates of future flood hazards made under the assumption of stationary mean sea level are biased low due to sea-level rise (SLR). However, adjustments to flood return levels made assuming fixed increases of sea level are also inadequate when applied to sea level that is rising over time at an uncertain rate. SLR allowances—the height adjustment from historic flood levels that maintains under uncertainty the annual expected probability of flooding—are typically estimated independently of individual decision-makers’ preferences, such as time horizon, risk tolerance, and confidence in SLR projections. We provide a framework of SLR allowances that employs complete probability distributions of local SLR and a range of user-defined flood risk management preferences. Given non-stationary and uncertain sea-level rise, these metrics provide estimates of flood protection heights and offsets for different planning horizons in coastal areas. In conclusion, we illustrate the calculation of various allowance types for a set of long-duration tide gauges along U.S. coastlines.

  4. Allowances for evolving coastal flood risk under uncertain local sea-level rise

    DOE PAGES

    Buchanan, Maya K.; Kopp, Robert E.; Oppenheimer, Michael; ...

    2016-06-03

Estimates of future flood hazards made under the assumption of stationary mean sea level are biased low due to sea-level rise (SLR). However, adjustments to flood return levels made assuming fixed increases of sea level are also inadequate when applied to sea level that is rising over time at an uncertain rate. SLR allowances—the height adjustment from historic flood levels that maintains, under uncertainty, the annual expected probability of flooding—are typically estimated independently of individual decision-makers’ preferences, such as time horizon, risk tolerance, and confidence in SLR projections. We provide a framework of SLR allowances that employs complete probability distributions of local SLR and a range of user-defined flood risk management preferences. Given non-stationary and uncertain sea-level rise, these metrics provide estimates of flood protection heights and offsets for different planning horizons in coastal areas. In conclusion, we illustrate the calculation of various allowance types for a set of long-duration tide gauges along U.S. coastlines.

  5. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
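When nothing is known about the dependence between two events, probability bounds analysis falls back on the Fréchet–Hoeffding limits. The sketch below is a generic illustration of those limits (not code from the report); the marginal values are arbitrary examples.

```python
# Frechet-Hoeffding bounds on a joint probability given only the marginals.
def frechet_bounds(Fx: float, Gy: float) -> tuple[float, float]:
    """Bounds on H(x, y) = P(X <= x, Y <= y) when the dependence is unknown."""
    lower = max(Fx + Gy - 1.0, 0.0)   # countermonotonic (perfect negative dependence)
    upper = min(Fx, Gy)               # comonotonic (perfect positive dependence)
    return lower, upper

print(frechet_bounds(0.7, 0.6))       # -> (0.3, 0.6); independence would give 0.42
```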

  6. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under-massive regions.

  7. Distributed fault detection over sensor networks with Markovian switching topologies

    NASA Astrophysics Data System (ADS)

    Ge, Xiaohua; Han, Qing-Long

    2014-05-01

This paper deals with distributed fault detection for discrete-time Markov jump linear systems over sensor networks with Markovian switching topologies. The sensors are scattered across the sensor field and the fault detectors are physically distributed via a communication network. Changes in the system dynamics and variations in the sensing topology are modeled by a discrete-time Markov chain with incomplete mode transition probabilities. Each sensor node first collects measurement outputs from all its underlying neighboring nodes, processes these data in accordance with the Markovian switching topologies, and then transmits the processed data to the remote fault detector node. Network-induced delays and accumulated data packet dropouts are incorporated in the data transmission between the sensor nodes and the distributed fault detector nodes through the communication network. To generate localized residual signals, mode-independent distributed fault detection filters are proposed. By means of the stochastic Lyapunov functional approach, the residual system performance analysis is carried out such that the overall residual system is stochastically stable and the error between each residual signal and the fault signal is made as small as possible. Furthermore, a sufficient condition on the existence of the mode-independent distributed fault detection filters is derived in the simultaneous presence of incomplete mode transition probabilities, Markovian switching topologies, network-induced delays, and accumulated data packet dropouts. Finally, a stirred-tank reactor system is given to show the effectiveness of the developed theoretical results.

  8. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC on increasing the maximum discharge diminishes. Only for low probabilities is the critical duration of rainfall influenced by ASMC, and its effect on the peak discharge appears small at any probability. For a given set of parameters, the derived probability distribution of peak discharge is well fitted by the gamma distribution. Finally, the model was applied to a small watershed to test whether rational runoff coefficient tables for the rational method can be prepared in advance, and peak discharges obtained with the GABS model were compared with those measured in an experimental flume for a loamy-sand soil.
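A minimal sketch of the Green-Ampt infiltration component only (the kinematic wave and IDF couplings are not reproduced); the soil parameters below are assumed for illustration. Cumulative infiltration F solves the implicit relation F = K·t + ψΔθ·ln(1 + F/(ψΔθ)).

```python
# Sketch only: K, psi, d_theta are illustrative soil parameters.
import numpy as np

def green_ampt(t_hr, K=1.0, psi=11.0, d_theta=0.3, iters=100):
    """Return (cumulative infiltration F in cm, infiltration rate f in cm/h)."""
    S = psi * d_theta                            # suction-moisture term psi*d_theta (cm)
    F = max(K * t_hr, 1e-9)                      # initial guess
    for _ in range(iters):
        F = K * t_hr + S * np.log(1.0 + F / S)   # fixed-point (Picard) iteration on the implicit equation
    return F, K * (1.0 + S / F)                  # rate from f = K * (1 + S/F)

for t in (0.5, 1.0, 2.0):
    F, f = green_ampt(t)
    print(f"t = {t} h: F = {F:.2f} cm, f = {f:.2f} cm/h")
```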

  9. The extraction and integration framework: a two-process account of statistical learning.

    PubMed

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved

  10. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
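A minimal sketch (simulated data, not the authors' code) of one of the compared approaches: stabilized weights built from normal densities, with the marginal exposure density in the numerator and the covariate-conditional density, estimated by least squares, in the denominator.

```python
# Sketch only: the data-generating model and covariates are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 5_000
L = rng.normal(size=(n, 2))                                            # confounders
A = 1.0 + L @ np.array([0.5, -0.3]) + rng.normal(scale=1.0, size=n)    # continuous exposure

X = np.column_stack([np.ones(n), L])
beta, *_ = np.linalg.lstsq(X, A, rcond=None)                           # conditional mean model
resid = A - X @ beta
sd_cond = resid.std(ddof=X.shape[1])

num = stats.norm.pdf(A, loc=A.mean(), scale=A.std(ddof=1))             # marginal density
den = stats.norm.pdf(A, loc=X @ beta, scale=sd_cond)                   # conditional density
sw = num / den                                                         # stabilized weights
print(f"mean stabilized weight ~ {sw.mean():.2f} (should be close to 1)")
```

The gamma, t, and quantile-binning alternatives discussed in the abstract simply swap the density used in the denominator (or replace it with category frequencies).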

  11. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    USGS Publications Warehouse

    Wenger, Seth J.; Som, Nicholas A.; Dauwalter, Daniel C.; Isaak, Daniel J.; Neville, Helen M.; Luce, Charles H.; Dunham, Jason B.; Young, Michael K.; Fausch, Kurt D.; Rieman, Bruce E.

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing models (model uncertainty), and uncertainty in future climate conditions (climate uncertainty) to produce site-specific frequency distributions of occurrence probabilities across a species’ range. We illustrated the method by forecasting suitable habitat for bull trout (Salvelinus confluentus) in the Interior Columbia River Basin, USA, under recent and projected 2040s and 2080s climate conditions. The 95% interval of total suitable habitat under recent conditions was estimated at 30.1–42.5 thousand km; this was predicted to decline to 0.5–7.9 thousand km by the 2080s. Projections for the 2080s showed that the great majority of stream segments would be unsuitable with high certainty, regardless of the climate data set or bull trout model employed. The largest contributor to uncertainty in total suitable habitat was climate uncertainty, followed by parameter uncertainty and model uncertainty. Our approach makes it possible to calculate a full distribution of possible outcomes for a species, and permits ready graphical display of uncertainty for individual locations and of total habitat.
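A minimal sketch of the Monte Carlo idea for one site, with all coefficients, covariances, and climate spreads invented as placeholders (these are not the bull trout models): each draw samples a competing model, a parameter vector, and a climate value, and the collection of resulting occurrence probabilities forms the site-specific frequency distribution.

```python
# Sketch only: model coefficients, weights, and the warming distribution are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_draws = 10_000

models = [  # two hypothetical competing logistic models
    {"mean": np.array([0.5, -1.2]), "cov": np.diag([0.04, 0.09]), "weight": 0.6},
    {"mean": np.array([0.2, -0.8]), "cov": np.diag([0.05, 0.05]), "weight": 0.4},
]
weights = [m["weight"] for m in models]
climate = rng.normal(2.5, 0.8, n_draws)            # projected warming at the site (deg C), assumed

probs = np.empty(n_draws)
for i in range(n_draws):
    m = models[rng.choice(2, p=weights)]           # model uncertainty
    b0, b1 = rng.multivariate_normal(m["mean"], m["cov"])   # parameter uncertainty
    probs[i] = 1.0 / (1.0 + np.exp(-(b0 + b1 * climate[i])))  # climate uncertainty enters here

print(np.percentile(probs, [2.5, 50, 97.5]))       # occurrence-probability interval for the site
```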

  12. Modeling Invasion Dynamics with Spatial Random-Fitness Due to Micro-Environment

    PubMed Central

    Manem, V. S. K.; Kaveh, K.; Kohandel, M.; Sivaloganathan, S.

    2015-01-01

    Numerous experimental studies have demonstrated that the microenvironment is a key regulator influencing the proliferative and migrative potentials of species. Spatial and temporal disturbances lead to adverse and hazardous microenvironments for cellular systems that is reflected in the phenotypic heterogeneity within the system. In this paper, we study the effect of microenvironment on the invasive capability of species, or mutants, on structured grids (in particular, square lattices) under the influence of site-dependent random proliferation in addition to a migration potential. We discuss both continuous and discrete fitness distributions. Our results suggest that the invasion probability is negatively correlated with the variance of fitness distribution of mutants (for both advantageous and neutral mutants) in the absence of migration of both types of cells. A similar behaviour is observed even in the presence of a random fitness distribution of host cells in the system with neutral fitness rate. In the case of a bimodal distribution, we observe zero invasion probability until the system reaches a (specific) proportion of advantageous phenotypes. Also, we find that the migrative potential amplifies the invasion probability as the variance of fitness of mutants increases in the system, which is the exact opposite in the absence of migration. Our computational framework captures the harsh microenvironmental conditions through quenched random fitness distributions and migration of cells, and our analysis shows that they play an important role in the invasion dynamics of several biological systems such as bacterial micro-habitats, epithelial dysplasia, and metastasis. We believe that our results may lead to more experimental studies, which can in turn provide further insights into the role and impact of heterogeneous environments on invasion dynamics. PMID:26509572

  13. Universal Inverse Power-Law Distribution for Fractal Fluctuations in Dynamical Systems: Applications for Predictability of Inter-Annual Variability of Indian and USA Region Rainfall

    NASA Astrophysics Data System (ADS)

    Selvam, A. M.

    2017-01-01

Dynamical systems in nature exhibit self-similar fractal space-time fluctuations on all scales, indicating long-range correlations; therefore, the statistical normal distribution, with its implicit assumption of independence and fixed mean and standard deviation, cannot be used for the description and quantification of fractal data sets. The author has developed a general systems theory based on classical statistical physics for fractal fluctuations which predicts the following. (1) The fractal fluctuations signify an underlying eddy continuum, the larger eddies being the integrated mean of enclosed smaller-scale fluctuations. (2) The probability distribution of eddy amplitudes and the variance (square of eddy amplitude) spectrum of fractal fluctuations follow the universal Boltzmann inverse power law expressed as a function of the golden mean. (3) Fractal fluctuations are signatures of quantum-like chaos, since the additive amplitudes of eddies when squared represent probability densities, analogous to the sub-atomic dynamics of quantum systems such as the photon or electron. (4) The model-predicted distribution is very close to the statistical normal distribution for moderate events within two standard deviations of the mean but exhibits a fat long tail associated with hazardous extreme events. Continuous periodogram power spectral analyses of available GHCN annual total rainfall time series for the period 1900-2008 for Indian and USA stations show that the power spectra and the corresponding probability distributions follow the model-predicted universal inverse power law form, signifying an eddy continuum structure underlying the observed inter-annual variability of rainfall. On a global scale, man-made greenhouse gas related atmospheric warming would result in intensification of natural climate variability, seen first in high-frequency fluctuations such as the QBO and ENSO and at even shorter timescales. Model concepts and results of analyses are discussed with reference to possible prediction of climate change. Model concepts, if correct, rule out unambiguously linear trends in climate; climate change would only be manifested as an increase or decrease in the natural variability. However, more stringent tests of model concepts and predictions are required before application to such an important issue as climate change. Observations and simulations with climate models show that precipitation extremes intensify in response to a warming climate (O'Gorman in Curr Clim Change Rep 1:49-59, 2015).

  14. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While strategies that are flexible or adaptive hold intuitive appeal, developing well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of `optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and are most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.

  15. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
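The record does not give the exact coupling of the two component distributions, so the sketch below only illustrates the two building blocks it names, evaluated with scipy; all parameter values are arbitrary.

```python
# Sketch only: lambda, r, p are illustrative parameters, not values from the report.
import numpy as np
from scipy import stats

lam, r, p = 2.0, 3.0, 0.6

# Zero-truncated Poisson: P(N = k | N >= 1) = Poisson pmf renormalised over k >= 1.
k = np.arange(1, 10)
ztp_pmf = stats.poisson.pmf(k, lam) / (1.0 - stats.poisson.pmf(0, lam))
print("truncated-Poisson pmf (k = 1..5):", np.round(ztp_pmf[:5], 3))

# Negative binomial counts (scipy parameterisation: r "successes", success probability p).
nb_pmf = stats.nbinom.pmf(np.arange(0, 5), r, p)
print("negative-binomial pmf (k = 0..4):", np.round(nb_pmf, 3))
```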

  16. Fitting distributions to microbial contamination data collected with an unequal probability sampling design.

    PubMed

    Williams, M S; Ebel, E D; Cao, Y

    2013-01-01

The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is often not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples that are collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and highlight the magnitude of the biases in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to properly weight samples to account for how the data are collected can introduce substantial biases into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
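A minimal sketch of the weighted-likelihood idea (not the authors' estimator): each observation's log-likelihood contribution is weighted by the inverse of its selection probability. The data, the size-biased selection probabilities, and the lognormal working model are all illustrative assumptions.

```python
# Sketch only: data, selection probabilities, and the lognormal model are invented.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
x = rng.lognormal(mean=1.0, sigma=0.5, size=200)   # microbial concentrations (arbitrary units)
pi = np.clip(x / x.max(), 0.05, 1.0)               # hypothetical size-biased selection probabilities
w = 1.0 / pi                                       # design weights

def neg_weighted_loglik(theta):
    mu, log_sigma = theta
    # Lognormal log-likelihood up to a constant: normal logpdf of log(x).
    return -np.sum(w * stats.norm.logpdf(np.log(x), mu, np.exp(log_sigma)))

res = optimize.minimize(neg_weighted_loglik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"weighted MLE: mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```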

  17. Threshold-selecting strategy for best possible ground state detection with genetic algorithms

    NASA Astrophysics Data System (ADS)

    Lässig, Jörg; Hoffmann, Karl Heinz

    2009-04-01

Genetic algorithms are a standard heuristic for finding states of low energy in complex state spaces, as given by physical systems such as spin glasses, and also in combinatorial optimization. The paper considers the problem of selecting individuals in the current population of a genetic algorithm for crossover. Many schemes have been considered in the literature as possible crossover selection strategies. We show, for a large class of quality measures, that the best possible probability distribution for selecting individuals in each generation of the algorithm execution is a rectangular distribution over the individuals sorted by their energy values. This means uniform probabilities have to be assigned to a group of the individuals with the lowest energy in the population, but zero probability to individuals corresponding to energy values above a fixed cutoff, which is equal to a certain rank in the vector of states sorted by energy in the current population. The considered strategy is dubbed threshold selecting. The proof applies basic arguments of Markov chains and linear optimization and makes only a few assumptions on the underlying principles, and hence applies to a large class of algorithms.
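A minimal sketch of the selection rule described above: parents are drawn uniformly from the k lowest-energy individuals and never from the rest. The energies and the cutoff k are illustrative.

```python
# Sketch only: energies and cutoff k are arbitrary; this is not the paper's code.
import numpy as np

rng = np.random.default_rng(4)

def threshold_select(energies: np.ndarray, k: int, n_parents: int) -> np.ndarray:
    """Return indices of crossover parents drawn uniformly from the k best individuals."""
    best = np.argsort(energies)[:k]            # rank by energy, keep the k lowest
    return rng.choice(best, size=n_parents, replace=True)

energies = rng.normal(size=50)
parents = threshold_select(energies, k=10, n_parents=20)
print(np.sort(energies[parents]))              # all parent energies lie below the cutoff rank
```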

  18. Isotropic probability measures in infinite dimensional spaces: Inverse problems/prior information/stochastic inversion

    NASA Technical Reports Server (NTRS)

    Backus, George

    1987-01-01

Let R be the real numbers, R^n the linear space of all real n-tuples, and R^∞ the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n: R^∞ → R^n be the projection operator with P_n(x) = (x_1, ..., x_n). Let p^∞ be a probability measure on the smallest σ-ring of subsets of R^∞ which includes all of the cylinder sets P_n^{-1}(B_n), where B_n is an arbitrary Borel subset of R^n. Let p_n be the marginal distribution of p^∞ on R^n, so p_n(B_n) = p^∞(P_n^{-1}(B_n)) for each B_n. A measure on R^n is isotropic if it is invariant under all orthogonal transformations of R^n. All members of the set of all isotropic probability distributions on R^n are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.

  19. Stochastic growth of cloud droplets by collisions during settling

    NASA Astrophysics Data System (ADS)

    Madival, Deepak G.

    2018-04-01

    In the last stage of droplet growth in clouds which leads to drizzle formation, larger droplets begin to settle under gravity and collide and coalesce with smaller droplets in their path. In this article, we shall deal with the simplified problem of a large drop settling amidst a population of identical smaller droplets. We present an expression for the probability that a given large drop suffers a given number of collisions, for a general statistically homogeneous distribution of droplets. We hope that our approach will serve as a valuable tool in dealing with droplet distribution in real clouds, which has been found to deviate from the idealized Poisson distribution due to mechanisms such as inertial clustering.
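The paper's general expression for arbitrary homogeneous droplet distributions is not reproduced here; the sketch below shows only the idealized Poisson baseline that it generalizes, where the number of collisions along a fall path is Poisson with mean equal to the swept volume times number density times collision efficiency. All numbers are assumed for illustration.

```python
# Sketch only: n, R, E, L are illustrative cloud-physics values.
import math

n = 100e6          # small-droplet number density (m^-3), assumed
R = 40e-6          # collision radius (m), assumed
E = 0.5            # collision efficiency, assumed
L = 10.0           # fall distance (m), assumed

lam = n * math.pi * R**2 * E * L                       # mean number of collisions
p_k = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(5)]
print(f"mean collisions = {lam:.2f}; P(k = 0..4) = {[round(p, 3) for p in p_k]}")
```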

  20. We are not the 99 percent: quantifying asphericity in the distribution of Local Group satellites

    NASA Astrophysics Data System (ADS)

    Forero-Romero, Jaime E.; Arias, Verónica

    2018-05-01

    We use simulations to build an analytic probability distribution for the asphericity in the satellite distribution around Local Group (LG) type galaxies in the Lambda Cold Dark Matter (LCDM) paradigm. We use this distribution to estimate the atypicality of the satellite distributions in the LG even when the underlying simulations do not have enough systems fully resembling the LG in terms of its typical masses, separation and kinematics. We demonstrate the method using three different simulations (Illustris-1, Illustris-1-Dark and ELVIS) and a number of satellites ranging from 11 to 15. Detailed results differ greatly among the simulations suggesting a strong influence of the typical DM halo mass, the number of satellites and the simulated baryonic effects. However, there are three common trends. First, at most 2% of the pairs are expected to have satellite distributions with the same asphericity as the LG; second, at most 80% of the pairs have a halo with a satellite distribution as aspherical as in M31; and third, at most 4% of the pairs have a halo with satellite distribution as planar as in the MW. These quantitative results place the LG at the level of a 3σ outlier in the LCDM paradigm. We suggest that understanding the reasons for this atypicality requires quantifying the asphericity probability distribution as a function of halo mass and large scale environment. The approach presented here can facilitate that kind of study and other comparisons between different numerical setups and choices to study satellites around LG pairs in simulations.
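A common way to quantify the asphericity of a satellite distribution, though not necessarily the exact statistic used in the paper, is the axis ratios of the best-fit ellipsoid from the positions' second-moment (shape) tensor. The mock satellite positions below are invented.

```python
# Sketch only: the 11 mock satellite positions are random and deliberately flattened.
import numpy as np

rng = np.random.default_rng(5)
pos = rng.normal(size=(11, 3)) * np.array([1.0, 0.8, 0.3])

pos = pos - pos.mean(axis=0)                 # centre on the host
S = pos.T @ pos / len(pos)                   # shape (second-moment) tensor
eigvals = np.sort(np.linalg.eigvalsh(S))     # c^2 <= b^2 <= a^2 after sorting
c, b, a = np.sqrt(eigvals)
print(f"axis ratios: b/a = {b/a:.2f}, c/a = {c/a:.2f} (small c/a = planar distribution)")
```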

  1. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    NASA Astrophysics Data System (ADS)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and temporal factors such as the season of the year, day of the week or time of the day. For deterministic radial distribution load flow studies, the load is taken as constant. But load varies continually and with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte-Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses with probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.

  2. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

Tunguska and Chelyabinsk impact events occurred inside a geographical area of only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the so-called ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while in the antapex RIP is slightly larger than average. We present preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  3. Probability Distribution of Turbulent Kinetic Energy Dissipation Rate in Ocean: Observations and Approximations

    NASA Astrophysics Data System (ADS)

    Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.

    2017-10-01

The probability distribution of the turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for, and often observed in, unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the dissipation rate ε̃_r in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, a lognormal distribution of ε̃_r is preferable, although the Burr is an acceptable alternative. The skewness Sk_ε and the kurtosis K_ε of the dissipation rate appear to be well correlated over a wide range of Sk_ε and K_ε variability.
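A minimal sketch (synthetic data, not the microstructure profiles) of the comparison the abstract describes: fit Burr type XII and lognormal distributions to positive, mean-normalized "dissipation-rate-like" samples with scipy and compare their log-likelihoods.

```python
# Sketch only: the samples are synthetic and normalised by their mean scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
eps = rng.lognormal(mean=0.0, sigma=1.4, size=2_000)     # synthetic normalised dissipation rates

burr_params = stats.burr12.fit(eps, floc=0)              # (c, d, loc, scale)
logn_params = stats.lognorm.fit(eps, floc=0)             # (s, loc, scale)

ll_burr = np.sum(stats.burr12.logpdf(eps, *burr_params))
ll_logn = np.sum(stats.lognorm.logpdf(eps, *logn_params))
print(f"log-likelihood: Burr XII = {ll_burr:.1f}, lognormal = {ll_logn:.1f}")
```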

  4. Maximum-entropy probability distributions under Lp-norm constraints

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1991-01-01

Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.

  5. Structural Health Monitoring System Trade Space Analysis Tool with Consideration for Crack Growth, Sensor Degradation and a Variable Detection Threshold

    DTIC Science & Technology

    2014-09-18

Under a fatigue stress regime, Paris's Law relates sub-critical crack growth to the stress intensity factor (Paris and Erdogan, 1963); it is one of the most widely used fatigue crack growth models and was used in this research effort. After takeoff, the model generates a probability distribution for the crack length in that specific sortie.
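A minimal sketch of Paris's Law, da/dN = C·(ΔK)^m with ΔK = Y·Δσ·√(πa), integrated cycle by cycle; the constants below are illustrative, not the report's values.

```python
# Sketch only: C, m, Y, d_sigma, and the initial crack size are assumed values.
import math

C, m = 1e-11, 3.0          # Paris constants (C in m/cycle per (MPa*sqrt(m))^m), assumed
Y = 1.12                   # geometry factor, assumed
d_sigma = 100.0            # stress range per cycle (MPa), assumed
a = 1e-3                   # initial crack length (m)

for cycle in range(1, 200_001):
    dK = Y * d_sigma * math.sqrt(math.pi * a)   # stress intensity factor range (MPa*sqrt(m))
    a += C * dK**m                              # crack growth for this cycle
    if cycle % 50_000 == 0:
        print(f"after {cycle} cycles: a = {a * 1e3:.2f} mm")
```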

  6. A goodness-of-fit test for capture-recapture model M(t) under closure

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    A new, fully efficient goodness-of-fit test for the time-specific closed-population capture-recapture model M(t) is presented. This test is based on the residual distribution of the capture history data given the maximum likelihood parameter estimates under model M(t), is partitioned into informative components, and is based on chi-square statistics. Comparison of this test with Leslie's test (Leslie, 1958, Journal of Animal Ecology 27, 84- 86) for model M(t), using Monte Carlo simulations, shows the new test generally outperforms Leslie's test. The new test is frequently computable when Leslie's test is not, has Type I error rates that are closer to nominal error rates than Leslie's test, and is sensitive to behavioral variation and heterogeneity in capture probabilities. Leslie's test is not sensitive to behavioral variation in capture probabilities but, when computable, has greater power to detect heterogeneity than the new test.

  7. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.

  8. Defense strategies for asymmetric networked systems under composite utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Ma, Chris Y. T.; Hausken, Kjell

We consider an infrastructure of networked systems with discrete components that can be reinforced at certain costs to guard against attacks. The communications network plays a critical, asymmetric role of providing the vital connectivity between the systems. We characterize the correlations within this infrastructure at two levels using (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual system or network, and (b) first-order differential conditions on system survival probabilities that characterize component-level correlations. We formulate an infrastructure survival game between an attacker and a provider, who attack and reinforce individual components, respectively. They use composite utility functions composed of a survival probability term and a cost term; the previously studied sum-form and product-form utility functions are their special cases. At Nash equilibrium, we derive expressions for individual system survival probabilities and the expected total number of operational components. We apply and discuss these estimates for a simplified model of a distributed cloud computing infrastructure.

  9. Modelling the Effects of Temperature and Cloud Cover Change on Mountain Permafrost Distribution, Northwest Canada

    NASA Astrophysics Data System (ADS)

    Bonnaventure, P. P.; Lewkowicz, A. G.

    2008-12-01

    Spatial models of permafrost probability for three study areas in northwest Canada between 59°N and 61°N were perturbed to investigate climate change impacts. The models are empirical-statistical in nature, based on basal temperature of snow (BTS) measurements in winter, and summer ground-truthing of the presence or absence of frozen ground. Predictions of BTS values are made using independent variables of elevation and potential incoming solar radiation (PISR), both derived from a 30 m DEM. These are then transformed into the probability of the presence or absence of permafrost through logistic regression. Under present climate conditions, permafrost percentages in the study areas are 44% for Haines Summit, British Columbia, 38% for Wolf Creek, Yukon, and 69% for part of the Ruby Range, Yukon (Bonnaventure and Lewkowicz, 2008; Lewkowicz and Bonaventure, 2008). Scenarios of air temperature change from -2K (approximating Neoglacial conditions) to +5K (possible within the next century according to the IPCC) were examined for the three sites. Manipulations were carried out by lowering or raising the terrain within the DEM assuming a mean environmental lapse rate of 6.5K/km. Under a -2K scenario, permafrost extent increased by 22-43% in the three study areas. Under a +5K warming, permafrost essentially disappeared in Haines Summit and Wolf Creek, while in the Ruby Range less than 12% of the area remained perennially frozen. It should be emphasized that these model predictions are for equilibrium conditions which might not be attained for several decades or longer in areas of cold permafrost. Cloud cover changes of -10% to +10% were examined through adjusting the partitioning of direct beam and diffuse radiation in the PISR input field. Changes to permafrost extent were small, ranging from -2% to -4% for greater cloudiness with changes of the opposite magnitude for less cloud. The results show that air temperature change has a much greater potential to affect mountain permafrost distribution in the long-term than the probable range of cloud cover changes. Modelled results for the individual areas respond according to the hypsometry of the terrain and the relative strength of elevation and PISR in the regression models. This study indicates that significant changes to the distribution and extent of mountain permafrost in northwest Canada can be expected in the next few decades. References Bonnaventure, P.P. and Lewkowicz, A.G. (2008). Mountain permafrost probability mapping using the BTS method in two climatically dissimilar locations, northwest Canada. Canadian Journal of Earth Sciences, 45, 443-455. Lewkowicz, A.G. and Bonnaventure, P.P. (2008). Interchangeability of local mountain permafrost probability models, northwest Canada. Permafrost and Periglacial Processes, 19, 49-62.
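A minimal sketch of the modelling chain described above, with made-up coefficients (these are not the published BTS or logistic models): BTS is predicted from elevation and PISR, transformed to a permafrost probability, and a +5 K warming is applied as an equivalent elevation drop using the 6.5 K/km lapse rate.

```python
# Sketch only: the BTS and logistic coefficients are hypothetical.
import numpy as np

def bts(elev_m, pisr):                       # hypothetical linear BTS model (deg C)
    return 2.0 - 0.004 * elev_m - 2.0 * pisr

def p_permafrost(bts_value):                 # hypothetical logistic link: colder BTS -> higher probability
    return 1.0 / (1.0 + np.exp(-(1.2 * (-bts_value) - 4.0)))

elev = np.array([900.0, 1400.0, 1900.0])     # example grid cells (m a.s.l.)
pisr = np.array([0.6, 0.8, 1.0])             # normalised potential incoming solar radiation

for dT in (0.0, 5.0):
    elev_eff = elev - dT / 6.5 * 1000.0      # warming expressed as an equivalent elevation shift
    print(f"+{dT} K:", np.round(p_permafrost(bts(elev_eff, pisr)), 2))
```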

  10. Architectures of Kepler Planet Systems with Approximate Bayesian Computation

    NASA Astrophysics Data System (ADS)

    Morehead, Robert C.; Ford, Eric B.

    2015-12-01

    The distribution of period normalized transit duration ratios among Kepler’s multiple transiting planet systems constrains the distributions of mutual orbital inclinations and orbital eccentricities. However, degeneracies in these parameters tied to the underlying number of planets in these systems complicate their interpretation. To untangle the true architecture of planet systems, the mutual inclination, eccentricity, and underlying planet number distributions must be considered simultaneously. The complexities of target selection, transit probability, detection biases, vetting, and follow-up observations make it impractical to write an explicit likelihood function. Approximate Bayesian computation (ABC) offers an intriguing path forward. In its simplest form, ABC generates a sample of trial population parameters from a prior distribution to produce synthetic datasets via a physically-motivated forward model. Samples are then accepted or rejected based on how close they come to reproducing the actual observed dataset to some tolerance. The accepted samples form a robust and useful approximation of the true posterior distribution of the underlying population parameters. We build on the considerable progress from the field of statistics to develop sequential algorithms for performing ABC in an efficient and flexible manner. We demonstrate the utility of ABC in exoplanet populations and present new constraints on the distributions of mutual orbital inclinations, eccentricities, and the relative number of short-period planets per star. We conclude with a discussion of the implications for other planet occurrence rate calculations, such as eta-Earth.
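A minimal ABC rejection sketch with a toy forward model (not the Kepler pipeline): draw a mutual-inclination scale from its prior, simulate a summary statistic, and keep the draws whose statistic lands within a tolerance of the "observed" value. The observed statistic, prior range, and summary are all invented for illustration.

```python
# Sketch only: the forward model and observed summary statistic are toy stand-ins.
import numpy as np

rng = np.random.default_rng(7)
observed_stat = 1.8                              # assumed "observed" summary statistic

def forward_model(sigma_inc, n_systems=500):
    """Toy stand-in for the physically motivated simulator."""
    inc = np.abs(rng.normal(0.0, sigma_inc, n_systems))   # mutual inclinations (deg)
    return 1.0 + inc.mean()                               # summary statistic of the synthetic catalogue

prior_draws = rng.uniform(0.0, 5.0, 5_000)                # prior on the inclination scale (deg)
stats_sim = np.array([forward_model(s) for s in prior_draws])
accepted = prior_draws[np.abs(stats_sim - observed_stat) < 0.1]   # ABC rejection step
print(f"accepted {accepted.size} draws; posterior mean sigma_inc ~ {accepted.mean():.2f} deg")
```

Sequential ABC variants, as mentioned in the abstract, shrink the tolerance over successive rounds and resample from the accepted draws instead of the prior.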

  11. Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials

    DTIC Science & Technology

    1978-03-01

AFIT/GAE thesis evaluating the three-parameter Weibull distribution function for predicting fracture probability in composite materials. The recoverable abstract fragment derives an expression for the risk of rupture of a unidirectionally laminated composite subjected to pure bending, which can be simplified further.
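A minimal sketch (synthetic strengths, not the thesis data) of fitting a three-parameter Weibull distribution, i.e. shape, scale, and threshold (the `loc` parameter in scipy), to strength data and evaluating a fracture probability.

```python
# Sketch only: the strength sample is synthetic and the evaluation stress is arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
strengths = 300.0 + stats.weibull_min.rvs(c=4.0, scale=500.0, size=100, random_state=rng)  # MPa

c, loc, scale = stats.weibull_min.fit(strengths)      # three-parameter MLE (threshold = loc)
print(f"shape = {c:.2f}, threshold = {loc:.0f} MPa, scale = {scale:.0f} MPa")
print("P(fracture at 600 MPa) =", round(stats.weibull_min.cdf(600.0, c, loc, scale), 3))
```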

  12. Ignition probability of polymer-bonded explosives accounting for multiple sources of material stochasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S.; Barua, A.; Zhou, M., E-mail: min.zhou@me.gatech.edu

    2014-05-07

Accounting for the combined effect of multiple sources of stochasticity in material attributes, we develop an approach that computationally predicts the probability of ignition of polymer-bonded explosives (PBXs) under impact loading. The probabilistic nature of the specific ignition processes is assumed to arise from two sources of stochasticity. The first source involves random variations in material microstructural morphology; the second source involves random fluctuations in grain-binder interfacial bonding strength. The effect of the first source of stochasticity is analyzed with multiple sets of statistically similar microstructures and constant interfacial bonding strength. Subsequently, each of the microstructures in the multiple sets is assigned multiple instantiations of randomly varying grain-binder interfacial strengths to analyze the effect of the second source of stochasticity. Critical hotspot size-temperature states reaching the threshold for ignition are calculated through finite element simulations that explicitly account for microstructure and bulk and interfacial dissipation to quantify the time to criticality (t_c) of individual samples, allowing the probability distribution of the time to criticality that results from each source of stochastic variation to be analyzed. Two probability superposition models are considered to combine the effects of the multiple sources of stochasticity. The first is a parallel and series combination model, and the second is a nested probability function model. Results show that the nested Weibull distribution provides an accurate description of the combined ignition probability. The approach developed here represents a general framework for analyzing the stochasticity in material behavior that arises from multiple types of uncertainty associated with the structure, design, synthesis and processing of materials.

  13. Best-Practice Criteria for Practical Security of Self-Differencing Avalanche Photodiode Detectors in Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Koehler-Sidki, A.; Dynes, J. F.; Lucamarini, M.; Roberts, G. L.; Sharpe, A. W.; Yuan, Z. L.; Shields, A. J.

    2018-04-01

    Fast-gated avalanche photodiodes (APDs) are the most commonly used single photon detectors for high-bit-rate quantum key distribution (QKD). Their robustness against external attacks is crucial to the overall security of a QKD system, or even an entire QKD network. We investigate the behavior of a gigahertz-gated, self-differencing (In,Ga)As APD under strong illumination, a tactic Eve often uses to bring detectors under her control. Our experiment and modeling reveal that the negative feedback by the photocurrent safeguards the detector from being blinded through reducing its avalanche probability and/or strengthening the capacitive response. Based on this finding, we propose a set of best-practice criteria for designing and operating fast-gated APD detectors to ensure their practical security in QKD.

  14. Security evaluation of the quantum key distribution system with two-mode squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osaki, M.; Ban, M.

    2003-08-01

The quantum key distribution (QKD) system with two-mode squeezed states has been demonstrated by Pereira et al. [Phys. Rev. A 62, 042311 (2000)]. They evaluate the security of the system based on the signal-to-noise ratio attained by a homodyne detector. In this paper, we discuss its security based on the error probability under individual attacks by an eavesdropper using unambiguous or error-optimum detection. The influence of energy loss in the transmission channels is also taken into account. It is shown that the QKD system is secure under these conditions.

  15. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  16. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: I. Strength, static crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.

    2011-07-01

Engineering structures must be designed for an extremely low failure probability such as 10^-6, which is beyond the means of direct verification by histogram testing. This is not a problem for brittle or ductile materials because the type of probability distribution of structural strength is fixed and known, making it possible to predict the tail probabilities from the mean and variance. It is a problem, though, for quasibrittle materials for which the type of strength distribution transitions from Gaussian to Weibullian as the structure size increases. These are heterogeneous materials with brittle constituents, characterized by material inhomogeneities that are not negligible compared to the structure size. Examples include concrete, fiber composites, coarse-grained or toughened ceramics, rocks, sea ice, rigid foams and bone, as well as many materials used in nano- and microscale devices. This study presents a unified theory of strength and lifetime for such materials, based on activation energy controlled random jumps of the nano-crack front, and on the nano-macro multiscale transition of tail probabilities. Part I of this study deals with the case of monotonic and sustained (or creep) loading, and Part II with fatigue (or cyclic) loading. On the scale of the representative volume element of material, the probability distribution of strength has a Gaussian core onto which a remote Weibull tail is grafted at failure probability of the order of 10^-3. With increasing structure size, the Weibull tail penetrates into the Gaussian core. The probability distribution of static (creep) lifetime is related to the strength distribution by the power law for the static crack growth rate, for which a physical justification is given. The present theory yields a simple relation between the exponent of this law and the Weibull moduli for strength and lifetime. The benefit is that the lifetime distribution can be predicted from short-time tests of the mean size effect on strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.
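A minimal sketch of the weakest-link chain formula central to the theory, P_f(σ, N) = 1 − [1 − P_1(σ)]^N, using a simplified stand-in for the grafted RVE strength distribution P_1 (Weibull tail below a grafting stress, Gaussian core above); the grafting details and all parameter values here are assumptions, not the paper's calibration.

```python
# Sketch only: m, s0, mu, delta, sigma_gr are illustrative; P1 is a simplified grafted cdf.
import numpy as np
from scipy import stats

m, s0 = 24.0, 6.7             # Weibull modulus and scale of the tail (assumed, MPa)
mu, delta = 7.0, 0.8          # Gaussian core mean and std (assumed, MPa)
sigma_gr = 5.0                # grafting stress (assumed; P1(sigma_gr) is of order 1e-3 here)

def P1(sig):
    sig = np.asarray(sig, dtype=float)
    tail = 1.0 - np.exp(-(sig / s0) ** m)                       # Weibull tail below the graft
    Pgr = 1.0 - np.exp(-(sigma_gr / s0) ** m)                   # probability at the graft point
    core = Pgr + (1.0 - Pgr) * (
        (stats.norm.cdf(sig, mu, delta) - stats.norm.cdf(sigma_gr, mu, delta))
        / (1.0 - stats.norm.cdf(sigma_gr, mu, delta)))          # rescaled Gaussian core above the graft
    return np.where(sig <= sigma_gr, tail, core)

def Pf(sig, N):                                                 # structure modelled as N RVEs in series
    return 1.0 - (1.0 - P1(sig)) ** N

sig = np.linspace(3.0, 9.0, 601)
for N in (1, 100, 10_000):
    median = sig[np.argmin(np.abs(Pf(sig, N) - 0.5))]
    print(f"N = {N:>6d}: median strength ~ {median:.2f} MPa")
```

The printed medians drop as N grows, and for large N the failure probability is controlled entirely by the Weibull tail, which is the Gaussian-to-Weibullian size transition the abstract describes.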

  17. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
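A minimal re-implementation sketch of NEWTONP's core task (not the original C source): find the component reliability p that gives a k-out-of-n system a required reliability V. Here the root is found with scipy's brentq rather than Newton's method.

```python
# Sketch only: brentq is used in place of the program's Newton iteration.
from scipy import optimize, stats

def required_p(k: int, n: int, V: float) -> float:
    """Component reliability p such that P(at least k of n components survive) = V."""
    system_rel = lambda p: stats.binom.sf(k - 1, n, p) - V   # P(X >= k) - V for X ~ Binomial(n, p)
    return optimize.brentq(system_rel, 1e-12, 1.0 - 1e-12)

print(round(required_p(k=3, n=5, V=0.999), 4))   # component reliability needed for a 3-out-of-5 system
```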

  18. Differential effects of insular and ventromedial prefrontal cortex lesions on risky decision-making

    PubMed Central

    Bechara, A.; Damasio, H.; Aitken, M. R. F.; Sahakian, B. J.; Robbins, T. W.

    2008-01-01

    The ventromedial prefrontal cortex (vmPFC) and insular cortex are implicated in distributed neural circuitry that supports emotional decision-making. Previous studies of patients with vmPFC lesions have focused primarily on decision-making under uncertainty, when outcome probabilities are ambiguous (e.g. the Iowa Gambling Task). It remains unclear whether vmPFC is also necessary for decision-making under risk, when outcome probabilities are explicit. It is not known whether the effect of insular damage is analogous to the effect of vmPFC damage, or whether these regions contribute differentially to choice behaviour. Four groups of participants were compared on the Cambridge Gamble Task, a well-characterized measure of risky decision-making where outcome probabilities are presented explicitly, thus minimizing additional learning and working memory demands. Patients with focal, stable lesions to the vmPFC (n = 20) and the insular cortex (n = 13) were compared against healthy subjects (n = 41) and a group of lesion controls (n = 12) with damage predominantly affecting the dorsal and lateral frontal cortex. The vmPFC and insular cortex patients showed selective and distinctive disruptions of betting behaviour. VmPFC damage was associated with increased betting regardless of the odds of winning, consistent with a role of vmPFC in biasing healthy individuals towards conservative options under risk. In contrast, patients with insular cortex lesions failed to adjust their bets by the odds of winning, consistent with a role of the insular cortex in signalling the probability of aversive outcomes. The insular group attained a lower point score on the task and experienced more ‘bankruptcies’. There were no group differences in probability judgement. These data confirm the necessary role of the vmPFC and insular regions in decision-making under risk. Poor decision-making in clinical populations can arise via multiple routes, with functionally dissociable effects of vmPFC and insular cortex damage. PMID:18390562

  19. Differential effects of insular and ventromedial prefrontal cortex lesions on risky decision-making.

    PubMed

    Clark, L; Bechara, A; Damasio, H; Aitken, M R F; Sahakian, B J; Robbins, T W

    2008-05-01

    The ventromedial prefrontal cortex (vmPFC) and insular cortex are implicated in distributed neural circuitry that supports emotional decision-making. Previous studies of patients with vmPFC lesions have focused primarily on decision-making under uncertainty, when outcome probabilities are ambiguous (e.g. the Iowa Gambling Task). It remains unclear whether vmPFC is also necessary for decision-making under risk, when outcome probabilities are explicit. It is not known whether the effect of insular damage is analogous to the effect of vmPFC damage, or whether these regions contribute differentially to choice behaviour. Four groups of participants were compared on the Cambridge Gamble Task, a well-characterized measure of risky decision-making where outcome probabilities are presented explicitly, thus minimizing additional learning and working memory demands. Patients with focal, stable lesions to the vmPFC (n = 20) and the insular cortex (n = 13) were compared against healthy subjects (n = 41) and a group of lesion controls (n = 12) with damage predominantly affecting the dorsal and lateral frontal cortex. The vmPFC and insular cortex patients showed selective and distinctive disruptions of betting behaviour. VmPFC damage was associated with increased betting regardless of the odds of winning, consistent with a role of vmPFC in biasing healthy individuals towards conservative options under risk. In contrast, patients with insular cortex lesions failed to adjust their bets by the odds of winning, consistent with a role of the insular cortex in signalling the probability of aversive outcomes. The insular group attained a lower point score on the task and experienced more 'bankruptcies'. There were no group differences in probability judgement. These data confirm the necessary role of the vmPFC and insular regions in decision-making under risk. Poor decision-making in clinical populations can arise via multiple routes, with functionally dissociable effects of vmPFC and insular cortex damage.

  20. Advanced probabilistic methods for quantifying the effects of various uncertainties in structural response

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1988-01-01

    The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, and fully correlated. The magnitudes of these uncertainties were represented in terms of means and variances. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and the probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.

  1. Statistics of single unit responses in the human medial temporal lobe: A sparse and overdispersed code

    NASA Astrophysics Data System (ADS)

    Magyar, Andrew

    The recent discovery of cells that respond to purely conceptual features of the environment (particular people, landmarks, objects, etc.) in the human medial temporal lobe (MTL) has raised many questions about the nature of the neural code in humans. The goal of this dissertation is to develop a novel statistical method based upon maximum likelihood regression which will then be applied to these experiments in order to produce a quantitative description of the coding properties of the human MTL. In general, the method is applicable to any experiment in which a sequence of stimuli is presented to an organism while the binary responses of a large number of cells are recorded in parallel. The central concept underlying the approach is the total probability that a neuron responds to a random stimulus, called the neuronal sparsity. The model then estimates the distribution of response probabilities across the population of cells. Applying the method to single-unit recordings from the human medial temporal lobe, estimates of the sparsity distributions are acquired in four regions: the hippocampus, the entorhinal cortex, the amygdala, and the parahippocampal cortex. The resulting distributions are found to be sparse (a large fraction of cells with a low response probability) and highly non-uniform, with a large proportion of ultra-sparse neurons that possess a very low response probability, and a smaller population of cells which respond much more frequently. Ramifications of the results are discussed in relation to the sparse coding hypothesis, and comparisons are made between the statistics of the human medial temporal lobe cells and place cells observed in the rodent hippocampus.

  2. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (greater than about 5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes greater than about 5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of such an eruption at 1.4×10⁻⁵.
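
    As a hedged arithmetic check on figures of this kind: if repose intervals are exponentially distributed with mean mu years, the probability of at least one event in the next year is 1 - exp(-1/mu), which is approximately 1/mu. The repose time used below (roughly 7,000 years) is assumed purely to illustrate the order of magnitude of the quoted 1.4×10⁻⁴ annual probability; it is not taken from the paper.

      # Annual eruption probability under an exponential repose-time model
      import math

      mu = 7000.0  # assumed mean repose time in years (illustrative only)
      print(1 - math.exp(-1.0 / mu))  # ~1.4e-4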

  3. Burst wait time simulation of CALIBAN reactor at delayed super-critical state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbert, P.; Authier, N.; Richard, B.

    2012-07-01

    In the past, the super prompt critical wait time probability distribution was measured on the CALIBAN fast burst reactor [4]. Afterwards, these experiments were simulated with very good agreement by solving the non-extinction probability equation [5]. Recently, the burst wait time probability distribution has been measured at CEA-Valduc on CALIBAN at different delayed super-critical states [6]. However, in the delayed super-critical case the non-extinction probability does not give access to the wait time distribution. In this case it is necessary to compute the time dependent evolution of the full neutron count number probability distribution. In this paper we present the point model deterministic method used to calculate the probability distribution of the wait time before a prescribed count level, taking into account prompt neutrons and delayed neutron precursors. This method is based on the solution of the time dependent adjoint Kolmogorov master equations for the number of detections using the generating function methodology [8,9,10] and inverse discrete Fourier transforms. The obtained results are then compared to the measurements and Monte-Carlo calculations based on the algorithm presented in [7]. (authors)

  4. Steady state, relaxation and first-passage properties of a run-and-tumble particle in one-dimension

    NASA Astrophysics Data System (ADS)

    Malakar, Kanaya; Jemseena, V.; Kundu, Anupam; Vijay Kumar, K.; Sabhapandit, Sanjib; Majumdar, Satya N.; Redner, S.; Dhar, Abhishek

    2018-04-01

    We investigate the motion of a run-and-tumble particle (RTP) in one dimension. We find the exact probability distribution of the particle with and without diffusion on the infinite line, as well as in a finite interval. In the infinite domain, this probability distribution approaches a Gaussian form in the long-time limit, as in the case of a regular Brownian particle. At intermediate times, this distribution exhibits unexpected multi-modal forms. In a finite domain, the probability distribution reaches a steady-state form with peaks at the boundaries, in contrast to a Brownian particle. We also study the relaxation to the steady-state analytically. Finally we compute the survival probability of the RTP in a semi-infinite domain with an absorbing boundary condition at the origin. In the finite interval, we compute the exit probability and the associated exit times. We provide numerical verification of our analytical results.

  5. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
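
    A hedged, simplified illustration of the kind of exact computation the paper describes, for the Onemax case only: if the current string has f ones out of n bits and each bit flips independently with probability p, the new fitness is f - X + Y with X ~ Binomial(f, p) and Y ~ Binomial(n - f, p), so the exact fitness distribution is a convolution whose entries are polynomials in p. The function below is an assumption of this sketch, not the paper's Krawtchouk-polynomial derivation.

      # Exact Onemax fitness distribution after uniform bit-flip mutation
      import numpy as np
      from scipy.stats import binom

      def onemax_mutation_pmf(n, f, p):
          # pmf[v] = P(new fitness = v) for a parent with f ones out of n bits
          pmf = np.zeros(n + 1)
          for x in range(f + 1):            # ones flipped to zero
              for y in range(n - f + 1):    # zeros flipped to one
                  pmf[f - x + y] += binom.pmf(x, f, p) * binom.pmf(y, n - f, p)
          return pmf

      print(onemax_mutation_pmf(n=10, f=7, p=0.1).round(4))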

  6. University Students' Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics.

    PubMed

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution, the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students' conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, the Randomness and Probability Test in the Context of Evolution (RaProEvo) and the Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, and the Rasch partial-credit model was then applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students' conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. © 2017 D. Fiedler et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    ERIC Educational Resources Information Center

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  8. Will current probabilistic climate change information, as such, improve adaptation?

    NASA Astrophysics Data System (ADS)

    Lopez, A.; Smith, L. A.

    2012-04-01

    Probabilistic climate scenarios are currently being provided to end users, to employ as probabilities in adaptation decision making, with the explicit suggestion that they quantify the impacts of climate change relevant to a variety of sectors. These "probabilities" are, however, rather sensitive to the assumptions in, and the structure of, the modelling approaches used to generate them. It is often argued that stakeholders require probabilistic climate change information to adequately evaluate and plan adaptation pathways. On the other hand, some circumstantial evidence suggests that on-the-ground decision making rarely uses well defined probability distributions of climate change as inputs. Nevertheless, it is within this context of probability distributions of climate change that we discuss possible drawbacks of supplying information that, while presented as robust and decision relevant, is in fact unlikely to be so due to known flaws both in the underlying models and in the methodology used to "account for" those known flaws. How might one use a probability forecast that is expected to change in the future, not due to a refinement in our information but due to fundamental flaws in its construction? What then are the alternatives? While the answer will depend on the context of the problem at hand, a good approach will be strongly informed by the timescale of the given planning decision, and the consideration of all the non-climatic factors that have to be taken into account in the corresponding risk assessment. Using a water resources system as an example, we illustrate an alternative approach to deal with these challenges and make robust adaptation decisions today.

  9. Nested Sampling for Bayesian Model Comparison in the Context of Salmonella Disease Dynamics

    PubMed Central

    Dybowski, Richard; McKinley, Trevelyan J.; Mastroeni, Pietro; Restif, Olivier

    2013-01-01

    Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered. PMID:24376528

  10. An Optimization-Based Framework for the Transformation of Incomplete Biological Knowledge into a Probabilistic Structure and Its Application to the Utilization of Gene/Protein Signaling Pathways in Discrete Phenotype Classification.

    PubMed

    Esfahani, Mohammad Shahrokh; Dougherty, Edward R

    2015-01-01

    Phenotype classification via genomic data is hampered by small sample sizes that negatively impact classifier design. Utilization of prior biological knowledge in conjunction with training data can improve both classifier design and error estimation via the construction of the optimal Bayesian classifier. In the genomic setting, gene/protein signaling pathways provide a key source of biological knowledge. Although these pathways are neither complete nor regulatory, and have no timing associated with them, they are capable of constraining the set of possible models representing the underlying interaction between molecules. The aim of this paper is to provide a framework and the mathematical tools to transform signaling pathways to prior probabilities governing uncertainty classes of feature-label distributions used in classifier design. Structural motifs extracted from the signaling pathways are mapped to a set of constraints on a prior probability on a Multinomial distribution. Because the Dirichlet distribution is the conjugate prior for the Multinomial distribution, we propose optimization paradigms to estimate the parameters of a Dirichlet prior in the Bayesian setting. The performance of the proposed methods is tested on two widely studied pathways: mammalian cell cycle and a p53 pathway model.

  11. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
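
    A hedged sketch of the signal-processing pipeline described above, with a synthetic trace standing in for a clinical EEG record: the envelope is taken as the magnitude of the analytic signal from the Hilbert transform, and lognormal and gamma candidates are compared by log-likelihood. The function names and the toy signal are assumptions of this sketch.

      # Envelope extraction and distribution fitting (illustrative only)
      import numpy as np
      from scipy.signal import hilbert
      from scipy import stats

      def envelope(signal):
          # instantaneous amplitude = magnitude of the analytic signal
          return np.abs(hilbert(signal))

      def compare_fits(env):
          lognorm_params = stats.lognorm.fit(env, floc=0)
          gamma_params = stats.gamma.fit(env, floc=0)
          return {
              "lognorm": np.sum(stats.lognorm.logpdf(env, *lognorm_params)),
              "gamma": np.sum(stats.gamma.logpdf(env, *gamma_params)),
          }

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 5000)
      trace = np.sin(2 * np.pi * 10 * t) * rng.lognormal(0.0, 0.5, t.size)
      print(compare_fits(envelope(trace)))  # larger log-likelihood = better fit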

  12. Flood Frequency Curves - Use of information on the likelihood of extreme floods

    NASA Astrophysics Data System (ADS)

    Faber, B.

    2011-12-01

    Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a Log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How do changes in the watershed or climate over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoid these assumptions vary from estimating trend and shift and removing them from early data (and so forming a homogeneous data set), to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically-based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
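
    A hedged sketch of the conventional fit mentioned above (Bulletin 17B style): a Log-Pearson Type III distribution fitted by the method of moments to the base-10 logarithms of annual peak flows, with quantiles obtained from the standardized Pearson Type III frequency factor. The flow values below are invented for illustration; real studies use long gaged records and additional adjustments (e.g., regional skew weighting) not shown here.

      # Log-Pearson Type III flood quantile by the method of moments
      import numpy as np
      from scipy import stats

      def lp3_quantile(peak_flows, return_period):
          logq = np.log10(peak_flows)
          mean, std = logq.mean(), logq.std(ddof=1)
          skew = stats.skew(logq, bias=False)
          # frequency factor from a standardized Pearson III with the sample skew
          k_t = stats.pearson3.ppf(1 - 1 / return_period, skew)
          return 10 ** (mean + k_t * std)

      flows = np.array([120, 340, 95, 410, 280, 150, 520, 230, 180, 390,
                        610, 260, 140, 330, 470, 210, 300, 175, 255, 445])
      print(lp3_quantile(flows, return_period=100))  # ~1% annual chance flood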

  13. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  14. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
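
    A hedged worked example of the kind of question quoted above, using the standard conditional-distribution formula for a bivariate normal: given height H = h, the weight W is normal with mean mu_W + rho*sigma_W*(h - mu_H)/sigma_H and standard deviation sigma_W*sqrt(1 - rho^2). All parameter values below are illustrative, not taken from the adolescent dataset used in the paper.

      # Conditional probability under an assumed bivariate normal model
      from scipy.stats import norm

      mu_h, mu_w = 64.0, 130.0   # mean height (in) and weight (lb), assumed
      sd_h, sd_w = 3.0, 20.0     # standard deviations, assumed
      rho = 0.6                  # height-weight correlation, assumed

      h = mu_h                   # condition on average height
      cond_mean = mu_w + rho * sd_w * (h - mu_h) / sd_h
      cond_sd = sd_w * (1 - rho ** 2) ** 0.5

      p = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
      print(f"P(120 < W < 140 | H = {h}) = {p:.3f}")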

  15. Characteristics of service requests and service processes of fire and rescue service dispatch centers: analysis of real world data and the underlying probability distributions.

    PubMed

    Krueger, Ute; Schimmelpfeng, Katja

    2013-03-01

    A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This rests on the underlying assumption that each caller decides independently whether or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus, these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
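
    A hedged sketch of the overdispersion comparison described above: the Pólya distribution is the negative binomial, so a simple check is to fit both Poisson and negative binomial models to call counts per interval (here by the method of moments) and compare log-likelihoods. The counts below are synthetic, not the Cottbus data.

      # Poisson vs. Polya (negative binomial) fit to call counts per interval
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      counts = rng.negative_binomial(n=3, p=0.4, size=1000)  # synthetic, overdispersed

      mean, var = counts.mean(), counts.var(ddof=1)
      ll_poisson = stats.poisson.logpmf(counts, mean).sum()

      # Negative binomial moments: mean = r(1-p)/p, var = r(1-p)/p^2  (needs var > mean)
      p_hat = mean / var
      r_hat = mean * p_hat / (1 - p_hat)
      ll_polya = stats.nbinom.logpmf(counts, r_hat, p_hat).sum()

      print(f"log-likelihood Poisson: {ll_poisson:.1f}, Polya/neg. binomial: {ll_polya:.1f}")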

  16. Optimization of European call options considering physical delivery network and reservoir operation rules

    NASA Astrophysics Data System (ADS)

    Cheng, Wei-Chen; Hsu, Nien-Sheng; Cheng, Wen-Ming; Yeh, William W.-G.

    2011-10-01

    This paper develops alternative strategies for European call options for water purchase under hydrological uncertainties that can be used by water resources managers for decision making. Each alternative strategy maximizes its own objective over a selected sequence of future hydrology that is characterized by exceedance probability. Water trade provides flexibility and enhances water distribution system reliability. However, water trade between two parties in a regional water distribution system involves many issues, such as delivery network, reservoir operation rules, storage space, demand, water availability, uncertainty, and any existing contracts. An option is a security giving the right to buy or sell an asset; in our case, the asset is water. We extend a flow path-based water distribution model to include reservoir operation rules. The model simultaneously considers both the physical distribution network as well as the relationships between water sellers and buyers. We first test the model extension. Then we apply the proposed optimization model for European call options to the Tainan water distribution system in southern Taiwan. The formulation lends itself to a mixed integer linear programming model. We use the weighting method to formulate a composite function for a multiobjective problem. The proposed methodology provides water resources managers with an overall picture of water trade strategies and the consequence of each strategy. The results from the case study indicate that the strategy associated with a streamflow exceedance probability of 50% or smaller should be adopted as the reference strategy for the Tainan water distribution system.

  17. Mathematical Model to estimate the wind power using four-parameter Burr distribution

    NASA Astrophysics Data System (ADS)

    Liu, Sanming; Wang, Zhijie; Pan, Zhaoxu

    2018-03-01

    When the actual probability distribution of wind speed at a given site needs to be described, the four-parameter Burr distribution is more suitable than other distributions. This paper introduces its important properties and characteristics. The application of the four-parameter Burr distribution in wind speed prediction is also discussed, and the expression for the probability distribution of the output power of a wind turbine is derived.
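
    A hedged sketch of the modelling step described above: a four-parameter Burr XII distribution (two shape parameters plus location and scale) fitted to wind-speed data, then pushed through a simple turbine power curve by Monte Carlo to approximate the distribution of output power. The wind sample, the power-curve parameters, and the function names are assumptions of this sketch, not the paper's model.

      # Four-parameter Burr fit and resulting turbine output distribution
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      wind = stats.weibull_min.rvs(2.0, scale=8.0, size=2000, random_state=rng)  # stand-in data

      c, d, loc, scale = stats.burr12.fit(wind, floc=0)  # four-parameter Burr XII fit
      print("Burr XII parameters:", c, d, loc, scale)

      def power_curve(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
          # toy piecewise power curve (MW): cubic ramp between cut-in and rated speed
          v = np.asarray(v)
          p = np.where((v >= v_in) & (v < v_rated),
                       p_rated * ((v - v_in) / (v_rated - v_in)) ** 3, 0.0)
          return np.where((v >= v_rated) & (v < v_out), p_rated, p)

      v_samples = stats.burr12.rvs(c, d, loc=loc, scale=scale, size=10000, random_state=rng)
      print("expected output (MW):", power_curve(v_samples).mean())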

  18. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the problems with the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we also know it is continuous, and we work through full examples to illustrate the approach.

  19. Historical emissions critical for mapping decarbonization pathways

    NASA Astrophysics Data System (ADS)

    Majkut, J.; Kopp, R. E.; Sarmiento, J. L.; Oppenheimer, M.

    2016-12-01

    Policymakers have set a goal of limiting temperature increase from human influence on the climate. This motivates the identification of decarbonization pathways to stabilize atmospheric concentrations of CO2. In this context, the future behavior of CO2 sources and sinks defines the CO2 emissions necessary to meet warming thresholds with specified probabilities. We adopt a simple model of the atmosphere-land-ocean carbon balance to reflect uncertainty in how natural CO2 sinks will respond to increasing atmospheric CO2 and temperature. Bayesian inversion is used to estimate the probability distributions of selected parameters of the carbon model. Prior probability distributions are chosen to reflect the behavior of CMIP5 models. We then update these prior distributions by running historical simulations of the global carbon cycle and inverting with observationally-based inventories and fluxes of anthropogenic carbon in the ocean and atmosphere. The result is a best estimate of historical CO2 sources and sinks and a model of how CO2 sources and sinks will vary in the future under various emissions scenarios, with uncertainty. By linking the carbon model to a simple climate model, we calculate emissions pathways and carbon budgets consistent with meeting specific temperature thresholds and identify key factors that contribute to remaining uncertainty. In particular, we show how the assumed history of CO2 emissions from land use change (LUC) critically impacts estimates of the strength of the land CO2 sink via CO2 fertilization. Different estimates of historical LUC emissions taken from the literature lead to significantly different parameterizations of the carbon system. High historical CO2 emissions from LUC lead to a more robust CO2 fertilization effect, significantly lower future atmospheric CO2 concentrations, and an increased amount of CO2 that can be emitted to satisfy temperature stabilization targets. Thus, in our model, historical LUC emissions have a significant impact on allowable carbon budgets under temperature targets.

  20. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    USGS Publications Warehouse

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package, which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities instead of vaguely defined indices.

  1. Comparison of Traditional and Open-Access Appointment Scheduling for Exponentially Distributed Service Time.

    PubMed

    Yan, Chongjun; Tang, Jiafu; Jiang, Bowen; Fung, Richard Y K

    2015-01-01

    This paper compares the performance measures of traditional appointment scheduling (AS) with those of an open-access appointment scheduling (OA-AS) system with exponentially distributed service time. A queueing model is formulated for the traditional AS system with no-show probability. The OA-AS models assume that all patients who call before the session begins will show up for the appointment on time. Two types of OA-AS systems are considered: one with a same-session policy and one with a same-or-next-session policy. Numerical results indicate that the superiority of OA-AS systems is not as obvious as it is under deterministic scenarios. The same-session system has a threshold of relative waiting cost, after which the traditional system always has higher total costs, and the same-or-next-session system is always preferable, except when the no-show probability or the weight of patients' waiting is low. It is concluded that open-access policies can be viewed as alternative approaches to mitigate the negative effects of no-show patients.

  2. A multi-agent intelligent environment for medical knowledge.

    PubMed

    Vicari, Rosa M; Flores, Cecilia D; Silvestre, André M; Seixas, Louise J; Ladeira, Marcelo; Coelho, Helder

    2003-03-01

    AMPLIA is a multi-agent intelligent learning environment designed to support training of diagnostic reasoning and modelling of domains with complex and uncertain knowledge. AMPLIA focuses on the medical area. It is a system that deals with uncertainty under the Bayesian network approach, where learner-modelling tasks will consist of creating a Bayesian network for a problem the system will present. The construction of a network involves qualitative and quantitative aspects. The qualitative part concerns the network topology, that is, the causal relations among the domain variables. After it is ready, the quantitative part is specified, consisting of the conditional probability distributions of the variables represented. A negotiation process (managed by an intelligent MediatorAgent) will treat the differences in topology and probability distribution between the model the learner built and the one built into the system. That negotiation process occurs between the agent that represents the expert domain knowledge (DomainAgent) and the agent that represents the learner knowledge (LearnerAgent).

  3. Estimates of the low-level wind shear and turbulence in the vicinity of Kennedy International Airport on 24 June 1975

    NASA Technical Reports Server (NTRS)

    Lewellen, W. S.; Williamson, G. G.

    1976-01-01

    A study was conducted to estimate the type of wind and turbulence distributions which may have existed at the time of the crash of Eastern Airlines Flight 66 while attempting to land. A number of different wind and turbulence profiles are predicted for the site and date of the crash. The morning and mid-afternoon predictions are in reasonably good agreement with the magnitude and direction as reported by the weather observer. Although precise predictions cannot be made during the passage of the thunderstorm which coincided with the time of the accident, a number of different profiles which might exist under or in the vicinity of a thunderstorm are presented. The most probable profile predicts the mean headwind shear over a 100 m (300 foot) altitude change and the average fluctuations about the mean headwind distribution. This combination of means and fluctuations leads to a reasonable probability that the instantaneous headwind shear would equal the maximum value reported in the flight recorder data.

  4. Patterns of a spatial exploration under time evolution of the attractiveness: Persistent nodes, degree distribution, and spectral properties

    NASA Astrophysics Data System (ADS)

    da Silva, Roberto

    2018-06-01

    This work explores the features of a graph generated by agents that hop from one node to another, where the nodes have an attractiveness that evolves in time. The jumps are governed by Boltzmann-like transition probabilities that depend both on the Euclidean distance between the nodes and on the ratio (β) of the attractiveness between them. It is shown that persistent nodes, i.e., nodes that have never been reached by this special random walk, are possible in the stationary limit, unlike the case where the attractiveness is fixed and equal to one for all nodes (β = 1). The spectral properties and the statistics related to the attractiveness and degree distribution of the evolutionary network are also investigated. Finally, the crossover between the persistent and non-persistent phases was studied, and a special type of transition probability was observed that leads to power-law behaviour in the time evolution of the persistence.

  5. Bayesian predictive power: choice of prior and some recommendations for its use as probability of success in drug development.

    PubMed

    Rufibach, Kaspar; Burger, Hans Ulrich; Abt, Markus

    2016-09-01

    Bayesian predictive power, the expectation of the power function with respect to a prior distribution for the true underlying effect size, is routinely used in drug development to quantify the probability of success of a clinical trial. Choosing the prior is crucial for the properties and interpretability of Bayesian predictive power. We review recommendations on the choice of prior for Bayesian predictive power and explore its features as a function of the prior. The density of power values induced by a given prior is derived analytically and its shape characterized. We find that for a typical clinical trial scenario, this density has a u-shape very similar, but not equal, to a β-distribution. Alternative priors are discussed, and practical recommendations to assess the sensitivity of Bayesian predictive power to its input parameters are provided. Copyright © 2016 John Wiley & Sons, Ltd.
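
    A hedged numerical illustration of the quantity discussed above: Bayesian predictive power as the power of a two-arm z-test averaged over a normal prior on the true effect size. The sample size, outcome standard deviation, and prior below are invented for illustration and do not correspond to the paper's scenarios.

      # Bayesian predictive power by numerical integration over the prior
      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      n_per_arm = 100
      sigma = 1.0                               # known outcome standard deviation
      se = sigma * np.sqrt(2 / n_per_arm)       # standard error of the mean difference
      z_alpha = stats.norm.ppf(0.975)           # two-sided 5% test

      def power(delta):
          # probability of rejecting H0 when the true effect is delta
          return stats.norm.sf(z_alpha - delta / se) + stats.norm.cdf(-z_alpha - delta / se)

      prior = stats.norm(loc=0.3, scale=0.2)    # prior on the true effect size
      bpp, _ = quad(lambda d: power(d) * prior.pdf(d), -2, 2)
      print(f"Bayesian predictive power = {bpp:.3f}")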

  6. Relating centrality to impact parameter in nucleus-nucleus collisions

    NASA Astrophysics Data System (ADS)

    Das, Sruthy Jyothi; Giacalone, Giuliano; Monard, Pierre-Amaury; Ollitrault, Jean-Yves

    2018-01-01

    In ultrarelativistic heavy-ion experiments, one estimates the centrality of a collision by using a single observable, say n, typically given by the transverse energy or the number of tracks observed in a dedicated detector. The correlation between n and the impact parameter b of the collision is then inferred by fitting a specific model of the collision dynamics, such as the Glauber model, to experimental data. The goal of this paper is to assess precisely which information about b can be extracted from data without any specific model of the collision. Under the sole assumption that the probability distribution of n for a fixed b is Gaussian, we show that the probability distribution of the impact parameter in a narrow centrality bin can be accurately reconstructed up to 5% centrality. We apply our methodology to data from the Relativistic Heavy Ion Collider and the Large Hadron Collider. We propose a simple measure of the precision of the centrality determination, which can be used to compare different experiments.

  7. Robustness of survival estimates for radio-marked animals

    USGS Publications Warehouse

    Bunck, C.M.; Chen, C.-L.

    1992-01-01

    Telemetry techniques are often used to study the survival of birds and mammals, particularly when mark-recapture approaches are unsuitable. Both parametric and nonparametric methods to estimate survival have been developed or modified from other applications. An implicit assumption in these approaches is that the probability of re-locating an animal with a functioning transmitter is one. A Monte Carlo study was conducted to determine the bias and variance of the Kaplan-Meier estimator and of an estimator based on the assumption of constant hazard, and to evaluate the performance of the two-sample tests associated with each. Modifications of each estimator which allow a re-location probability of less than one are described and evaluated. Generally the unmodified estimators were biased but had lower variance. At low sample sizes all estimators performed poorly. Under the null hypothesis, the distribution of all test statistics reasonably approximated the null distribution when survival was low but not when it was high. The power of the two-sample tests was similar.
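
    A hedged sketch of the nonparametric estimator referred to above, under the idealized assumption that every animal with a functioning transmitter is re-located (re-location probability equal to one); the modifications for re-location probabilities below one are not shown. The function name and example data are illustrative.

      # Kaplan-Meier survival estimate for telemetry-style data
      import numpy as np

      def kaplan_meier(times, events):
          # times: follow-up durations; events: 1 = death observed, 0 = censored
          times = np.asarray(times, dtype=float)
          events = np.asarray(events, dtype=int)
          curve, s = [], 1.0
          for t in np.unique(times):
              at_risk = np.sum(times >= t)
              deaths = np.sum((times == t) & (events == 1))
              if deaths:
                  s *= 1.0 - deaths / at_risk
              curve.append((t, s))
          return curve

      print(kaplan_meier([3, 5, 5, 8, 12, 14], [1, 0, 1, 1, 0, 1]))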

  8. Breakdown of the classical description of a local system.

    PubMed

    Kot, Eran; Grønbech-Jensen, Niels; Nielsen, Bo M; Neergaard-Nielsen, Jonas S; Polzik, Eugene S; Sørensen, Anders S

    2012-06-08

    We provide a straightforward demonstration of a fundamental difference between classical and quantum mechanics for a single local system: namely, the absence of a joint probability distribution of the position x and momentum p. Elaborating on a recently reported criterion by Bednorz and Belzig [Phys. Rev. A 83, 052113 (2011)], we derive a simple criterion that must be fulfilled for any joint probability distribution in classical physics. We demonstrate the violation of this criterion using the homodyne measurement of a single photon state, thus proving a straightforward signature of the breakdown of a classical description of the underlying state. Most importantly, the criterion used does not rely on quantum mechanics and can thus be used to demonstrate nonclassicality of systems not immediately apparent to exhibit quantum behavior. The criterion is directly applicable to any system described by the continuous canonical variables x and p, such as a mechanical or an electrical oscillator and a collective spin of a large ensemble.

  9. Local linear estimation of concordance probability with application to covariate effects models on association for bivariate failure-time data.

    PubMed

    Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing

    2015-01-01

    Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.

  10. Mode switching in volcanic seismicity: El Hierro 2011-2013

    NASA Astrophysics Data System (ADS)

    Roberts, Nick S.; Bell, Andrew F.; Main, Ian G.

    2016-05-01

    The Gutenberg-Richter b value is commonly used in volcanic eruption forecasting to infer material or mechanical properties from earthquake distributions. Such studies typically analyze discrete time windows or phases, but the choice of such windows is subjective and can introduce significant bias. Here we minimize this sample bias by iteratively sampling catalogs with randomly chosen windows and then stacking the resulting probability density functions for the estimated b value to determine a net probability density function. We examine data from the El Hierro seismic catalog during a period of unrest in 2011-2013 and demonstrate clear multimodal behavior. Individual modes are relatively stable in time, but the most probable b value intermittently switches between modes, one of which is similar to that of tectonic seismicity. Multimodality is primarily associated with intermittent activation and cessation of activity in different parts of the volcanic system rather than with any systematic inferred underlying process.
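
    A hedged sketch of the window-stacking idea described above, applied to a synthetic catalog: the b value in each randomly chosen window is estimated with Aki's maximum-likelihood formula, and the window estimates are pooled into one empirical distribution. The completeness magnitude, window length, and synthetic catalog are assumptions of this sketch, not the El Hierro analysis.

      # Stacked maximum-likelihood b-value estimates over random catalog windows
      import numpy as np

      def b_value(magnitudes, m_c):
          # Aki (1965) estimator: b = log10(e) / (mean(M) - Mc), for M >= Mc
          m = magnitudes[magnitudes >= m_c]
          return np.log10(np.e) / (m.mean() - m_c)

      def stacked_b_values(magnitudes, m_c, n_windows=1000, window=200, seed=0):
          rng = np.random.default_rng(seed)
          starts = rng.integers(0, len(magnitudes) - window, size=n_windows)
          return np.array([b_value(magnitudes[s:s + window], m_c) for s in starts])

      # Synthetic Gutenberg-Richter catalog with b = 1 above Mc = 2.0
      rng = np.random.default_rng(3)
      mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=5000)
      print(np.percentile(stacked_b_values(mags, m_c=2.0), [5, 50, 95]))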

  11. A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime

    NASA Astrophysics Data System (ADS)

    Sciarretta, Antonio

    2018-01-01

    This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model only uses integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime under the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated with the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly-prepared ensembles of particles. The scenarios considered to assess the model comprise the free particle, constant external force, harmonic oscillator, particle in a box, delta potential, particle on a ring, and particle on a sphere, and include quantization of energy levels and angular momentum, as well as momentum entanglement.

  12. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  13. Employing Sensitivity Derivatives for Robust Optimization under Uncertainty in CFD

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Putko, Michele M.; Taylor, Arthur C., III

    2004-01-01

    A robust optimization is demonstrated on a two-dimensional inviscid airfoil problem in subsonic flow. Given uncertainties in statistically independent, random, normally distributed flow parameters (input variables), an approximate first-order statistical moment method is employed to represent the Computational Fluid Dynamics (CFD) code outputs as expected values with variances. These output quantities are used to form the objective function and constraints. The constraints are cast in probabilistic terms; that is, the probability that a constraint is satisfied is greater than or equal to some desired target probability. Gradient-based robust optimization of this stochastic problem is accomplished through use of both first and second-order sensitivity derivatives. For each robust optimization, the effect of increasing both input standard deviations and target probability of constraint satisfaction is demonstrated. This method provides a means for incorporating uncertainty when considering small deviations from input mean values.
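
    A hedged sketch of the first-order statistical moment method named above, using a toy analytic function in place of a CFD code: the output mean is approximated by evaluating at the input means, and the output variance by combining squared sensitivity derivatives with the input variances of independent inputs. The function, input names, and numbers are assumptions of this sketch.

      # First-order propagation of input means and variances through a model
      import numpy as np

      def model(x):
          mach, alpha = x
          return 0.1 + 0.8 * alpha + 0.3 * mach * alpha   # toy aerodynamic output

      def sensitivities(x, h=1e-6):
          # central finite-difference derivatives d(model)/dx_i
          g = np.zeros_like(x)
          for i in range(len(x)):
              xp, xm = x.copy(), x.copy()
              xp[i] += h
              xm[i] -= h
              g[i] = (model(xp) - model(xm)) / (2 * h)
          return g

      mu = np.array([0.7, 2.0])        # mean Mach number and angle of attack (assumed)
      sigma = np.array([0.02, 0.1])    # input standard deviations (assumed)

      mean_out = model(mu)                                    # first-order mean
      var_out = np.sum(sensitivities(mu) ** 2 * sigma ** 2)   # first-order variance
      print(f"E[f] = {mean_out:.4f}, Var[f] = {var_out:.6f}")
      # A probabilistic constraint P(f <= limit) can then be approximated by
      # treating f as Normal(mean_out, sqrt(var_out)).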

  14. Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.

    2018-04-01

    Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.

  15. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
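
    A hedged illustration related to the gamma-variable construction above, via the superstatistical route mentioned in the abstract rather than the paper's exact mapping: mixing an exponential rate over a gamma distribution with shape k yields a q-exponential (Lomax) law with q = (k + 2)/(k + 1). The parameter values and the Kolmogorov-Smirnov check are assumptions of this sketch.

      # q-exponential (Lomax) samples from gamma-mixed exponential rates
      import numpy as np
      from scipy import stats

      k, beta = 3.0, 2.0                      # gamma shape and rate (assumed)
      rng = np.random.default_rng(4)

      rates = rng.gamma(shape=k, scale=1.0 / beta, size=100000)
      x = rng.exponential(scale=1.0 / rates)  # X | rate ~ Exponential(rate)

      # The marginal of X is Lomax(shape=k, scale=beta), i.e. a q-exponential
      ks_stat, p_value = stats.kstest(x, 'lomax', args=(k, 0, beta))
      print(f"q = {(k + 2) / (k + 1):.3f}, KS statistic = {ks_stat:.4f}, p = {p_value:.3f}")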

  16. Methods to elicit probability distributions from experts: a systematic review of reported practice in health technology assessment.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2013-11-01

    Elicitation is a technique that can be used to obtain probability distributions from experts about unknown quantities. We conducted a methodology review of reports where probability distributions had been elicited from experts to be used in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data were abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficient in detail and this impacts on the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantage of eliciting probability distributions from experts may be lost.

  17. A mass reconstruction technique for a heavy resonance decaying to τ⁺τ⁻

    NASA Astrophysics Data System (ADS)

    Xia, Li-Gang

    2016-11-01

    For a resonance decaying to τ⁺τ⁻, it is difficult to reconstruct its mass accurately because of the presence of neutrinos in the decay products of the τ leptons. If the resonance is heavy enough, we show that its mass can be well determined by the momentum component of the τ decay products perpendicular to the velocity of the τ lepton, p⊥, and the mass of the visible/invisible decay products, m_vis/inv, for τ decaying to hadrons/leptons. By sampling all kinematically allowed values of p⊥ and m_vis/inv according to their joint probability distributions determined by the MC simulations, the mass of the mother resonance is assumed to lie at the position with the maximal probability. Since p⊥ and m_vis/inv are invariant under the boost in the τ lepton direction, the joint probability distributions are independent of the τ's origin. Thus, this technique is able to determine the mass of an unknown resonance with no efficiency loss. It is tested using MC simulations of the physics processes pp → Z/h(125)/h(750) + X → ττ + X at 13 TeV. The ratio of the full width at half maximum and the peak value of the reconstructed mass distribution is found to be 20%-40% using the information of missing transverse energy. Supported by General Financial Grant from the China Postdoctoral Science Foundation (2015M581062)

  18. Predictions from star formation in the multiverse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bousso, Raphael; Leichenauer, Stefan

    2010-03-15

    We compute trivariate probability distributions in the landscape, scanning simultaneously over the cosmological constant, the primordial density contrast, and spatial curvature. We consider two different measures for regulating the divergences of eternal inflation, and three different models for observers. In one model, observers are assumed to arise in proportion to the entropy produced by stars; in the others, they arise at a fixed time (5 or 10×10⁹ years) after star formation. The star formation rate, which underlies all our observer models, depends sensitively on the three scanning parameters. We employ a recently developed model of star formation in the multiverse, a considerable refinement over previous treatments of the astrophysical and cosmological properties of different pocket universes. For each combination of observer model and measure, we display all single and bivariate probability distributions, both with the remaining parameter(s) held fixed and marginalized. Our results depend only weakly on the observer model but more strongly on the measure. Using the causal diamond measure, the observed parameter values (or bounds) lie within the central 2σ of nearly all probability distributions we compute, and always within 3σ. This success is encouraging and rather nontrivial, considering the large size and dimension of the parameter space. The causal patch measure gives similar results as long as curvature is negligible. If curvature dominates, the causal patch leads to a novel runaway: it prefers a negative value of the cosmological constant, with the smallest magnitude available in the landscape.

  19. Partitioning into hazard subregions for regional peaks-over-threshold modeling of heavy precipitation

    NASA Astrophysics Data System (ADS)

    Carreau, J.; Naveau, P.; Neppel, L.

    2017-05-01

    The French Mediterranean is subject to intense precipitation events occurring mostly in autumn. These can potentially cause flash floods, the main natural danger in the area. The distribution of these events follows specific spatial patterns, i.e., some sites are more likely to be affected than others. The peaks-over-threshold approach consists in modeling extremes, such as heavy precipitation, by the generalized Pareto (GP) distribution. The shape parameter of the GP controls the probability of extreme events and can be related to the hazard level of a given site. When interpolating across a region, the shape parameter should reproduce the observed spatial patterns of the probability of heavy precipitation. However, the shape parameter estimators have high uncertainty which might hide the underlying spatial variability. As a compromise, we choose to let the shape parameter vary in a moderate fashion. More precisely, we assume that the region of interest can be partitioned into subregions with constant hazard level. We formalize the model as a conditional mixture of GP distributions. We develop a two-step inference strategy based on probability weighted moments and put forward a cross-validation procedure to select the number of subregions. A synthetic data study reveals that the inference strategy is consistent and not very sensitive to the selected number of subregions. An application on daily precipitation data from the French Mediterranean shows that the conditional mixture of GPs outperforms two interpolation approaches (with constant or smoothly varying shape parameter).
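
    A minimal sketch of the peaks-over-threshold step on synthetic daily precipitation, using a maximum-likelihood generalized Pareto fit from SciPy rather than the probability-weighted-moment inference and conditional mixture developed in the paper; all numbers are illustrative.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Hypothetical daily precipitation record (mm); heavy-tailed for illustration only.
      rain = rng.gamma(shape=0.4, scale=8.0, size=20_000)

      u = np.quantile(rain, 0.95)           # threshold
      exceedances = rain[rain > u] - u      # peaks over threshold

      # Fit a generalized Pareto distribution to the exceedances (location fixed at 0).
      shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)
      print(f"threshold = {u:.1f} mm, GP shape = {shape:.3f}, scale = {scale:.2f}")

      # Level exceeded on average once every 1000 exceedances, from the fitted tail.
      print("1-in-1000-exceedance level:",
            u + stats.genpareto.ppf(1 - 1 / 1000, shape, loc=0, scale=scale))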

  20. Maximum Entropy Principle for Transportation

    NASA Astrophysics Data System (ADS)

    Bilich, F.; DaSilva, R.

    2008-11-01

    In this work we deal with modeling of the transportation phenomenon for use in the transportation planning process and policy-impact studies. The model developed is based on the dependence concept, i.e., the notion that the probability of a trip starting at origin i is dependent on the probability of a trip ending at destination j given that the factors (such as travel time, cost, etc.) which affect travel between origin i and destination j assume some specific values. The derivation of the solution of the model employs the maximum entropy principle combining a priori multinomial distribution with a trip utility concept. This model is utilized to forecast trip distributions under a variety of policy changes and scenarios. The dependence coefficients are obtained from a regression equation where the functional form is derived based on conditional probability and perception of factors from experimental psychology. The dependence coefficients encode all the information that was previously encoded in the form of constraints. In addition, the dependence coefficients encode information that cannot be expressed in the form of constraints for practical reasons, namely, computational tractability. The equivalence between the standard formulation (i.e., objective function with constraints) and the dependence formulation (i.e., without constraints) is demonstrated. The parameters of the dependence-based trip-distribution model are estimated, and the model is also validated using commercial air travel data in the U.S. In addition, policy impact analyses (such as allowance of supersonic flights inside the U.S. and user surcharge at noise-impacted airports) on air travel are performed.
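
    For orientation, a textbook doubly-constrained entropy-maximizing trip-distribution model (the constrained "standard formulation" referred to above, not the dependence formulation of the paper) can be solved by iterative balancing; the origin/destination totals and costs below are made up.

      import numpy as np

      # Hypothetical inputs: trips produced at 3 origins, attracted to 3 destinations,
      # and a travel-cost matrix (all values illustrative).
      O = np.array([400.0, 300.0, 300.0])
      D = np.array([500.0, 250.0, 250.0])
      cost = np.array([[1.0, 2.0, 3.0],
                       [2.0, 1.0, 2.0],
                       [3.0, 2.0, 1.0]])
      beta = 0.8                          # cost-sensitivity parameter

      # Entropy-maximizing solution: T_ij = A_i B_j O_i D_j exp(-beta * c_ij),
      # with balancing factors A_i, B_j obtained by iteration.
      f = np.exp(-beta * cost)
      A, B = np.ones(3), np.ones(3)
      for _ in range(200):
          A = 1.0 / (f @ (B * D))
          B = 1.0 / (f.T @ (A * O))

      T = (A * O)[:, None] * (B * D)[None, :] * f
      print(T.round(1))
      print("row sums:", T.sum(axis=1).round(1), "column sums:", T.sum(axis=0).round(1))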

  1. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x₁ = f₁(U₁, ..., Uₙ), ..., xₙ = fₙ(U₁, ..., Uₙ) such that if U₁, ..., Uₙ are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x₁, ..., xₙ coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
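
    The recursion described above is, in essence, sequential conditional inverse-transform sampling; a minimal sketch for a hypothetical two-dimensional target (X1 exponential, X2 conditionally exponential given X1), not the paper's general construction, is:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      u1, u2 = rng.uniform(size=(2, n))

      # Step 1: invert the marginal CDF of X1 (here X1 ~ Exp(1)).
      x1 = -np.log(1.0 - u1)

      # Step 2: invert the conditional CDF of X2 given X1
      # (here X2 | X1 = x1 ~ Exp(rate = 1 + x1), a hypothetical joint model).
      x2 = -np.log(1.0 - u2) / (1.0 + x1)

      # Sanity check: E[X2 | X1 near 1] should be close to 1 / (1 + 1) = 0.5.
      mask = (x1 > 0.9) & (x1 < 1.1)
      print("empirical E[X2 | X1 ~ 1]:", x2[mask].mean())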

  2. Continuous-time quantum walks on star graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salimi, S.

    2009-06-15

    In this paper, we investigate continuous-time quantum walks on star graphs. It is shown that, by the quantum central limit theorem, the continuous-time quantum walk on the N-fold star power graph, which is invariant under the quantum component of the adjacency matrix, converges to the continuous-time quantum walk on the K₂ graph (the complete graph with two vertices), and the probability of observing the walk tends to the uniform distribution.

  3. On a perturbed Sparre Andersen risk model with multi-layer dividend strategy

    NASA Astrophysics Data System (ADS)

    Yang, Hu; Zhang, Zhimin

    2009-10-01

    In this paper, we consider a perturbed Sparre Andersen risk model, in which the inter-claim times are generalized Erlang(n) distributed. Under the multi-layer dividend strategy, piece-wise integro-differential equations for the discounted penalty functions are derived, and a recursive approach is applied to express the solutions. A numerical example to calculate the ruin probabilities is given to illustrate the solution procedure.

  4. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
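
    A schematic of the Monte Carlo combination step (sample each factor from its distribution, apply a functional relationship, tabulate a CCDF); the factors, distributions and relationship below are hypothetical placeholders, not values from the safety analysis.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      # Hypothetical accident-scenario factors (distributions chosen only for illustration).
      source_term = rng.lognormal(mean=0.0, sigma=1.0, size=n)    # released activity
      release_fraction = rng.beta(2.0, 8.0, size=n)               # respirable fraction
      dose_factor = rng.lognormal(mean=-2.0, sigma=0.5, size=n)   # exposure-to-dose factor

      # Hypothetical functional relationship combining the factors into a consequence.
      consequences = source_term * release_fraction * dose_factor

      # Complementary cumulative distribution function: P(consequence >= c).
      for c in np.logspace(-3, 0, 4):
          print(f"P(consequence >= {c:.1e}) = {(consequences >= c).mean():.3e}")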

  5. Distribution of leached radioactive material in the Legin Group Area, San Miguel County, Colorado

    USGS Publications Warehouse

    Rogers, Allen S.

    1950-01-01

    Radioactivity anomalies, which are small in magnitude, and probably are not caused by extensions of known uranium-vanadium ore bodies, were detected during the gamma-ray logging of diamond-drill holes in the Legin group of claims, southwest San Miguel County, Colo. The positions of these anomalies are at the top surfaces of mudstone strata within, and at the base of, the ore-bearing sandstone of the Salt Wash member of the Morrison formation. The distribution of these anomalies suggests that ground water has leached radioactive material from the ore bodies and has carried it down dip and laterally along the top surfaces of underlying impermeable mudstone strata for distances as great as 300 feet. The anomalies are probably caused by radon and its daughter elements. Preliminary tests indicate that radon in quantities up to 10⁻⁷ curies per liter may be present in ground water flowing along sandstone-mudstone contacts under carnotite ore bodies. In comparison, the radium content of the same water is less than 10⁻¹⁰ curies per liter. Further substantiation of the relationship between ore bodies, the movement of water, and the radon-caused anomalies may greatly increase the scope of gamma-ray logs of drill holes as an aid to prospecting.

  6. Mosquito control insecticides: a probabilistic ecological risk assessment on drift exposures of naled, dichlorvos (naled metabolite) and permethrin to adult butterflies.

    PubMed

    Hoang, T C; Rand, G M

    2015-01-01

    A comprehensive probabilistic terrestrial ecological risk assessment (ERA) was conducted to characterize the potential risk of mosquito control insecticide (i.e., naled, its metabolite dichlorvos, and permethrin) usage to adult butterflies in south Florida by comparing the probability distributions of environmental exposure concentrations following actual mosquito control applications at labeled rates from ten field monitoring studies with the probability distributions of butterfly species response (effects) data from our laboratory acute toxicity studies. The overlap of these distributions was used as a measure of risk to butterflies. The long-term viability (survival) of adult butterflies, following topical (thorax/wings) exposures was the environmental value we wanted to protect. Laboratory acute toxicity studies (24-h LD50) included topical exposures (thorax and wings) to five adult butterfly species and preparation of species sensitivity distributions (SSDs). The ERA indicated that the assessment endpoint of protection, of at least 90% of the species, 90% of the time (or the 10th percentile from the acute SSDs) from acute naled and permethrin exposures, is most likely not occurring when considering topical exposures to adults. Although the surface areas for adulticide exposures are greater for the wings, exposures to the thorax provide the highest potential for risk (i.e., SSD 10th percentile is lowest) for adult butterflies. Dichlorvos appeared to present no risk. The results of this ERA can be applied to other areas of the world, where these insecticides are used and where butterflies may be exposed. Since there are other sources (e.g., agriculture) of pesticides in the environment, where butterfly exposures will occur, the ERA may under-estimate the potential risks under real-world conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
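
    A hedged sketch of the core species-sensitivity-distribution calculation (log-normal fit to acute LD50s, 10th percentile HC10, comparison against an exposure distribution); every number below is invented for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)

      # Hypothetical acute toxicity values (LD50, ug per butterfly) for five species.
      ld50 = np.array([0.5, 1.2, 2.0, 3.5, 8.0])

      # Species sensitivity distribution (SSD): log-normal fit to the LD50s.
      mu, sigma = np.mean(np.log(ld50)), np.std(np.log(ld50), ddof=1)
      hc10 = np.exp(mu + sigma * stats.norm.ppf(0.10))   # 10th percentile (HC10)

      # Hypothetical field exposure distribution (topical dose per butterfly).
      exposure = rng.lognormal(mean=np.log(0.3), sigma=1.0, size=100_000)

      print(f"HC10 = {hc10:.2f} ug, P(exposure > HC10) = {(exposure > hc10).mean():.2%}")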

  7. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10⁻⁴ probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.

  8. From anomalies to forecasts: Toward a descriptive model of decisions under risk, under ambiguity, and from experience.

    PubMed

    Erev, Ido; Ert, Eyal; Plonsky, Ori; Cohen, Doron; Cohen, Oded

    2017-07-01

    Experimental studies of choice behavior document distinct, and sometimes contradictory, deviations from maximization. For example, people tend to overweight rare events in 1-shot decisions under risk, and to exhibit the opposite bias when they rely on past experience. The common explanations of these results assume that the contradicting anomalies reflect situation-specific processes that involve the weighting of subjective values and the use of simple heuristics. The current article analyzes 14 choice anomalies that have been described by different models, including the Allais, St. Petersburg, and Ellsberg paradoxes, and the reflection effect. Next, it uses a choice prediction competition methodology to clarify the interaction between the different anomalies. It focuses on decisions under risk (known payoff distributions) and under ambiguity (unknown probabilities), with and without feedback concerning the outcomes of past choices. The results demonstrate that it is not necessary to assume situation-specific processes. The distinct anomalies can be captured by assuming high sensitivity to the expected return and 4 additional tendencies: pessimism, bias toward equal weighting, sensitivity to payoff sign, and an effort to minimize the probability of immediate regret. Importantly, feedback increases sensitivity to probability of regret. Simple abstractions of these assumptions, variants of the model Best Estimate and Sampling Tools (BEAST), allow surprisingly accurate ex ante predictions of behavior. Unlike the popular models, BEAST does not assume subjective weighting functions or cognitive shortcuts. Rather, it assumes the use of sampling tools and reliance on small samples, in addition to the estimation of the expected values. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. An evaluation of procedures to estimate monthly precipitation probabilities

    NASA Astrophysics Data System (ADS)

    Legates, David R.

    1991-01-01

    Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform normal probability density functions) are comparatively examined to determine their ability to represent accurately variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.
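
    A minimal sketch of the transform-normal idea using the standard Box-Cox transform from SciPy (the study itself evaluates a modified Box-Cox form); the synthetic monthly totals are illustrative only.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      # Synthetic monthly precipitation totals (mm), strictly positive as Box-Cox requires.
      monthly = rng.gamma(shape=2.0, scale=30.0, size=100) + 0.1

      # Box-Cox transform (lambda estimated by maximum likelihood), then a normal fit.
      transformed, lam = stats.boxcox(monthly)
      mu, sd = stats.norm.fit(transformed)

      # Non-exceedance probability of a 100 mm month under the fitted model.
      x = 100.0
      x_t = (x**lam - 1.0) / lam if lam != 0 else np.log(x)
      print(f"lambda = {lam:.2f}, P(precip <= {x:.0f} mm) = {stats.norm.cdf(x_t, mu, sd):.3f}")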

  10. Notes on SAW Tag Interrogation Techniques

    NASA Technical Reports Server (NTRS)

    Barton, Richard J.

    2010-01-01

    We consider the problem of interrogating a single SAW RFID tag with a known ID and known range in the presence of multiple interfering tags under the following assumptions: (1) The RF propagation environment is well approximated as a simple delay channel with geometric power-decay constant α ≥ 2. (2) The interfering tag IDs are unknown but well approximated as independent, identically distributed random samples from a probability distribution of tag ID waveforms with known second-order properties, and the tag of interest is drawn independently from the same distribution. (3) The ranges of the interfering tags are unknown but well approximated as independent, identically distributed realizations of a random variable ρ with a known probability distribution f_ρ, and the tag ranges are independent of the tag ID waveforms. In particular, we model the tag waveforms as random impulse responses from a wide-sense-stationary, uncorrelated-scattering (WSSUS) fading channel with known bandwidth and scattering function. A brief discussion of the properties of such channels and the notation used to describe them in this document is given in the Appendix. Under these assumptions, we derive the expression for the output signal-to-noise ratio (SNR) for an arbitrary combination of transmitted interrogation signal and linear receiver filter. Based on this expression, we derive the optimal interrogator configuration (i.e., transmitted signal/receiver filter combination) in the two extreme noise/interference regimes, i.e., noise-limited and interference-limited, under the additional assumption that the coherence bandwidth of the tags is much smaller than the total tag bandwidth. Finally, we evaluate the performance of both optimal interrogators over a broad range of operating scenarios using both numerical simulation based on the assumed model and Monte Carlo simulation based on a small sample of measured tag waveforms. The performance evaluation results not only provide guidelines for proper interrogator design, but also provide some insight on the validity of the assumed signal model. It should be noted that the assumption that the impulse response of the tag of interest is known precisely implies that the temperature and range of the tag are also known precisely, which is generally not the case in practice. However, analyzing interrogator performance under this simplifying assumption is much more straightforward and still provides a great deal of insight into the nature of the problem.

  11. Secure Distributed Detection under Energy Constraint in IoT-Oriented Sensor Networks.

    PubMed

    Zhang, Guomei; Sun, Hao

    2016-12-16

    We study the secure distributed detection problems under energy constraint for IoT-oriented sensor networks. The conventional channel-aware encryption (CAE) is an efficient physical-layer secure distributed detection scheme in light of its energy efficiency, good scalability and robustness over diverse eavesdropping scenarios. However, in the CAE scheme, it remains an open problem of how to optimize the key thresholds for the estimated channel gain, which are used to determine the sensor's reporting action. Moreover, the CAE scheme does not jointly consider the accuracy of local detection results in determining whether to stay dormant for a sensor. To solve these problems, we first analyze the error probability and derive the optimal thresholds in the CAE scheme under a specified energy constraint. These results build a convenient mathematic framework for our further innovative design. Under this framework, we propose a hybrid secure distributed detection scheme. Our proposal can satisfy the energy constraint by keeping some sensors inactive according to the local detection confidence level, which is characterized by likelihood ratio. In the meanwhile, the security is guaranteed through randomly flipping the local decisions forwarded to the fusion center based on the channel amplitude. We further optimize the key parameters of our hybrid scheme, including two local decision thresholds and one channel comparison threshold. Performance evaluation results demonstrate that our hybrid scheme outperforms the CAE under stringent energy constraints, especially in the high signal-to-noise ratio scenario, while the security is still assured.

  12. Secure Distributed Detection under Energy Constraint in IoT-Oriented Sensor Networks

    PubMed Central

    Zhang, Guomei; Sun, Hao

    2016-01-01

    We study the secure distributed detection problems under energy constraint for IoT-oriented sensor networks. The conventional channel-aware encryption (CAE) is an efficient physical-layer secure distributed detection scheme in light of its energy efficiency, good scalability and robustness over diverse eavesdropping scenarios. However, in the CAE scheme, it remains an open problem of how to optimize the key thresholds for the estimated channel gain, which are used to determine the sensor’s reporting action. Moreover, the CAE scheme does not jointly consider the accuracy of local detection results in determining whether to stay dormant for a sensor. To solve these problems, we first analyze the error probability and derive the optimal thresholds in the CAE scheme under a specified energy constraint. These results build a convenient mathematic framework for our further innovative design. Under this framework, we propose a hybrid secure distributed detection scheme. Our proposal can satisfy the energy constraint by keeping some sensors inactive according to the local detection confidence level, which is characterized by likelihood ratio. In the meanwhile, the security is guaranteed through randomly flipping the local decisions forwarded to the fusion center based on the channel amplitude. We further optimize the key parameters of our hybrid scheme, including two local decision thresholds and one channel comparison threshold. Performance evaluation results demonstrate that our hybrid scheme outperforms the CAE under stringent energy constraints, especially in the high signal-to-noise ratio scenario, while the security is still assured. PMID:27999282

  13. Increasing power-law range in avalanche amplitude and energy distributions

    NASA Astrophysics Data System (ADS)

    Navas-Portella, Víctor; Serra, Isabel; Corral, Álvaro; Vives, Eduard

    2018-02-01

    Power-law-type probability density functions spanning several orders of magnitude are found for different avalanche properties. We propose a methodology to overcome empirical constraints that limit the range of truncated power-law distributions. By considering catalogs of events that cover different observation windows, the maximum likelihood estimation of a global power-law exponent is computed. This methodology is applied to amplitude and energy distributions of acoustic emission avalanches in failure-under-compression experiments of a nanoporous silica glass, finding in some cases global exponents in an unprecedented broad range: 4.5 decades for amplitudes and 9.5 decades for energies. In the latter case, however, strict statistical analysis suggests experimental limitations might alter the power-law behavior.
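
    For comparison, the textbook maximum-likelihood estimator for an untruncated power-law tail (not the global multi-window estimator developed in the paper) is easy to state and to check on synthetic data.

      import numpy as np

      def powerlaw_mle(x, xmin):
          """Hill-type MLE of alpha for p(x) ~ x**(-alpha), x >= xmin (untruncated tail)."""
          tail = np.asarray(x, dtype=float)
          tail = tail[tail >= xmin]
          alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))
          stderr = (alpha - 1.0) / np.sqrt(tail.size)
          return alpha, stderr

      # Check on synthetic data with a known exponent (alpha = 2.5, xmin = 1).
      rng = np.random.default_rng(6)
      u = rng.uniform(size=100_000)
      samples = (1.0 - u) ** (-1.0 / (2.5 - 1.0))    # inverse-CDF sampling
      print(powerlaw_mle(samples, xmin=1.0))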

  14. Exact Large-Deviation Statistics for a Nonequilibrium Quantum Spin Chain

    NASA Astrophysics Data System (ADS)

    Žnidarič, Marko

    2014-01-01

    We consider a one-dimensional XX spin chain in a nonequilibrium setting with a Lindblad-type boundary driving. By calculating large-deviation rate function in the thermodynamic limit, a generalization of free energy to a nonequilibrium setting, we obtain a complete distribution of current, including closed expressions for lower-order cumulants. We also identify two phase-transition-like behaviors in either the thermodynamic limit, at which the current probability distribution becomes discontinuous, or at maximal driving, when the range of possible current values changes discontinuously. In the thermodynamic limit the current has a finite upper and lower bound. We also explicitly confirm nonequilibrium fluctuation relation and show that the current distribution is the same under mapping of the coupling strength Γ→1/Γ.

  15. Increasing power-law range in avalanche amplitude and energy distributions.

    PubMed

    Navas-Portella, Víctor; Serra, Isabel; Corral, Álvaro; Vives, Eduard

    2018-02-01

    Power-law-type probability density functions spanning several orders of magnitude are found for different avalanche properties. We propose a methodology to overcome empirical constraints that limit the range of truncated power-law distributions. By considering catalogs of events that cover different observation windows, the maximum likelihood estimation of a global power-law exponent is computed. This methodology is applied to amplitude and energy distributions of acoustic emission avalanches in failure-under-compression experiments of a nanoporous silica glass, finding in some cases global exponents in an unprecedented broad range: 4.5 decades for amplitudes and 9.5 decades for energies. In the latter case, however, strict statistical analysis suggests experimental limitations might alter the power-law behavior.

  16. A numerical 4D Collision Risk Model

    NASA Astrophysics Data System (ADS)

    Schmitt, Pal; Culloch, Ross; Lieber, Lilian; Kregting, Louise

    2017-04-01

    With the growing number of marine renewable energy (MRE) devices being installed across the world, some concern has been raised about the possibility of harming mobile, marine fauna by collision. Although physical contact between an MRE device and an organism has not been reported to date, these novel sub-sea structures pose a challenge for accurately estimating collision risks as part of environmental impact assessments. Even if the animal motion is simplified to linear translation, ignoring likely evasive behaviour, the mathematical problem of establishing an impact probability is not trivial. We present a numerical algorithm to obtain such probability distributions using transient, four-dimensional simulations of a novel marine renewable device concept, Deep Green, Minesto's power plant and hereafter referred to as the 'kite' that flies in a figure-of-eight configuration. Simulations were carried out altering several configurations including kite depth, kite speed and kite trajectory while keeping the speed of the moving object constant. Since the kite assembly is defined as two parts in the model, a tether (attached to the seabed) and the kite, collision risk of each part is reported independently. By comparing the number of collisions with the number of collision-free simulations, a probability of impact for each simulated position in the cross-section of the area is considered. Results suggest that close to the bottom, where the tether amplitude is small, the path is always blocked and the impact probability is 100% as expected. However, higher up in the water column, the collision probability is twice as high in the mid line, where the tether passes twice per period, than at the extremes of its trajectory. The collision probability distribution is much more complex in the upper end of the water column, where the kite and tether can simultaneously collide with the object. Results demonstrate the viability of such models, which can also incorporate empirical field data for assessing the probability of collision risk of animals with an MRE device under varying operating conditions.

  17. q-Gaussian distributions of leverage returns, first stopping times, and default risk valuations

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Tian, Li

    2013-10-01

    We study the probability distributions of daily leverage returns of 520 North American industrial companies that survive de-listing during the financial crisis, 2006-2012. We provide evidence that distributions of unbiased leverage returns of all individual firms belong to the class of q-Gaussian distributions with the Tsallis entropic parameter within the interval 1

  18. Cryptographic robustness of practical quantum cryptography: BB84 key distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N.

    2008-07-15

    In real fiber-optic quantum cryptography systems, the avalanche photodiodes are not perfect, the source of quantum states is not a single-photon one, and the communication channel is lossy. For these reasons, key distribution is impossible under certain conditions for the system parameters. A simple analysis is performed to find relations between the parameters of real cryptography systems and the length of the quantum channel that guarantee secure quantum key distribution when the eavesdropper's capabilities are limited only by fundamental laws of quantum mechanics while the devices employed by the legitimate users are based on current technologies. Critical values are determined for the rate of secure real-time key generation that can be reached under the current technology level. Calculations show that the upper bound on channel length can be as high as 300 km for imperfect photodetectors (avalanche photodiodes) with present-day quantum efficiency (η ≈ 20%) and dark count probability (p_dark ≈ 10⁻⁷).

  19. Cryptographic robustness of practical quantum cryptography: BB84 key distribution protocol

    NASA Astrophysics Data System (ADS)

    Molotkov, S. N.

    2008-07-01

    In real fiber-optic quantum cryptography systems, the avalanche photodiodes are not perfect, the source of quantum states is not a single-photon one, and the communication channel is lossy. For these reasons, key distribution is impossible under certain conditions for the system parameters. A simple analysis is performed to find relations between the parameters of real cryptography systems and the length of the quantum channel that guarantee secure quantum key distribution when the eavesdropper’s capabilities are limited only by fundamental laws of quantum mechanics while the devices employed by the legitimate users are based on current technologies. Critical values are determined for the rate of secure real-time key generation that can be reached under the current technology level. Calculations show that the upper bound on channel length can be as high as 300 km for imperfect photodetectors (avalanche photodiodes) with present-day quantum efficiency (η ≈ 20%) and dark count probability (p_dark ∼ 10⁻⁷).

  20. Closer look at time averages of the logistic map at the edge of chaos

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Tsallis, Constantino; Beck, Christian

    2009-05-01

    The probability distribution of sums of iterates of the logistic map at the edge of chaos has been recently shown [U. Tirnakli, Phys. Rev. E 75, 040106(R) (2007)] to be numerically consistent with a q-Gaussian, the distribution which—under appropriate constraints—maximizes the nonadditive entropy S_q, which is the basis of nonextensive statistical mechanics. This analysis was based on a study of the tails of the distribution. We now check the entire distribution, in particular, its central part. This is important in view of a recent q-generalization of the central limit theorem, which states that for certain classes of strongly correlated random variables the rescaled sum approaches a q-Gaussian limit distribution. We numerically investigate for the logistic map with a parameter in a small vicinity of the critical point under which conditions there is convergence to a q-Gaussian both in the central region and in the tail region and find a scaling law involving the Feigenbaum constant δ. Our results are consistent with a large body of already available analytical and numerical evidence that the edge of chaos is well described in terms of the entropy S_q and its associated concepts.

  1. Applications of the principle of maximum entropy: from physics to ecology.

    PubMed

    Banavar, Jayanth R; Maritan, Amos; Volkov, Igor

    2010-02-17

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.

  2. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed in Python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and judiciously determine if there is a need for data trimming and at which points it should be done.
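
    A small stand-in illustration using SciPy's built-in ex-Gaussian (exponnorm, shape K = τ/σ) rather than ExGUtils itself, on synthetic reaction times; the parameter values are arbitrary.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      # Synthetic reaction times (ms): Gaussian component plus exponential component.
      mu, sigma, tau = 400.0, 40.0, 120.0
      rt = rng.normal(mu, sigma, size=5000) + rng.exponential(tau, size=5000)

      # Maximum-likelihood fit of the ex-Gaussian; SciPy parameterizes it as
      # exponnorm(K, loc, scale) with K = tau / sigma, loc = mu, scale = sigma.
      K, loc, scale = stats.exponnorm.fit(rt)
      print(f"fitted mu = {loc:.1f}, sigma = {scale:.1f}, tau = {K * scale:.1f}")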

  3. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka; Slosar, Anze

    2017-01-01

    The Lyman-alpha forest has become a powerful cosmological probe of the underlying matter distribution at high redshift. It is a highly non-linear field with much information present beyond the two-point statistics of the power spectrum. The flux probability distribution function (PDF) in particular has been used as a successful probe of small-scale physics. In addition to the cosmological evolution however, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which lead to possible biased estimators. Here we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over the binned PDF as is commonly done. Since the n-th coefficient can be expressed as a linear combination of the first n moments of the field, this allows for the coefficients to be measured in the presence of noise and allows for a clear route towards marginalization over the mean flux. In addition, we use hydrodynamic cosmological simulations to demonstrate that in the presence of noise, a finite number of these coefficients are well measured with a very sharp transition into noise dominance. This compresses the information into a finite small number of well-measured quantities.
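
    A hedged numerical sketch of the stated moment relation: after rescaling the flux to [-1, 1], the n-th Legendre coefficient of the PDF equals (2n + 1)/2 times the expectation of P_n, i.e. a linear combination of the first n moments. The toy field below is a clipped Gaussian, not a simulated forest.

      import numpy as np
      from scipy.special import eval_legendre

      rng = np.random.default_rng(8)
      # Toy "flux" field rescaled to the interval [-1, 1] (illustrative only).
      flux = np.clip(rng.normal(0.2, 0.3, size=200_000), -1.0, 1.0)

      # For a density f on [-1, 1], f(x) = sum_n c_n P_n(x) with
      # c_n = (2n + 1) / 2 * E[P_n(X)], estimated here by sample averages.
      coeffs = [(2 * n + 1) / 2.0 * eval_legendre(n, flux).mean() for n in range(8)]
      print(np.round(coeffs, 4))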

  4. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    PubMed Central

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed in Python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and judiciously determine if there is a need for data trimming and at which points it should be done. PMID:29765345

  5. The Approximate Bayesian Computation methods in the localization of the atmospheric contamination source

    NASA Astrophysics Data System (ADS)

    Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.

    2015-09-01

    In many areas of application, a central problem is the solution of an inverse problem, especially the precise estimation of unknown model parameters describing the underlying dynamics of a physical system. In this situation, Bayesian inference is a powerful tool for combining observed data with prior knowledge to obtain the probability distribution of the searched parameters. We have applied the modern methodology named Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing the atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamical systems. Sequential methods can significantly increase the efficiency of the ABC. In the presented algorithm, the input data are the on-line arriving concentrations of the released substance registered by a distributed sensor network from the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm outputs are the probability distributions of the contamination source parameters, i.e., its location, release rate, speed and direction of movement, start time and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of the model best fitted to the observable data should be found.
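
    A minimal rejection-ABC sketch (the simplest member of the ABC family, not the sequential S-ABC algorithm used in the paper), with a one-parameter hypothetical forward model standing in for the dispersion model and purely illustrative "sensor" readings.

      import numpy as np

      rng = np.random.default_rng(9)

      def forward_model(release_rate, rng):
          """Hypothetical sensor response to a given release rate (toy stand-in
          for the atmospheric dispersion model)."""
          return release_rate * np.array([1.0, 0.6, 0.3]) + rng.normal(0.0, 0.3, size=3)

      observed = np.array([5.2, 2.9, 1.6])     # illustrative sensor-network readings
      n_draws, eps = 50_000, 1.0

      # Rejection ABC: draw candidate release rates from the prior, simulate the
      # sensors, and keep draws whose simulated readings lie within eps of the data.
      prior_draws = rng.uniform(0.0, 20.0, size=n_draws)
      kept = [q for q in prior_draws
              if np.linalg.norm(forward_model(q, rng) - observed) < eps]

      print(f"accepted {len(kept)} draws, posterior mean release rate ~ {np.mean(kept):.2f}")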

  6. Decisions with Uncertain Consequences—A Total Ordering on Loss-Distributions

    PubMed Central

    König, Sandra; Schauer, Stefan

    2016-01-01

    Decisions are often based on imprecise, uncertain or vague information. Likewise, the consequences of an action are often equally unpredictable, thus putting the decision maker into a twofold jeopardy. Assuming that the effects of an action can be modeled by a random variable, then the decision problem boils down to comparing different effects (random variables) by comparing their distribution functions. Although the full space of probability distributions cannot be ordered, a properly restricted subset of distributions can be totally ordered in a practically meaningful way. We call these loss-distributions, since they provide a substitute for the concept of loss-functions in decision theory. This article introduces the theory behind the necessary restrictions and the hereby constructible total ordering on random loss variables, which enables decisions under uncertainty of consequences. Using data obtained from simulations, we demonstrate the practical applicability of our approach. PMID:28030572

  7. Probability weighted moments: Definition and relation to parameters of several distributions expressable in inverse form

    USGS Publications Warehouse

    Greenwood, J. Arthur; Landwehr, J. Maciunas; Matalas, N.C.; Wallis, J.R.

    1979-01-01

    Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.
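
    A hedged sketch of the standard unbiased sample estimator of the probability weighted moments b_r = E[X F(X)^r]; the further mapping from these moments to the parameters of a particular inverse-form distribution is not shown here.

      import numpy as np
      from scipy.special import comb

      def sample_pwm(x, r):
          """Unbiased estimator of b_r = E[X * F(X)**r] from an i.i.d. sample."""
          x = np.sort(np.asarray(x, dtype=float))
          n = x.size
          i = np.arange(1, n + 1)                  # ranks of the order statistics
          w = comb(i - 1, r) / comb(n - 1, r)      # weight C(i-1, r) / C(n-1, r)
          return np.mean(w * x)

      rng = np.random.default_rng(10)
      sample = rng.gumbel(loc=0.0, scale=1.0, size=50_000)
      print([round(sample_pwm(sample, r), 3) for r in range(3)])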

  8. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  9. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  10. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method performed similarly for the regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters affecting ET0 can affect the distribution of reference evapotranspiration.
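
    A small sketch of the two ingredients named above, a Pearson type III fit and a probability plot correlation coefficient, computed with SciPy on synthetic annual ET0 totals (values are illustrative only).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      # Synthetic annual reference evapotranspiration totals (mm), illustrative only.
      et0 = rng.gamma(shape=50.0, scale=25.0, size=55)

      # Pearson type III fit (skew, location, scale).
      skew, loc, scale = stats.pearson3.fit(et0)

      # Probability plot correlation coefficient (PPCC): correlation between the
      # ordered data and the corresponding theoretical quantiles of the fitted law.
      (osm, osr), (slope, intercept, r) = stats.probplot(et0, dist=stats.pearson3,
                                                         sparams=(skew,))
      print(f"skew = {skew:.2f}, PPCC r = {r:.4f}")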

  11. Implementation of a Collision Probability Prediction Technique for Constellation Maneuver Planning

    NASA Technical Reports Server (NTRS)

    Concha, Marco a.

    2007-01-01

    On March 22, 2006, the Space Technology 5 (ST5) constellation spacecraft were successfully delivered to orbit by a Pegasus XL launch vehicle. An unexpected relative motion experienced by the constellation after orbit insertion brought about a problem. Soon after launch, the observed relative position of the inert rocket body was between the leading and the middle spacecraft within the constellation. The successful planning and execution of an orbit maneuver that would create a fly-by of the rocket body was required to establish the formation. This maneuver would create a close approach that needed to conform to predefined collision probability requirements. On April 21, 2006, the ST5 "155" spacecraft performed a large orbit maneuver and successfully passed the inert Pegasus 3rd Stage Rocket Body on April 30, 2006 15:20 UTC at a distance of 2.55 km with a Probability of Collision of less than 1.0E-06. This paper will outline the technique that was implemented to establish the safe planning and execution of the fly-by maneuver. The method makes use of Gaussian distribution models of state covariance to determine underlying probabilities of collision that arise under low-velocity encounters. Specific numerical examples used for this analysis are discussed in detail. The mechanics of this technique are explained to foster deeper understanding of the concepts presented and to improve existing processes for use in future constellation maneuver planning.

  12. Simultaneous population pharmacokinetic modelling of plasma and intracellular PBMC miltefosine concentrations in New World cutaneous leishmaniasis and exploration of exposure-response relationships.

    PubMed

    Kip, Anke E; Castro, María Del Mar; Gomez, Maria Adelaida; Cossio, Alexandra; Schellens, Jan H M; Beijnen, Jos H; Saravia, Nancy Gore; Dorlo, Thomas P C

    2018-05-10

    Leishmania parasites reside within macrophages and the direct target of antileishmanial drugs is therefore intracellular. We aimed to characterize the intracellular PBMC miltefosine kinetics by developing a population pharmacokinetic (PK) model simultaneously describing plasma and intracellular PBMC pharmacokinetics. Furthermore, we explored exposure-response relationships and simulated alternative dosing regimens. A population PK model was developed with NONMEM, based on 339 plasma and 194 PBMC miltefosine concentrations from Colombian cutaneous leishmaniasis patients [29 children (2-12 years old) and 22 adults] receiving 1.8-2.5 mg/kg/day miltefosine for 28 days. A three-compartment model with miltefosine distribution into an intracellular PBMC effect compartment best fitted the data. Intracellular PBMC distribution was described with an intracellular-to-plasma concentration ratio of 2.17 [relative standard error (RSE) 4.9%] and intracellular distribution rate constant of 1.23 day-1 (RSE 14%). In exploring exposure-response relationships, both plasma and intracellular model-based exposure estimates significantly influenced probability of cure. A proposed PK target for the area under the plasma concentration-time curve (day 0-28) of >535 mg·day/L corresponded to >95% probability of cure. In linear dosing simulations, 18.3% of children compared with 2.8% of adults failed to reach 535 mg·day/L. In children, this decreased to 1.8% after allometric dosing simulation. The developed population PK model described the rate and extent of miltefosine distribution from plasma into PBMCs. Miltefosine exposure was significantly related to probability of cure in this cutaneous leishmaniasis patient population. We propose an exploratory PK target, which should be validated in a larger cohort study.

  13. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
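
    The central quantity CUMBIN computes corresponds to the binomial survival function; a short equivalent in Python is shown for comparison (a stand-in, not the original C program).

      from scipy.stats import binom

      # Reliability of a k-out-of-n system of independent components with reliability p:
      # P(at least k of n operate) = 1 - CDF(k - 1) = binom.sf(k - 1, n, p).
      n, k, p = 10, 8, 0.95
      print(f"R(k-out-of-n) = {binom.sf(k - 1, n, p):.6f}")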

  14. A Risk Management Method for the Operation of a Supply-Chain without Storage:

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yasuhiro; Manabe, Yuuji; Nakata, Norimasa; Kusaka, Satoshi

    A business risk management method has been developed for a supply-chain without a storage function under demand uncertainty. Power supply players in the deregulated power market face the need to develop the best policies for power supply from self-production and reserved purchases to balance demand, which is predictable with error. The proposed method maximizes profit from the operation of the supply-chain under probabilistic demand uncertainty on the basis of a probabilistic programming approach. Piece-wise linear functions are employed to formulate the impact of under-booked or over-booked purchases on the supply cost, and constraints on over-demand probability are introduced to limit over-demand frequency on the basis of the demand probability distribution. The developed method has been experimentally applied to the supply policy of a power-supply-chain, the operation of which is based on a 3-stage pricing purchase contract and on 28 time zones. The characteristics of the obtained optimal supply policy are successfully captured in the numerical results, which suggest the applicability of the proposed method.

  15. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would be able to provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface. Also, in this presentation we describe the stepwise process the interface uses, using an example.

  16. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would be able to provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface. Also, in this presentation we describe the stepwise process the interface uses, using an example.

  17. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
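
    As a small companion to the lecture material, the following Python snippet illustrates the connection emphasized above between memoryless (Markovian) arrivals, exponentially distributed waiting times, and Poisson-distributed counts; the rate and observation window are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(1)
      rate, T = 3.0, 1000.0                  # arrivals per unit time, observation window

      # Memoryless arrivals: exponential waiting times accumulated into arrival times
      waits = rng.exponential(1.0 / rate, int(2 * rate * T))
      arrivals = np.cumsum(waits)
      arrivals = arrivals[arrivals < T]

      # Counts in unit-length windows should follow a Poisson(rate) distribution
      counts, _ = np.histogram(arrivals, bins=np.arange(0.0, T + 1.0))
      print("mean waiting time:", waits.mean(), "(theory:", 1.0 / rate, ")")
      print("mean, variance of counts:", counts.mean(), counts.var(), "(theory:", rate, ")")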

  18. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    This study deals with multiobjective fuzzy stochastic linear programming problems whose uncertain probability distributions are defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are used: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  19. Probable errors in width distributions of sea ice leads measured along a transect

    NASA Technical Reports Server (NTRS)

    Key, J.; Peckham, S.

    1991-01-01

    The degree of error expected in the measurement of the widths of sea ice leads along a single transect is examined in a probabilistic sense under assumed orientation and width distributions, considering both isotropic and anisotropic lead orientations. Methods are developed for estimating the distribution of 'actual' widths (measured perpendicular to the local lead orientation) knowing the 'apparent' width distribution (measured along the transect), and vice versa. The distribution of errors, defined as the difference between the actual and apparent lead width, can be estimated from the two width distributions, and all moments of this distribution can be determined. The problem is illustrated with Landsat imagery and the procedure is applied to a submarine sonar transect. Results are determined for a range of geometries and indicate the importance of orientation information if data sampled along a transect are to be used for the description of lead geometries. While the application here is to sea ice leads, the methodology can be applied to measurements of any linear feature.

  20. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  1. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for a distribution grid with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on probability discretization of the power system production cost simulation and a linearized power flow, an optimal power flow problem that minimizes the cost of conventional generation is solved, so that the reliability of the distribution grid can be assessed quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the proposed method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
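
    For comparison with the fast method described above, the baseline it improves upon can be sketched as a plain Monte Carlo computation of LOLP and EENS, with wind speed drawn from a Weibull distribution and solar irradiance from a Beta distribution. All turbine, PV, load and capacity parameters below are hypothetical placeholders, not values from the paper.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      # Weibull wind speed (shape k=2, scale c=8 m/s) and Beta(2, 2.5) solar irradiance
      wind = rng.weibull(2.0, n) * 8.0
      irr = rng.beta(2.0, 2.5, n)

      def wind_power(v, rated=2.0, v_in=3.0, v_r=12.0, v_out=25.0):
          # Simple piecewise turbine power curve (MW)
          p = np.where((v >= v_in) & (v < v_r), rated * (v - v_in) / (v_r - v_in), 0.0)
          return np.where((v >= v_r) & (v <= v_out), rated, p)

      pv_power = 1.5 * irr                  # MW, assumed linear PV model
      load = rng.normal(3.0, 0.4, n)        # MW, assumed load distribution
      conventional = 1.2                    # MW of firm conventional capacity (assumed)

      shortfall = load - (wind_power(wind) + pv_power + conventional)
      lolp = np.mean(shortfall > 0)                         # Loss Of Load Probability
      eens = np.mean(np.maximum(shortfall, 0.0)) * 8760.0   # MWh/year, Expected Energy Not Supplied
      print(f"LOLP = {lolp:.3f}, EENS = {eens:.0f} MWh/yr")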

  2. Automatic Monitoring System Design and Failure Probability Analysis for River Dikes on Steep Channel

    NASA Astrophysics Data System (ADS)

    Chang, Yin-Lung; Lin, Yi-Jun; Tung, Yeou-Koung

    2017-04-01

    The purposes of this study include: (1) designing an automatic monitoring system for river dikes; and (2) developing a framework that enables the determination of dike failure probabilities for various failure modes during a rainstorm. The historical dike failure data collected in this study indicate that most dikes in Taiwan collapsed under the 20-year return period discharge, which means the probability of dike failure is much higher than that of overtopping. We installed the dike monitoring system on the Chiu-She Dike, located on the middle reach of the Dajia River, Taiwan. The system includes: (1) vertically distributed pore water pressure sensors in front of and behind the dike; (2) Time Domain Reflectometry (TDR) to measure the displacement of the dike; (3) a wireless floating device to measure the scouring depth at the toe of the dike; and (4) a water level gauge. The monitoring system recorded the variation of pore pressure inside the Chiu-She Dike and the scouring depth during Typhoon Megi. The recorded data showed that the highest groundwater level inside the dike occurred 15 hours after the peak discharge. We developed a framework that accounts for the uncertainties in return period discharge, Manning's n, scouring depth, soil cohesion, and friction angle and enables the determination of dike failure probabilities for various failure modes such as overtopping, surface erosion, mass failure, toe sliding and overturning. The framework was applied to the Chiu-She, Feng-Chou, and Ke-Chuang Dikes on the Dajia River. The results indicate that toe sliding or overturning has a higher probability than the other failure modes. Furthermore, the overall failure probability (integrating the different failure modes) reaches 50% under the 10-year return period flood, which agrees with the historical failure data for the study reaches.

  3. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  4. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  5. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    PubMed

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
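
    The recommended approach, fitting a parametric distribution to the cumulative pattern of recovered propagules using maximum likelihood, can be sketched as an interval-censored likelihood fit. The sketch below uses a lognormal model and synthetic retention times; the sampling grid, sample size and starting values are assumptions made for illustration.

      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(3)
      # Synthetic "true" retention times (h): lognormal with median 12 h, sigma 0.6
      times = rng.lognormal(np.log(12.0), 0.6, 500)

      edges = np.arange(0.0, 49.0, 4.0)           # sampling every 4 h up to 48 h
      counts, _ = np.histogram(times, bins=edges)  # propagules recovered per interval
      # (propagules retained beyond 48 h are ignored in this simplified sketch)

      def neg_loglik(params):
          s, scale = params
          if s <= 0 or scale <= 0:
              return np.inf
          cdf = stats.lognorm.cdf(edges, s, scale=scale)
          probs = np.clip(np.diff(cdf), 1e-12, None)   # probability mass per interval
          return -np.sum(counts * np.log(probs))

      res = optimize.minimize(neg_loglik, x0=[1.0, 10.0], method="Nelder-Mead")
      print("fitted sigma and median retention time:", res.x)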

  6. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  7. Pseudo Bayes Estimates for Test Score Distributions and Chained Equipercentile Equating. Research Report. ETS RR-09-47

    ERIC Educational Resources Information Center

    Moses, Tim; Oh, Hyeonjoo J.

    2009-01-01

    Pseudo Bayes probability estimates are weighted averages of raw and modeled probabilities; these estimates have been studied primarily in nonpsychometric contexts. The purpose of this study was to evaluate pseudo Bayes probability estimates as applied to the estimation of psychometric test score distributions and chained equipercentile equating…

  8. Determination of the Crack Resistance Parameters at Equipment Nozzle Zones Under the Seismic Loads Via Finite Element Method

    NASA Astrophysics Data System (ADS)

    Kyrychok, Vladyslav; Torop, Vasyl

    2018-03-01

    The present paper addresses the assessment of probable crack growth in pressure vessel nozzle zones under cyclic seismic loads. Approaches to modeling distributed pipeline systems connected to equipment are proposed. The possibility of using different finite element program packages together for accurate estimation of the strength of coupled pipeline and pressure vessel systems is shown and justified. Based on the developed approach, the authors propose checking the severity of defects in the nozzle domain and evaluating the residual life of the system.

  9. Comparison of three-parameter probability distributions for representing annual extreme and partial duration precipitation series

    NASA Astrophysics Data System (ADS)

    Wilks, Daniel S.

    1993-10-01

    Performance of 8 three-parameter probability distributions for representing annual extreme and partial duration precipitation data at stations in the northeastern and southeastern United States is investigated. Particular attention is paid to fidelity on the right tail, through use of a bootstrap procedure simulating extrapolation on the right tail beyond the data. It is found that the beta-κ distribution best describes the extreme right tail of annual extreme series, and the beta-P distribution is best for the partial duration data. The conventionally employed two-parameter Gumbel distribution is found to substantially underestimate probabilities associated with the larger precipitation amounts for both annual extreme and partial duration data. Fitting the distributions using left-censored data did not result in improved fits to the right tail.
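
    The paper's central point, that a two-parameter Gumbel fit can understate the probability of the largest precipitation amounts relative to a heavier-tailed three-parameter fit, can be illustrated with a short comparison of tail probabilities. The data below are synthetic annual maxima, not the station records used in the study.

      from scipy import stats

      # Synthetic annual-maximum precipitation (mm) drawn from a heavy-tailed GEV
      annual_max = stats.genextreme.rvs(c=-0.2, loc=60, scale=15, size=80, random_state=4)

      gumbel_params = stats.gumbel_r.fit(annual_max)     # two-parameter fit
      gev_params = stats.genextreme.fit(annual_max)      # three-parameter fit

      x = 150.0                                          # a large precipitation amount (mm)
      print("P(X > %.0f mm): Gumbel %.4f vs GEV %.4f"
            % (x, stats.gumbel_r.sf(x, *gumbel_params), stats.genextreme.sf(x, *gev_params)))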

  10. Stochasticity and Spatial Interaction Govern Stem Cell Differentiation Dynamics

    NASA Astrophysics Data System (ADS)

    Smith, Quinton; Stukalin, Evgeny; Kusuma, Sravanti; Gerecht, Sharon; Sun, Sean X.

    2015-07-01

    Stem cell differentiation underlies many fundamental processes such as development, tissue growth and regeneration, as well as disease progression. Understanding how stem cell differentiation is controlled in mixed cell populations is an important step in developing quantitative models of cell population dynamics. Here we focus on quantifying the role of cell-cell interactions in determining stem cell fate. Toward this, we monitor stem cell differentiation in adherent cultures on micropatterns and collect statistical cell fate data. Results show high cell fate variability and a bimodal probability distribution of stem cell fraction on small (80-140 μm diameter) micropatterns. On larger (225-500 μm diameter) micropatterns, the variability is also high but the distribution of the stem cell fraction becomes unimodal. Using a stochastic model, we analyze the differentiation dynamics and quantitatively determine the differentiation probability as a function of stem cell fraction. Results indicate that stem cells can interact and sense cellular composition in their immediate neighborhood and adjust their differentiation probability accordingly. Blocking epithelial cadherin (E-cadherin) can diminish this cell-cell contact mediated sensing. For larger micropatterns, cell motility adds a spatial dimension to the picture. Taken together, we find stochasticity and cell-cell interactions are important factors in determining cell fate in mixed cell populations.

  11. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.

  12. On the SIMS Ionization Probability of Organic Molecules.

    PubMed

    Popczun, Nicholas J; Breuer, Lars; Wucher, Andreas; Winograd, Nicholas

    2017-06-01

    The prospect of improved secondary ion yields for secondary ion mass spectrometry (SIMS) experiments drives innovation of new primary ion sources, instrumentation, and post-ionization techniques. The largest factor affecting secondary ion efficiency is believed to be the poor ionization probability (α+) of sputtered material, a value rarely measured directly, but estimated to be in some cases as low as 10^-5. Our lab has developed a method for the direct determination of α+ in a SIMS experiment using laser post-ionization (LPI) to detect neutral molecular species in the sputtered plume for an organic compound. Here, we apply this method to coronene (C24H12), a polyaromatic hydrocarbon that exhibits strong molecular signal during gas-phase photoionization. A two-dimensional spatial distribution of sputtered neutral molecules is measured and presented. It is shown that the ionization probability of molecular coronene desorbed from a clean film under bombardment with 40 keV C60 cluster projectiles is of the order of 10^-3, with some remaining uncertainty arising from laser-induced fragmentation and possible differences in the emission velocity distributions of neutral and ionized molecules. In general, this work establishes a method to estimate the ionization efficiency of molecular species sputtered during a single bombardment event.

  13. Human instrumental performance in ratio and interval contingencies: A challenge for associative theory.

    PubMed

    Pérez, Omar D; Aitken, Michael R F; Zhukovsky, Peter; Soto, Fabián A; Urcelay, Gonzalo P; Dickinson, Anthony

    2016-12-15

    Associative learning theories regard the probability of reinforcement as the critical factor determining responding. However, the role of this factor in instrumental conditioning is not completely clear. In fact, free-operant experiments show that participants respond at a higher rate on variable ratio than on variable interval schedules even though the reinforcement probability is matched between the schedules. This difference has been attributed to the differential reinforcement of long inter-response times (IRTs) by interval schedules, which acts to slow responding. In the present study, we used a novel experimental design to investigate human responding under random ratio (RR) and regulated probability interval (RPI) schedules, a type of interval schedule that sets a reinforcement probability independently of the IRT duration. Participants responded on each type of schedule before a final choice test in which they distributed responding between two schedules similar to those experienced during training. Although response rates did not differ during training, the participants responded at a lower rate on the RPI schedule than on the matched RR schedule during the choice test. This preference cannot be attributed to a higher probability of reinforcement for long IRTs and questions the idea that similar associative processes underlie classical and instrumental conditioning.

  14. The Failure Models of Lead Free Sn-3.0Ag-0.5Cu Solder Joint Reliability Under Low-G and High-G Drop Impact

    NASA Astrophysics Data System (ADS)

    Gu, Jian; Lei, YongPing; Lin, Jian; Fu, HanGuang; Wu, Zhongwei

    2017-02-01

    The reliability of Sn-3.0Ag-0.5Cu (SAC 305) solder joints under a broad range of drop impact levels was studied. The failure behavior, failure probability and failure position of the solder joints were analyzed under two shock test conditions, i.e., 1000 g for 1 ms and 300 g for 2 ms. The stress distribution on the solder joint was calculated with ABAQUS. The results revealed that the dominant cause of failure was the tension due to the difference in stiffness between the printed circuit board and the ball grid array; the maximum tensile stresses of 121.1 MPa and 31.1 MPa under the 1000 g and 300 g drop impacts, respectively, were concentrated at the corner of the solder joint located in the outermost corner of the solder ball row. The failure modes were summarized into the following four types: crack initiation and propagation through the (1) intermetallic compound layer, (2) Ni layer, (3) Cu pad, or (4) Sn-matrix. The outermost corner of the solder ball row had a high failure probability under both the 1000 g and 300 g drop impacts. The number of solder ball failures under the 300 g drop impact was higher than that under the 1000 g drop impact. According to the statistical analysis, the characteristic numbers of drops to failure were 41 and 15,199, respectively.

  15. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
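
    The idea can be sketched compactly: record, from the posterior sample, how often each clade is split in a particular way given that the clade occurs, and estimate a tree's posterior probability as the product of these conditional split frequencies. The sketch below is not the paper's software; the nested-tuple tree representation and the toy sample are assumptions made for illustration.

      from collections import defaultdict

      def splits_of(tree):
          """Map each non-trivial clade (frozenset of taxa) to its split in this tree.
          Trees are nested tuples of taxon names, e.g. (("A", "B"), ("C", "D"))."""
          splits = {}
          def recurse(node):
              if isinstance(node, str):
                  return frozenset([node])
              left, right = recurse(node[0]), recurse(node[1])
              clade = left | right
              splits[clade] = frozenset([left, right])
              return clade
          recurse(tree)
          return splits

      def conditional_clade_estimator(sampled_trees):
          clade_n, split_n = defaultdict(int), defaultdict(int)
          for t in sampled_trees:
              for clade, split in splits_of(t).items():
                  clade_n[clade] += 1
                  split_n[(clade, split)] += 1

          def prob(tree):
              # Product over clades of the sampled frequency of the clade's split
              p = 1.0
              for clade, split in splits_of(tree).items():
                  if clade_n[clade] == 0:
                      return 0.0          # contains a clade never seen in the sample
                  p *= split_n[(clade, split)] / clade_n[clade]
              return p
          return prob

      # Toy usage: a "posterior sample" of three rooted trees over four taxa
      sample = [(("A", "B"), ("C", "D")),
                (("A", "B"), ("C", "D")),
                (("A", "C"), ("B", "D"))]
      estimate = conditional_clade_estimator(sample)
      print("CCD estimate:", estimate((("A", "B"), ("C", "D"))),
            "  simple relative frequency:", 2 / 3)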

  16. Intertime jump statistics of state-dependent Poisson processes.

    PubMed

    Daly, Edoardo; Porporato, Amilcare

    2007-01-01

    A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. Such a method uses the survivor function obtained by a modified version of the master equation associated to the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langan, Roisin T.; Archibald, Richard K.; Lamberti, Vincent

    We have applied a new imputation-based method for analyzing incomplete data, called Monte Carlo Bayesian Database Generation (MCBDG), to the Spent Fuel Isotopic Composition (SFCOMPO) database. About 60% of the entries are absent for SFCOMPO. The method estimates missing values of a property from a probability distribution created from the existing data for the property, and then generates multiple instances of the completed database for training a machine learning algorithm. Uncertainty in the data is represented by an empirical or an assumed error distribution. The method makes few assumptions about the underlying data, and compares favorably against results obtained by replacing missing information with constant values.
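
    The imputation-by-sampling idea described above can be sketched as follows (this is not the MCBDG code): each missing entry is drawn from the empirical distribution of the observed entries for the same property, optionally perturbed by an assumed error distribution, and several completed copies of the data set are generated.

      import numpy as np

      rng = np.random.default_rng(5)

      def generate_completed(data, n_copies=5, noise_sd=0.0):
          """data: 2-D float array with np.nan marking missing entries."""
          copies = []
          for _ in range(n_copies):
              filled = data.copy()
              for j in range(data.shape[1]):
                  observed = data[~np.isnan(data[:, j]), j]
                  missing = np.isnan(filled[:, j])
                  draws = rng.choice(observed, size=missing.sum(), replace=True)
                  filled[missing, j] = draws + rng.normal(0.0, noise_sd, missing.sum())
              copies.append(filled)
          return copies

      # Toy usage: a small table with roughly 60% of one column missing
      x = rng.normal(size=(10, 2))
      x[rng.random(10) < 0.6, 1] = np.nan
      completed = generate_completed(x, n_copies=3, noise_sd=0.1)
      print(len(completed), completed[0].shape)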

  18. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects

    PubMed Central

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable population sizes are considered where due to immigration and demographic effects the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computations of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrates that this approach is particularly well-suited for studying the impact of varying rates for immigration, births, deaths, infection, recovery from infection, and loss of immunity. PMID:27010993

  19. Structured Modeling and Analysis of Stochastic Epidemics with Immigration and Demographic Effects.

    PubMed

    Baumann, Hendrik; Sandmann, Werner

    2016-01-01

    Stochastic epidemics with open populations of variable population sizes are considered where due to immigration and demographic effects the epidemic does not eventually die out forever. The underlying stochastic processes are ergodic multi-dimensional continuous-time Markov chains that possess unique equilibrium probability distributions. Modeling these epidemics as level-dependent quasi-birth-and-death processes enables efficient computations of the equilibrium distributions by matrix-analytic methods. Numerical examples for specific parameter sets are provided, which demonstrates that this approach is particularly well-suited for studying the impact of varying rates for immigration, births, deaths, infection, recovery from infection, and loss of immunity.

  20. Confidence as Bayesian Probability: From Neural Origins to Behavior.

    PubMed

    Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F

    2015-10-07

    Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Exact probability distribution functions for Parrondo's games

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and obtained a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  2. Exact probability distribution functions for Parrondo's games.

    PubMed

    Zadourian, Rubina; Saakian, David B; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and obtained a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  3. Order-restricted inference for means with missing values.

    PubMed

    Wang, Heng; Zhong, Ping-Shou

    2017-09-01

    Missing values appear very often in many applications, but the problem of missing values has not received much attention in testing order-restricted alternatives. Under the missing at random (MAR) assumption, we impute the missing values nonparametrically using kernel regression. For data with imputation, the classical likelihood ratio test designed for testing the order-restricted means is no longer applicable since the likelihood does not exist. This article proposes a novel method for constructing test statistics for assessing means with an increasing order or a decreasing order based on jackknife empirical likelihood (JEL) ratio. It is shown that the JEL ratio statistic evaluated under the null hypothesis converges to a chi-bar-square distribution, whose weights depend on missing probabilities and nonparametric imputation. Simulation study shows that the proposed test performs well under various missing scenarios and is robust for normally and nonnormally distributed data. The proposed method is applied to an Alzheimer's disease neuroimaging initiative data set for finding a biomarker for the diagnosis of the Alzheimer's disease. © 2017, The International Biometric Society.

  4. On the Black-Scholes European Option Pricing Model Robustness and Generality

    NASA Astrophysics Data System (ADS)

    Takada, Hellinton Hatsuo; de Oliveira Siqueira, José

    2008-11-01

    The common presentation of the widely known and accepted Black-Scholes European option pricing model explicitly imposes some restrictions, such as the geometric Brownian motion assumption for the underlying stock price. In this paper, these usual restrictions are relaxed using the maximum entropy principle of information theory, Pearson's distribution system, and frictionless-market and risk-neutrality theories to calculate a unique risk-neutral probability measure calibrated with market parameters.
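
    For reference, the baseline being generalized is the standard Black-Scholes call price under the geometric Brownian motion (lognormal) assumption, which can be written in a few lines:

      from math import exp, log, sqrt
      from scipy.stats import norm

      def bs_call(S, K, T, r, sigma):
          # Black-Scholes price of a European call under geometric Brownian motion
          d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

      print(bs_call(S=100.0, K=105.0, T=0.5, r=0.03, sigma=0.25))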

  5. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  6. Modeling late rectal toxicities based on a parameterized representation of the 3D dose distribution

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike

    2011-04-01

    Many models exist for predicting toxicities based on dose-volume histograms (DVHs) or dose-surface histograms (DSHs). This approach has several drawbacks: firstly, the reduction of the dose distribution to a histogram results in the loss of spatial information, and secondly, the bins of the histograms are highly correlated with each other. Furthermore, some of the complex nonlinear models proposed in the past lack a direct physical interpretation and the ability to predict probabilities rather than binary outcomes. We propose a parameterized representation of the 3D distribution of the dose to the rectal wall which explicitly includes geometrical information in the form of the eccentricity of the dose distribution as well as its lateral and longitudinal extent. We use a nonlinear kernel-based probabilistic model to predict late rectal toxicity based on the parameterized dose distribution and assessed its predictive power using data from the MRC RT01 trial (ISCTRN 47772397). The endpoints under consideration were rectal bleeding, loose stools, and a global toxicity score. We extract simple rules identifying 3D dose patterns related to a specifically low risk of complication. Normal tissue complication probability (NTCP) models based on parameterized representations of geometrical and volumetric measures resulted in areas under the curve (AUCs) of 0.66, 0.63 and 0.67 for predicting rectal bleeding, loose stools and global toxicity, respectively. In comparison, NTCP models based on standard DVHs performed worse and resulted in AUCs of 0.59 for all three endpoints. In conclusion, we have presented low-dimensional, interpretable and nonlinear NTCP models based on the parameterized representation of the dose to the rectal wall. These models had a higher predictive power than models based on standard DVHs, and their low dimensionality allowed for the identification of 3D dose patterns related to a low risk of complication.

  7. Completion of the Edward Air Force Base Statistical Guidance Wind Tool

    NASA Technical Reports Server (NTRS)

    Dreher, Joseph G.

    2008-01-01

    The goal of this task was to develop a GUI using EAFB wind tower data similar to the KSC SLF peak wind tool that is already in operation at SMG. In 2004, MSFC personnel began work to replicate the KSC SLF tool using several wind towers at EAFB. They completed the analysis and QC of the data, but due to higher priority work did not start development of the GUI. MSFC personnel calculated wind climatologies and probabilities of 10-minute peak wind occurrence based on the 2-minute average wind speed for several EAFB wind towers. Once the data were QC'ed and analyzed, the climatologies were calculated following the methodology outlined in Lambert (2003). The climatologies were calculated for each tower and month, and then were stratified by hour, direction (10° sectors), and direction (45° sectors)/hour. For all climatologies, MSFC calculated the mean, standard deviation and observation counts of the 2-minute average and 10-minute peak wind speeds. MSFC personnel also calculated empirical and modeled probabilities of meeting or exceeding specific 10-minute peak wind speeds using PDFs. The empirical PDFs were asymmetrical and bounded on the left by the 2-minute average wind speed. They calculated the parametric PDFs by fitting the GEV distribution to the empirical distributions. Parametric PDFs were calculated in order to smooth and interpolate over variations in the observed values due to possible under-sampling of certain peak winds, and to estimate probabilities associated with average winds outside the observed range. MSFC calculated the individual probabilities of meeting or exceeding specific 10-minute peak wind speeds by integrating the area under each curve. The probabilities assist SMG forecasters in assessing the shuttle FR for various 2-minute average wind speeds. The AMU obtained the processed EAFB data from Dr. Lee Burns of MSFC and reformatted them for input to Excel PivotTables, which allow users to display different values with point-click-drag techniques. The GUI was created from the PivotTables using VBA code. It is run through a macro within Excel and allows forecasters to quickly display and interpret peak wind climatology and probabilities in a fast-paced operational environment. The GUI was designed to look and operate exactly the same as the KSC SLF tool, since SMG forecasters were already familiar with that product. SMG feedback was continually incorporated into the GUI, ensuring the end product met their needs. The final version of the GUI, along with all climatologies, PDFs, and probabilities, has been delivered to SMG and will be put into operational use.
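
    The probability calculation described above can be sketched as follows (this is not the delivered GUI): within one stratum of 2-minute average wind speed, a GEV distribution is fitted to the observed 10-minute peaks and its upper tail is integrated to obtain the probability of meeting or exceeding a peak-speed threshold. The data, stratum and threshold below are synthetic and purely illustrative.

      import numpy as np
      from scipy import stats

      avg_speed = 12.0                 # knots, one stratum of 2-minute average wind speed
      # Synthetic 10-minute peak winds for this stratum (peaks exceed the average)
      peaks = avg_speed + stats.genextreme.rvs(c=-0.1, loc=4, scale=2, size=300, random_state=6)

      params = stats.genextreme.fit(peaks)       # modeled (parametric) PDF of the peaks
      threshold = 22.0                           # knots

      p_model = stats.genextreme.sf(threshold, *params)   # area under the fitted upper tail
      p_empirical = np.mean(peaks >= threshold)
      print(f"P(peak >= {threshold} kt | avg = {avg_speed} kt): "
            f"model {p_model:.3f}, empirical {p_empirical:.3f}")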

  8. What Can Quantum Optics Say about Computational Complexity Theory?

    NASA Astrophysics Data System (ADS)

    Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.

    2015-02-01

    Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from the points of view of both quantum theory and computational complexity theory. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.

  9. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    USGS Publications Warehouse

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause increases or decreases in or the extirpation of species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.

  10. Importance of spatial autocorrelation in modeling bird distributions at a continental scale

    USGS Publications Warehouse

    Bahn, V.; O'Connor, R.J.; Krohn, W.B.

    2006-01-01

    Spatial autocorrelation in species' distributions has been recognized as inflating the probability of a type I error in hypotheses tests, causing biases in variable selection, and violating the assumption of independence of error terms in models such as correlation or regression. However, it remains unclear whether these problems occur at all spatial resolutions and extents, and under which conditions spatially explicit modeling techniques are superior. Our goal was to determine whether spatial models were superior at large extents and across many different species. In addition, we investigated the importance of purely spatial effects in distribution patterns relative to the variation that could be explained through environmental conditions. We studied distribution patterns of 108 bird species in the conterminous United States using ten years of data from the Breeding Bird Survey. We compared the performance of spatially explicit regression models with non-spatial regression models using Akaike's information criterion. In addition, we partitioned the variance in species distributions into an environmental, a pure spatial and a shared component. The spatially-explicit conditional autoregressive regression models strongly outperformed the ordinary least squares regression models. In addition, partialling out the spatial component underlying the species' distributions showed that an average of 17% of the explained variation could be attributed to purely spatial effects independent of the spatial autocorrelation induced by the underlying environmental variables. We concluded that location in the range and neighborhood play an important role in the distribution of species. Spatially explicit models are expected to yield better predictions especially for mobile species such as birds, even in coarse-grained models with a large extent. ?? Ecography.

  11. Vacuum quantum stress tensor fluctuations: A diagonalization approach

    NASA Astrophysics Data System (ADS)

    Schiappacasse, Enrico D.; Fewster, Christopher J.; Ford, L. H.

    2018-01-01

    Large vacuum fluctuations of a quantum stress tensor can be described by the asymptotic behavior of its probability distribution. Here we focus on stress tensor operators which have been averaged with a sampling function in time. The Minkowski vacuum state is not an eigenstate of the time-averaged operator, but can be expanded in terms of its eigenstates. We calculate the probability distribution and the cumulative probability distribution for obtaining a given value in a measurement of the time-averaged operator taken in the vacuum state. In these calculations, we study a specific operator that contributes to the stress-energy tensor of a massless scalar field in Minkowski spacetime, namely, the normal ordered square of the time derivative of the field. We analyze the rate of decrease of the tail of the probability distribution for different temporal sampling functions, such as compactly supported functions and the Lorentzian function. We find that the tails decrease relatively slowly, as exponentials of fractional powers, in agreement with previous work using the moments of the distribution. Our results lend additional support to the conclusion that large vacuum stress tensor fluctuations are more probable than large thermal fluctuations, and may have observable effects.

  12. Measurements of gas hydrate formation probability distributions on a quasi-free water droplet

    NASA Astrophysics Data System (ADS)

    Maeda, Nobuo

    2014-06-01

    A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. It would ideally be desirable to be able to measure gas hydrate formation probability distributions of a single water droplet or mist that is freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, and wall-wetting hydrophobic liquid to avoid contact of a water droplet with the solid walls. Here we report the development of a second generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet which sits on a perfluorocarbon oil in a container that is coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.

  13. Probabilistic approach to lysozyme crystal nucleation kinetics.

    PubMed

    Dimitrov, Ivaylo L; Hodzhaoglu, Feyzim V; Koleva, Dobryana P

    2015-09-01

    Nucleation of lysozyme crystals in quiescent solutions in a regime of progressive nucleation is investigated under an optical microscope at conditions of constant supersaturation. A method based on the stochastic nature of crystal nucleation, using discrete time sampling of small solution volumes for the presence or absence of detectable crystals, is developed. It allows probabilities of crystal detection to be estimated experimentally. One hundred single samplings were used for each probability determination, for 18 time intervals and six lysozyme concentrations. Fitting a particular probability function to the experimentally obtained data made possible the direct evaluation of stationary rates of lysozyme crystal nucleation, the time for growth of supernuclei to a detectable size, and the probability distribution of nucleation times. The obtained stationary nucleation rates were then used for the calculation of other nucleation parameters, such as the kinetic nucleation factor, nucleus size, work for nucleus formation and effective specific surface energy of the nucleus. The experimental method itself is simple and adaptable and can be used for crystal nucleation studies of arbitrary soluble substances with known solubility at particular solution conditions.
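
    The kind of fit described above can be sketched with a commonly used form for the detection probability under stationary nucleation, P(t) = 1 - exp[-J·V·(t - t_g)] for t > t_g, where J is the stationary nucleation rate, V the solution volume and t_g the growth time to a detectable size. The exact function and the numbers used in the paper may differ; the data below are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def detection_prob(t, JV, t_g):
          # P(detect a crystal by time t) under stationary nucleation
          return np.where(t > t_g, 1.0 - np.exp(-JV * (t - t_g)), 0.0)

      # Synthetic "fraction of 100 samplings showing a crystal" at 18 sampling times
      rng = np.random.default_rng(7)
      t_obs = np.linspace(1.0, 18.0, 18)                          # hours
      p_obs = rng.binomial(100, detection_prob(t_obs, 0.25, 2.0)) / 100.0

      (JV_fit, tg_fit), _ = curve_fit(detection_prob, t_obs, p_obs,
                                      p0=[0.1, 1.0], bounds=([0.0, 0.0], [np.inf, np.inf]))
      V = 1e-4   # cm^3, assumed solution volume per sampling
      print(f"J ~ {JV_fit / V:.1f} crystals cm^-3 h^-1, growth time t_g ~ {tg_fit:.2f} h")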

  14. Supernova Cosmology Inference with Probabilistic Photometric Redshifts (SCIPPR)

    NASA Astrophysics Data System (ADS)

    Peters, Christina; Malz, Alex; Hlozek, Renée

    2018-01-01

    The Bayesian Estimation Applied to Multiple Species (BEAMS) framework employs probabilistic supernova type classifications to do photometric SN cosmology. This work extends BEAMS to replace high-confidence spectroscopic redshifts with photometric redshift probability density functions, a capability that will be essential in the era of the Large Synoptic Survey Telescope and other next-generation photometric surveys, where it will not be possible to perform spectroscopic follow-up on every SN. We present the Supernova Cosmology Inference with Probabilistic Photometric Redshifts (SCIPPR) Bayesian hierarchical model for constraining the cosmological parameters from photometric lightcurves and host galaxy photometry, which includes selection effects and is extensible to uncertainty in the redshift-dependent supernova type proportions. We create a pair of realistic mock catalogs of joint posteriors over supernova type, redshift, and distance modulus informed by photometric supernova lightcurves and over redshift from simulated host galaxy photometry. We perform inference under our model to obtain a joint posterior probability distribution over the cosmological parameters and compare our results with other methods, namely: a spectroscopic subset, a subset of high-probability photometrically classified supernovae, and reducing the photometric redshift probability to a single measurement and error bar.

  15. Geospatial risk assessment and trace element concentration in reef associated sediments, northern part of Gulf of Mannar biosphere reserve, Southeast Coast of India.

    PubMed

    Krishnakumar, S; Ramasamy, S; Simon Peter, T; Godson, Prince S; Chandrasekar, N; Magesh, N S

    2017-12-15

    Fifty-two surface sediments were collected from the northern part of the Gulf of Mannar biosphere reserve to assess the geospatial risk of the sediments. We found that the distributions of organic matter and CaCO3 were locally controlled by mangrove litter and fragmented coral debris. In addition, the Fe and Mn concentrations in the marine sediments were probably supplied through riverine input and natural processes. The geo-accumulation indices of the elements fall in the uncontaminated category, except for Pb. Lead shows a wide range of contamination, from uncontaminated-to-moderately contaminated to extremely contaminated. The sediment toxicity levels of the elements revealed that the majority of the sediments fall into the moderately to highly polluted category (23.07-28.84%). The grades of potential ecological risk suggest that the predominant sediments fall into the low to moderate risk categories (55.7-32.7%). The accumulation levels of trace elements clearly suggest that the coral reef ecosystem is under low to moderate risk. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Novel Data on the Ecology of Cochranella mache (Anura: Centrolenidae) and the Importance of Protected Areas for This Critically Endangered Glassfrog in the Neotropics

    PubMed Central

    Ortega-Andrade, H. Mauricio; Rojas-Soto, Octavio; Paucar, Christian

    2013-01-01

    We studied a population of the endangered glassfrog, Cochranella mache, at Bilsa Biological Station, northwestern Ecuador, from 2008 to 2009. We present information on annual abundance patterns, behavioral ecology, habitat use and a species distribution model built with MaxEnt. We evaluate the importance of the National System of Protected Areas (SNAP) in Colombia and Ecuador under scenarios of climate change and habitat loss. We predicted a restricted environmental suitability area of 48,509 km2 to 65,147 km2 along western Ecuador and adjacent Colombia; ∼8% of the potential distribution occurs within the SNAP. We examined four aspects of C. mache ecology: (1) ecological data suggest a strong correlation between relative abundance and rainfall, with a high probability of observing frogs during the rainy months (February–May); (2) habitat use and the species distribution model suggest that this canopy dweller is restricted to small streams and rivulets in primary and old secondary forest in the evergreen lowland and piedmont forest of western Ecuador, with predicted suitability areas in adjacent southern Colombia; (3) the SNAP of Colombia and Ecuador harbors a minimal portion of the predicted distribution (<10%); and (4) the synergistic effects of habitat loss and climate change reduce the suitable areas for this endangered frog within protected areas across its distributional range by about 95%. The resulting model allows the recognition of areas in which to undertake conservation efforts and plan future field surveys, as well as forecasting regions with a high probability of C. mache occurrence in western Ecuador and southern Colombia. Further research is required to assess population trends and habitat fragmentation, and to target survey zones to accelerate the discovery of unknown populations in unexplored areas with high predicted suitability. We recommend that Cochranella mache be re-categorized as a "Critically Endangered" species in national and global assessments, according to criteria and sub-criteria A4, B1ab(i,ii,iii,iv), E. PMID:24339973

  17. A Statistical Study of the Mass Distribution of Neutron Stars

    NASA Astrophysics Data System (ADS)

    Cheng, Zheng; Zhang, Cheng-Min; Zhao, Yong-Heng; Wang, De-Hua; Pan, Yuan-Yue; Lei, Ya-Juan

    2014-07-01

    By reviewing the methods of mass measurement of neutron stars in four different kinds of systems, i.e., high-mass X-ray binaries (HMXBs), low-mass X-ray binaries (LMXBs), double neutron star systems (DNSs) and neutron star-white dwarf (NS-WD) binary systems, we have collected the orbital parameters of 40 systems. By using the bootstrap method and the Monte Carlo method, we have rebuilt the likelihood probability curves of the measured masses of 46 neutron stars. The statistical analysis of the simulation results shows that the masses of neutron stars in the X-ray neutron star systems and those in the radio pulsar systems exhibit different distributions. Besides, the Bayesian statistics of these four kinds of systems yields most-probable probability density distributions of (1.340 ± 0.230) M☉, (1.505 ± 0.125) M☉, (1.335 ± 0.055) M☉ and (1.495 ± 0.225) M☉, respectively. It is noteworthy that the masses of neutron stars in the HMXB and DNS systems are smaller than those in the other two kinds of systems by approximately 0.16 M☉. This result is consistent with the theoretical model in which the pulsar is accelerated to the millisecond order of magnitude via accretion of approximately 0.2 M☉. If the HMXBs and LMXBs are respectively taken to be the precursors of the DNS and NS-WD systems, then the influence of the accretion effect on the masses of neutron stars in the HMXB systems should be exceedingly small; their mass distributions should be very close to the initial distribution at the formation of the neutron stars. As for the LMXB and NS-WD systems, they should have already undergone sufficient accretion, hence the rather large deviation from the initial mass distribution.

  18. An estimation method of the direct benefit of a waterlogging control project applicable to the changing environment

    NASA Astrophysics Data System (ADS)

    Zengmei, L.; Guanghua, Q.; Zishen, C.

    2015-05-01

    The direct benefit of a waterlogging control project is reflected in the reduction or avoidance of waterlogging losses. Before and after the construction of a waterlogging control project, the disaster-inducing environment in the waterlogging-prone zone is generally different, and the category, quantity and spatial distribution of the disaster-bearing bodies also change to some extent. Therefore, under a changing environment, the direct benefit of a waterlogging control project should be the reduction of waterlogging losses compared with conditions without the project. Moreover, the waterlogging losses with and without the project should be the mathematical expectations of the waterlogging losses over all cases in which rainstorms of all frequencies meet the various water levels in the drainage-accepting zone. An estimation model of the direct benefit of waterlogging control is therefore proposed. Firstly, on the basis of a copula function, the joint distribution of the rainstorms and the water levels is established, so as to obtain their joint probability density function. Secondly, according to this two-dimensional joint probability density, the domain of integration is determined and divided into small domains; for each small domain, the probability and the difference between the average waterlogging losses with and without the control project (the regional benefit of the project, conditional on rainstorms in the waterlogging-prone zone meeting the water level in the drainage-accepting zone) are calculated. Finally, the weighted mean of the project benefit over all small domains, with probability as the weight, gives the benefit of the waterlogging control project. Taking the benefit estimation of a waterlogging control project in Yangshan County, Guangdong Province, as an example, the paper briefly explains the procedure of waterlogging control benefit estimation. The results show that the constructed benefit estimation model is applicable to changing conditions in both the disaster-inducing environment of the waterlogging-prone zone and the disaster-bearing bodies, considering all cases in which rainstorms of all frequencies meet different water levels in the drainage-accepting zone. Thus, the estimation method can reflect the actual situation more objectively and offer a scientific basis for rational decision-making on waterlogging control projects.
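
    The structure of the calculation can be sketched as follows: marginal distributions for rainstorm depth and receiving-water level are coupled with a copula, the probability space is divided into small domains, and the probability-weighted mean of the loss reduction is taken over the grid. The Gaussian copula, the marginals and the toy loss-reduction surface below are assumptions for illustration, not the paper's calibrated model.

      import numpy as np
      from scipy import stats

      # Marginals (assumed): rainstorm depth (mm) ~ Gumbel, receiving water level (m) ~ Normal
      rain_m = stats.gumbel_r(loc=80, scale=25)
      level_m = stats.norm(loc=2.0, scale=0.5)
      rho = 0.6                                   # assumed rain-level dependence

      # Discretize probability space into small domains
      u = np.linspace(0.005, 0.995, 100)
      U, V = np.meshgrid(u, u)
      z1, z2 = stats.norm.ppf(U), stats.norm.ppf(V)

      # Gaussian copula density c(u, v)
      cop = np.exp((2 * rho * z1 * z2 - rho**2 * (z1**2 + z2**2))
                   / (2 * (1 - rho**2))) / np.sqrt(1 - rho**2)
      du = u[1] - u[0]
      weights = cop * du * du                     # probability of each small domain
      weights /= weights.sum()                    # renormalize over the truncated grid

      rain, level = rain_m.ppf(U), level_m.ppf(V)

      def loss_reduction(rain, level):
          # Toy "loss without project minus loss with project" surface (million CNY)
          return 0.02 * np.maximum(rain - 60, 0) * np.maximum(level - 1.5, 0)

      benefit = np.sum(weights * loss_reduction(rain, level))
      print(f"expected direct benefit ~ {benefit:.2f} million CNY")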

  19. A consistent NPMLE of the joint distribution function with competing risks data under the dependent masking and right-censoring model.

    PubMed

    Li, Jiahui; Yu, Qiqing

    2016-01-01

    Dinse (Biometrics, 38:417-431, 1982) provides a special type of right-censored and masked competing risks data and proposes a non-parametric maximum likelihood estimator (NPMLE) and a pseudo MLE of the joint distribution function [Formula: see text] with such data. However, their asymptotic properties have not been studied so far. Under the extention of either the conditional masking probability (CMP) model or the random partition masking (RPM) model (Yu and Li, J Nonparametr Stat 24:753-764, 2012), we show that (1) Dinse's estimators are consistent if [Formula: see text] takes on finitely many values and each point in the support set of [Formula: see text] can be observed; (2) if the failure time is continuous, the NPMLE is not uniquely determined, and the standard approach (which puts weights only on one element in each observed set) leads to an inconsistent NPMLE; (3) in general, Dinse's estimators are not consistent even under the discrete assumption; (4) we construct a consistent NPMLE. The consistency is given under a new model called dependent masking and right-censoring model. The CMP model and the RPM model are indeed special cases of the new model. We compare our estimator to Dinse's estimators through simulation and real data. Simulation study indicates that the consistent NPMLE is a good approximation to the underlying distribution for moderate sample sizes.

  20. Fragment size distribution in viscous bag breakup of a drop

    NASA Astrophysics Data System (ADS)

    Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.

    2015-11-01

    In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of Weber number, We, and Ohnesorge number, Oh, on the disintegration process. The regime of breakup considered is observed between 12 <= We <= 16 for Oh <= 0.1. Experiments are conducted using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh, we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.

  1. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks accruing from the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. A Monte Carlo technique is first used to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission; the factors affecting those consequences are then identified together with their probability distributions. The functional relationship among all the factors is established, and the probability distributions of all factor effects are combined by means of a Monte Carlo technique.

  2. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    NASA Astrophysics Data System (ADS)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments, and it was shown that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur for finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pairs scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignment differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  3. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
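
    A minimal sketch of the zero-inflated binomial mixture likelihood described above, assuming a simple two-point mixture for the detection probability p; the simulated data and all parameter values are illustrative, not taken from the article.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import binom

      rng = np.random.default_rng(0)
      n_sites, J = 200, 5                                  # sites and repeat visits
      psi_true, p_true = 0.6, np.array([0.2, 0.7])         # occupancy and two detection levels
      occupied = rng.random(n_sites) < psi_true
      p_site = rng.choice(p_true, size=n_sites)            # heterogeneous detection among sites
      y = np.where(occupied, rng.binomial(J, p_site), 0)   # detection counts per site

      def neg_log_lik(theta):
          # theta = (logit psi, logit p1, logit p2, logit w), w = mixture weight for p1
          psi, p1, p2, w = 1.0 / (1.0 + np.exp(-np.asarray(theta)))
          mixed = w * binom.pmf(y, J, p1) + (1.0 - w) * binom.pmf(y, J, p2)
          lik = psi * mixed + (1.0 - psi) * (y == 0)       # zero-inflated binomial mixture
          return -np.sum(np.log(lik + 1e-300))

      fit = minimize(neg_log_lik, x0=[0.0, -1.0, 1.0, 0.0], method="Nelder-Mead")
      psi_hat = 1.0 / (1.0 + np.exp(-fit.x[0]))
      print(f"estimated occupancy probability psi: {psi_hat:.3f} (true {psi_true})")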

  4. A Bayesian Approach for Sensor Optimisation in Impact Identification

    PubMed Central

    Mallardo, Vincenzo; Sharif Khodaei, Zahra; Aliabadi, Ferri M. H.

    2016-01-01

    This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination for locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence. PMID:28774064

  5. Dynamic phase transitions of the Blume-Emery-Griffiths model under an oscillating external magnetic field by the path probability method

    NASA Astrophysics Data System (ADS)

    Ertaş, Mehmet; Keskin, Mustafa

    2015-03-01

    By using the path probability method (PPM) with point distribution, we study the dynamic phase transitions (DPTs) in the Blume-Emery-Griffiths (BEG) model under an oscillating external magnetic field. The phases in the model are obtained by solving the dynamic equations for the average order parameters, and a disordered phase, an ordered phase and four mixed phases are found. We also investigate the thermal behavior of the dynamic order parameters to analyze the nature of the dynamic transitions as well as to obtain the DPT temperatures. The dynamic phase diagrams are presented in three different planes, which exhibit the dynamic tricritical point, double critical end point, critical end point, quadrupole point, triple point as well as reentrant behavior, strongly depending on the values of the system parameters. We compare and discuss these dynamic phase diagrams with those obtained within Glauber-type stochastic dynamics based on the mean-field theory.

  6. A Bayesian-frequentist two-stage single-arm phase II clinical trial design.

    PubMed

    Dong, Gaohong; Shih, Weichung Joe; Moore, Dirk; Quan, Hui; Marcella, Stephen

    2012-08-30

    It is well-known that both frequentist and Bayesian clinical trial designs have their own advantages and disadvantages. To obtain better properties inherited from these two types of designs, we developed a Bayesian-frequentist two-stage single-arm phase II clinical trial design. This design allows both early acceptance and early rejection of the null hypothesis (H0). Measures of the design properties (for example, the probability of early trial termination and the expected sample size) are derived under both frequentist and Bayesian settings. Moreover, under the Bayesian setting, the upper and lower boundaries are determined with the predictive probability of a successful trial outcome. Given a beta prior and a sample size for stage I, based on the marginal distribution of the responses at stage I, we derived Bayesian Type I and Type II error rates. By controlling both frequentist and Bayesian error rates, the Bayesian-frequentist two-stage design has special features compared with other two-stage designs. Copyright © 2012 John Wiley & Sons, Ltd.

  7. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimated multivariate probability using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints as used in multiple-point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
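
    A minimal sketch of the iterative proportional fitting step, assuming a small 3x3 probability table and illustrative target marginals; the geostatistical context (drill-hole proportions, higher-order constraints, sparse-matrix bookkeeping) is omitted.

      import numpy as np

      def ipf(initial, row_target, col_target, tol=1e-10, max_iter=1000):
          """Rescale `initial` until its row/column sums match the target marginals."""
          p = initial.astype(float).copy()
          for _ in range(max_iter):
              p *= (row_target / p.sum(axis=1))[:, None]   # match row marginals
              p *= (col_target / p.sum(axis=0))[None, :]   # match column marginals
              if (np.abs(p.sum(axis=1) - row_target).max() < tol
                      and np.abs(p.sum(axis=0) - col_target).max() < tol):
                  break
          return p

      initial = np.full((3, 3), 1.0 / 9.0)            # uninformative starting table
      row_target = np.array([0.5, 0.3, 0.2])          # e.g. category proportions along wells
      col_target = np.array([0.4, 0.4, 0.2])

      fitted = ipf(initial, row_target, col_target)
      print(fitted)
      print(fitted.sum(axis=1), fitted.sum(axis=0))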

  8. A stochastic model for the probability of malaria extinction by mass drug administration.

    PubMed

    Pemberton-Ross, Peter; Chitnis, Nakul; Pothin, Emilie; Smith, Thomas A

    2017-09-18

    Mass drug administration (MDA) has been proposed as an intervention to achieve local extinction of malaria. Although its effect on the reproduction number is short lived, extinction may subsequently occur in a small population due to stochastic fluctuations. This paper examines how the probability of stochastic extinction depends on population size, MDA coverage and the reproduction number under control, Rc. A simple compartmental model is developed and used to compute the probability of extinction using probability generating functions. The expected time to extinction in small populations after MDA is calculated analytically for various scenarios in this model. The results indicate that MDA can achieve local elimination only under restrictive conditions. Firstly, Rc must be sustained at Rc < 1.2 to avoid the rapid re-establishment of infections in the population. Secondly, the MDA must produce effective cure rates of >95% to have a non-negligible probability of successful elimination. Stochastic fluctuations only significantly affect the probability of extinction in populations of about 1000 individuals or less. The expected time to extinction via stochastic fluctuation is less than 10 years only in populations of less than about 150 individuals. Clustering of secondary infections and of MDA distribution both contribute positively to the potential probability of success, indicating that MDA would most effectively be administered at the household level. There are very limited circumstances in which MDA will lead to local malaria elimination with a substantial probability.
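
    A Monte Carlo sketch of stochastic fade-out after MDA, assuming a simple SIS-type model simulated with the Gillespie algorithm; the compartmental structure, rates, coverage and time horizon are illustrative stand-ins for the paper's model and its probability-generating-function calculations.

      import numpy as np

      rng = np.random.default_rng(2)

      def extinction_probability(N=1000, Rc=1.1, gamma=1.0 / 200.0, coverage=0.8,
                                 init_prev=0.2, horizon=3650.0, runs=300):
          """Fraction of runs in which infection dies out within `horizon` days."""
          beta = Rc * gamma
          extinct = 0
          for _ in range(runs):
              # Infections left immediately after a mass drug administration round.
              I = rng.binomial(int(init_prev * N), 1.0 - coverage)
              t = 0.0
              while 0 < I and t < horizon:
                  rate_inf = beta * I * (N - I) / N     # new infections
                  rate_rec = gamma * I                  # recoveries
                  total = rate_inf + rate_rec
                  t += rng.exponential(1.0 / total)
                  I += 1 if rng.random() < rate_inf / total else -1
              extinct += (I == 0)
          return extinct / runs

      for N in (150, 1000):
          print(f"population {N}: extinction probability ~ {extinction_probability(N=N):.2f}")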

  9. Seasonal climate variation and caribou availability: Modeling sequential movement using satellite-relocation data

    USGS Publications Warehouse

    Nicolson, Craig; Berman, Matthew; West, Colin Thor; Kofinas, Gary P.; Griffith, Brad; Russell, Don; Dugan, Darcy

    2013-01-01

    Livelihood systems that depend on mobile resources must constantly adapt to change. For people living in permanent settlements, environmental changes that affect the distribution of a migratory species may reduce the availability of a primary food source, with the potential to destabilize the regional social-ecological system. Food security for Arctic indigenous peoples harvesting barren ground caribou (Rangifer tarandus granti) depends on movement patterns of migratory herds. Quantitative assessments of physical, ecological, and social effects on caribou distribution have proven difficult because of the significant interannual variability in seasonal caribou movement patterns. We developed and evaluated a modeling approach for simulating the distribution of a migratory herd throughout its annual cycle over a multiyear period. Beginning with spatial and temporal scales developed in previous studies of the Porcupine Caribou Herd of Canada and Alaska, we used satellite collar locations to compute and analyze season-by-season probabilities of movement of animals between habitat zones under two alternative weather conditions for each season. We then built a set of transition matrices from these movement probabilities, and simulated the sequence of movements across the landscape as a Markov process driven by externally imposed seasonal weather states. Statistical tests showed that the predicted distributions of caribou were consistent with observed distributions, and significantly correlated with subsistence harvest levels for three user communities. Our approach could be applied to other caribou herds and could be adapted for simulating the distribution of other ungulates and species with similarly large interannual variability in the use of their range.
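
    A minimal sketch of the transition-matrix simulation described above, assuming three illustrative habitat zones, made-up (season, weather) transition matrices and an externally imposed weather sequence; the real model uses zones and matrices estimated from satellite collar locations.

      import numpy as np

      rng = np.random.default_rng(3)
      zones = ["north", "central", "south"]

      # One transition matrix P[from_zone, to_zone] per (season, weather) state.
      matrices = {
          ("summer", "warm"): np.array([[0.7, 0.2, 0.1],
                                        [0.3, 0.5, 0.2],
                                        [0.1, 0.3, 0.6]]),
          ("summer", "cool"): np.array([[0.5, 0.3, 0.2],
                                        [0.2, 0.6, 0.2],
                                        [0.1, 0.2, 0.7]]),
          ("winter", "mild"): np.array([[0.4, 0.4, 0.2],
                                        [0.1, 0.6, 0.3],
                                        [0.1, 0.2, 0.7]]),
          ("winter", "severe"): np.array([[0.2, 0.3, 0.5],
                                          [0.1, 0.4, 0.5],
                                          [0.0, 0.1, 0.9]]),
      }

      def simulate(season_weather_sequence, start_zone=0):
          """Sample a zone sequence driven by an externally imposed (season, weather) sequence."""
          path = [start_zone]
          for key in season_weather_sequence:
              path.append(rng.choice(len(zones), p=matrices[key][path[-1]]))
          return [zones[i] for i in path]

      sequence = [("summer", "warm"), ("winter", "severe"), ("summer", "cool"), ("winter", "mild")]
      print(simulate(sequence))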

  10. Probability of success for phase III after exploratory biomarker analysis in phase II.

    PubMed

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
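
    A minimal sketch of the probability-of-success (average power) calculation, assuming a two-arm phase III with a normally distributed endpoint, a one-sided test, and a normal distribution for the treatment effect derived from a phase II estimate; all numbers are illustrative.

      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      alpha = 0.025                         # one-sided significance level for phase III
      n_per_arm = 200                       # planned phase III sample size per arm
      sigma = 1.0                           # assumed known standard deviation of the endpoint
      delta_hat, se_phase2 = 0.25, 0.12     # phase II effect estimate and its standard error

      z_alpha = stats.norm.ppf(1.0 - alpha)
      se_phase3 = sigma * np.sqrt(2.0 / n_per_arm)

      def power(delta):
          """Power of the phase III test at a fixed true effect `delta`."""
          return stats.norm.cdf(delta / se_phase3 - z_alpha)

      def integrand(delta):
          # Weight the power curve by the phase II distribution of the treatment effect.
          return power(delta) * stats.norm.pdf(delta, loc=delta_hat, scale=se_phase2)

      pos, _ = quad(integrand, delta_hat - 8.0 * se_phase2, delta_hat + 8.0 * se_phase2)
      print(f"power at the point estimate : {power(delta_hat):.3f}")
      print(f"probability of success (PoS): {pos:.3f}")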

  11. General formulation of long-range degree correlations in complex networks

    NASA Astrophysics Data System (ADS)

    Fujiki, Yuka; Takaguchi, Taro; Yakubo, Kousuke

    2018-06-01

    We provide a general framework for analyzing degree correlations between nodes separated by more than one step (i.e., beyond nearest neighbors) in complex networks. One joint and four conditional probability distributions are introduced to fully describe long-range degree correlations with respect to degrees k and k' of two nodes and shortest path length l between them. We present general relations among these probability distributions and clarify the relevance to nearest-neighbor degree correlations. Unlike nearest-neighbor correlations, some of these probability distributions are meaningful only in finite-size networks. Furthermore, as a baseline to determine the existence of intrinsic long-range degree correlations in a network other than inevitable correlations caused by the finite-size effect, the functional forms of these probability distributions for random networks are analytically evaluated within a mean-field approximation. The utility of our argument is demonstrated by applying it to real-world networks.

  12. Stochastic analysis of particle movement over a dune bed

    USGS Publications Warehouse

    Lee, Baum K.; Jobson, Harvey E.

    1977-01-01

    Stochastic models are available that can be used to predict the transport and dispersion of bed-material sediment particles in an alluvial channel. These models are based on the proposition that the movement of a single bed-material sediment particle consists of a series of steps of random length separated by rest periods of random duration and, therefore, application of the models requires a knowledge of the probability distributions of the step lengths, the rest periods, the elevation of particle deposition, and the elevation of particle erosion. The procedure was tested by determining distributions from bed profiles formed in a large laboratory flume with a coarse sand as the bed material. The elevation of particle deposition and the elevation of particle erosion can be considered to be identically distributed, and their distribution can be described by either a 'truncated Gaussian' or a 'triangular' density function. The conditional probability distribution of the rest period given the elevation of particle deposition closely followed the two-parameter gamma distribution. The conditional probability distribution of the step length given the elevation of particle erosion and the elevation of particle deposition also closely followed the two-parameter gamma density function. For a given flow, the scale and shape parameters describing the gamma probability distributions can be expressed as functions of bed elevation. (Woodard-USGS)
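
    A minimal sketch of the step-and-rest particle model, assuming gamma-distributed rest periods and step lengths with illustrative shape and scale parameters rather than the flume-derived, elevation-dependent values.

      import numpy as np

      rng = np.random.default_rng(4)

      def travel_distance(total_time, rest_shape=0.8, rest_scale=300.0,
                          step_shape=1.5, step_scale=0.1):
          """Downstream distance (m) covered in `total_time` seconds of alternating rests and steps."""
          t, x = 0.0, 0.0
          while True:
              t += rng.gamma(rest_shape, rest_scale)      # rest period (s)
              if t >= total_time:
                  return x
              x += rng.gamma(step_shape, step_scale)      # step length (m)

      distances = np.array([travel_distance(24.0 * 3600.0) for _ in range(2000)])
      print(f"mean transport distance after one day: {distances.mean():.1f} m "
            f"(SD {distances.std():.1f} m)")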

  13. Average BER of subcarrier intensity modulated free space optical systems over the exponentiated Weibull fading channels.

    PubMed

    Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang

    2014-08-25

    The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over turbulence atmosphere modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BERs for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of EW distribution are compared with Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence atmosphere, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results deduced by the generalized Gauss-Laguerre quadrature rule are verified by the Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems.

  14. Photocounting distributions for exponentially decaying sources.

    PubMed

    Teich, M C; Card, H C

    1979-05-01

    Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n >= 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model.
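
    A numerical sketch of the mixed Poisson photocount distribution, averaging Poisson counts over a uniformly distributed start time of the counting window for an exponentially decaying intensity; the parameters are illustrative and the paper's closed-form incomplete-gamma and exponential-integral expressions are not reproduced.

      import numpy as np
      from scipy.integrate import quad
      from scipy.stats import poisson

      I0 = 50.0     # initial intensity (counts per unit time)
      tau = 1.0     # decay time constant of the light pulse
      T = 0.3       # duration of the counting window
      T0 = 5.0      # the window start time is uniform on [0, T0]

      def integrated_intensity(t0):
          """Expected number of counts in the window [t0, t0 + T]."""
          return I0 * tau * np.exp(-t0 / tau) * (1.0 - np.exp(-T / tau))

      def pmf(n):
          # Average the Poisson probability over the uniformly distributed start time.
          integrand = lambda t0: poisson.pmf(n, integrated_intensity(t0)) / T0
          value, _ = quad(integrand, 0.0, T0)
          return value

      probs = np.array([pmf(n) for n in range(40)])
      mean_count = np.sum(np.arange(40) * probs)
      print(f"P(0) = {probs[0]:.4f}, total probability (n < 40) = {probs.sum():.4f}, "
            f"mean count = {mean_count:.2f}")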

  15. Using type IV Pearson distribution to calculate the probabilities of underrun and overrun of lists of multiple cases.

    PubMed

    Wang, Jihan; Yang, Kai

    2014-07-01

    An efficient operating room needs both little underutilised and little overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated from a well-defined duration distribution of the lists. To propose a method of predicting the probabilities of underrun and overrun of lists of cases using the Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters. The data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by the Type IV Pearson distribution and the t-distribution. Monte Carlo simulation was conducted to verify the accuracy of percentiles defined by the proposed methods. Operating rooms in John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. Differences between the proportion of lists of cases that were completed within the percentiles of the proposed duration distribution of the lists and the corresponding percentiles. Compared with the t-distribution, the proposed new distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which is a major concern of operating room schedulers. The absolute deviations between the percentiles defined by the Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20 min (0.01) to 0.43 min (0.03). Values are mean (SEM). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. The proposed Type IV Pearson distribution is more accurate than the t-distribution for estimating the probabilities of underrun and overrun of lists of cases. However, as not all the individual case durations followed log-normal distributions, there was some deviation from the true duration distribution of the lists.

  16. Maximum entropy principal for transportation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bilich, F.; Da Silva, R.

    In this work we deal with modeling of the transportation phenomenon for use in the transportation planning process and policy-impact studies. The model developed is based on the dependence concept, i.e., the notion that the probability of a trip starting at origin i is dependent on the probability of a trip ending at destination j given that the factors (such as travel time, cost, etc.) which affect travel between origin i and destination j assume some specific values. The derivation of the solution of the model employs the maximum entropy principle combining an a priori multinomial distribution with a trip utility concept. This model is utilized to forecast trip distributions under a variety of policy changes and scenarios. The dependence coefficients are obtained from a regression equation where the functional form is derived based on conditional probability and the perception of factors from experimental psychology. The dependence coefficients encode all the information that was previously encoded in the form of constraints. In addition, the dependence coefficients encode information that cannot be expressed in the form of constraints for practical reasons, namely, computational tractability. The equivalence between the standard formulation (i.e., objective function with constraints) and the dependence formulation (i.e., without constraints) is demonstrated. The parameters of the dependence-based trip-distribution model are estimated, and the model is also validated using commercial air travel data in the U.S. In addition, policy impact analyses (such as allowance of supersonic flights inside the U.S. and user surcharge at noise-impacted airports) on air travel are performed.

  17. Intra-Urban Human Mobility and Activity Transition: Evidence from Social Media Check-In Data

    PubMed Central

    Wu, Lun; Zhi, Ye; Sui, Zhengwei; Liu, Yu

    2014-01-01

    Most existing human mobility literature focuses on exterior characteristics of movements but neglects activities, the driving force that underlies human movements. In this research, we combine activity-based analysis with a movement-based approach to model the intra-urban human mobility observed from about 15 million check-in records during a yearlong period in Shanghai, China. The proposed model is activity-based and includes two parts: the transition of travel demands during a specific time period and the movement between locations. For the first part, we find the transition probability between activities varies over time, and then we construct a temporal transition probability matrix to represent the transition probability of travel demands during a time interval. For the second part, we suggest that the travel demands can be divided into two classes, locationally mandatory activity (LMA) and locationally stochastic activity (LSA), according to whether the demand is associated with fixed location or not. By judging the combination of predecessor activity type and successor activity type we determine three trip patterns, each associated with a different decay parameter. To validate the model, we adopt the mechanism of an agent-based model and compare the simulated results with the observed pattern from the displacement distance distribution, the spatio-temporal distribution of activities, and the temporal distribution of travel demand transitions. The results show that the simulated patterns fit the observed data well, indicating that these findings open new directions for combining activity-based analysis with a movement-based approach using social media check-in data. PMID:24824892

  18. A rapid local singularity analysis algorithm with applications

    NASA Astrophysics Data System (ADS)

    Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits

    2015-04-01

    The local singularity model developed by Cheng is fast gaining popularity in characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, one of the conventional algorithms, which involves computing moving-average values at different scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast algorithm used within the Viola-Jones object detection framework in computer vision. Historically, the principle of the SAT is well known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (the area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variation, the rotated summed area table, into isotropic, anisotropic and directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only four array accesses, independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, Matlab and C++ are implemented to serve different applications, especially big data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear or logarithmic spacing) for the non-iterative and iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
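
    A minimal sketch of the summed area table and its constant-time rectangle-sum query; the rotated variant and the singularity-index calculation itself are omitted, and the test image is random.

      import numpy as np

      def summed_area_table(img):
          """Cumulative sums along both axes, padded with a leading row/column of zeros."""
          sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
          sat[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
          return sat

      def block_sum(sat, r0, c0, r1, c1):
          """Sum of img[r0:r1, c0:c1] from four table accesses (O(1) per query)."""
          return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]

      rng = np.random.default_rng(5)
      img = rng.random((500, 500))
      sat = summed_area_table(img)

      # Any window sum (e.g. a moving average at an arbitrary scale) is now constant time.
      print(np.isclose(block_sum(sat, 10, 20, 110, 220), img[10:110, 20:220].sum()))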

  19. Using Atmospheric Circulation Patterns to Detect and Attribute Changes in the Risk of Extreme Climate Events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.; Horton, D. E.; Singh, D.; Swain, D. L.; Touma, D. E.; Mankin, J. S.

    2015-12-01

    Because of the high cost of extreme events and the growing evidence that global warming is likely to alter the statistical distribution of climate variables, detection and attribution of changes in the probability of extreme climate events has become a pressing topic for the scientific community, elected officials, and the public. While most of the emphasis has thus far focused on analyzing the climate variable of interest (most often temperature or precipitation, but also flooding and drought), there is an emerging emphasis on applying detection and attribution analysis techniques to the underlying physical causes of individual extreme events. This approach is promising in part because the underlying physical causes (such as atmospheric circulation patterns) can in some cases be more accurately represented in climate models than the more proximal climate variable (such as precipitation). In addition, and more scientifically critical, is the fact that the most extreme events result from a rare combination of interacting causes, often referred to as "ingredients". Rare events will therefore always have a strong influence of "natural" variability. Analyzing the underlying physical mechanisms can therefore help to test whether there have been changes in the probability of the constituent conditions of an individual event, or whether the co-occurrence of causal conditions cannot be distinguished from random chance. This presentation will review approaches to applying detection/attribution analysis to the underlying physical causes of extreme events (including both "thermodynamic" and "dynamic" causes), and provide a number of case studies, including the role of frequency of atmospheric circulation patterns in the probability of hot, cold, wet and dry events.

  20. A tool for simulating collision probabilities of animals with marine renewable energy devices.

    PubMed

    Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise

    2017-01-01

    The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal as well as of the device must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained, and numerical details as well as potential issues and limitations in the application of the resulting probability distributions are highlighted.

  1. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
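
    A Monte Carlo sketch of a fault tree top-event uncertainty distribution with lognormal basic events, followed by a lognormal fit to the sampled top-event probabilities; the tree structure, medians and error factors are illustrative, and the article's closed-form approximation is not reproduced.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 100_000

      def lognormal_probabilities(median, error_factor, size):
          """Lognormal basic-event probabilities with the error factor at the 95th percentile."""
          sigma = np.log(error_factor) / 1.645
          return rng.lognormal(mean=np.log(median), sigma=sigma, size=size)

      p1 = lognormal_probabilities(1e-3, 3.0, n)
      p2 = lognormal_probabilities(5e-3, 3.0, n)
      p3 = lognormal_probabilities(1e-4, 10.0, n)

      # Illustrative top event: (E1 AND E2) OR E3, combined exactly (no rare-event approximation).
      p_top = 1.0 - (1.0 - p1 * p2) * (1.0 - p3)

      print(f"mean top-event probability : {p_top.mean():.3e}")
      print(f"5th/50th/95th percentiles  : {np.percentile(p_top, [5, 50, 95])}")

      # Lognormal fitted to the sampled top-event probabilities, analogous in spirit to the
      # closed-form approximation discussed above.
      mu, sig = np.log(p_top).mean(), np.log(p_top).std()
      print(f"fitted lognormal median {np.exp(mu):.3e}, error factor {np.exp(1.645 * sig):.2f}")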

  2. Quantum key distribution without the wavefunction

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution admits a much more general and abstract treatment than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as the origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.

  3. The complexity of divisibility.

    PubMed

    Bausch, Johannes; Cubitt, Toby

    2016-09-01

    We address two sets of long-standing open questions in linear algebra and probability theory, from a computational complexity perspective: stochastic matrix divisibility, and divisibility and decomposability of probability distributions. We prove that finite divisibility of stochastic matrices is an NP-complete problem, and extend this result to nonnegative matrices, and completely-positive trace-preserving maps, i.e. the quantum analogue of stochastic matrices. We further prove a complexity hierarchy for the divisibility and decomposability of probability distributions, showing that finite distribution divisibility is in P, but decomposability is NP-hard. For the former, we give an explicit polynomial-time algorithm. All results on distributions extend to weak-membership formulations, proving that the complexity of these problems is robust to perturbations.

  4. Probability distributions of hydraulic conductivity for the hydrogeologic units of the Death Valley regional ground-water flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.

    2002-01-01

    The use of geologic information such as lithology and rock properties is important to constrain conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments, volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; however, this study provides more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing and compares the probability distributions to the aquifer test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff. These probability distributions can be used for such studies involving the hydrogeologic units in the region, as well as for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity. As alteration increases, hydraulic conductivity tends to decrease. Increasing degrees of welding appears to increase hydraulic conductivity because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.

  5. Relative performance of selected detectors

    NASA Astrophysics Data System (ADS)

    Ranney, Kenneth I.; Khatri, Hiralal; Nguyen, Lam H.; Sichina, Jeffrey

    2000-08-01

    The quadratic polynomial detector (QPD) and the radial basis function (RBF) family of detectors -- including the Bayesian neural network (BNN) -- might well be considered workhorses within the field of automatic target detection (ATD). The QPD works reasonably well when the data are unimodal, and it also achieves the best possible performance if the underlying data follow a Gaussian distribution. The BNN, on the other hand, has been applied successfully in cases where the underlying data are assumed to follow a multimodal distribution. We compare the performance of a BNN detector and a QPD for various scenarios synthesized from a set of Gaussian probability density functions (pdfs). This data synthesis allows us to control parameters such as modality and correlation, which, in turn, enables us to create data sets that can probe the weaknesses of the detectors. We present results for different data scenarios and different detector architectures.

  6. On buffer overflow duration in a finite-capacity queueing system with multiple vacation policy

    NASA Astrophysics Data System (ADS)

    Kempa, Wojciech M.

    2017-12-01

    A finite-buffer queueing system with Poisson arrivals and generally distributed processing times, operating under a multiple vacation policy, is considered. Each time the system becomes empty, the service station takes successive independent and identically distributed vacation periods until, at the completion epoch of one of them, at least one job waiting for service is detected in the buffer. Applying an analytical approach based on the idea of an embedded Markov chain, integral equations and linear algebra, a compact-form representation for the cumulative distribution function (CDF for short) of the first buffer overflow duration is found. Hence, the formula for the CDF of subsequent such periods is obtained. Moreover, the probability distributions of the number of job losses in successive buffer overflow periods are found. The considered queueing system can be efficiently applied in modelling energy-saving mechanisms in wireless network communication.

  7. Does Litter Size Variation Affect Models of Terrestrial Carnivore Extinction Risk and Management?

    PubMed Central

    Devenish-Nelson, Eleanor S.; Stephens, Philip A.; Harris, Stephen; Soulsbury, Carl; Richards, Shane A.

    2013-01-01

    Background: Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. Methodology/Principal Findings: We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species – the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. Conclusion/Significance: These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes. PMID:23469140

  8. Does litter size variation affect models of terrestrial carnivore extinction risk and management?

    PubMed

    Devenish-Nelson, Eleanor S; Stephens, Philip A; Harris, Stephen; Soulsbury, Carl; Richards, Shane A

    2013-01-01

    Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species - the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes.

  9. Neighbor-Dependent Ramachandran Probability Distributions of Amino Acids Developed from a Hierarchical Dirichlet Process Model

    PubMed Central

    Mitra, Rajib; Jordan, Michael I.; Dunbrack, Roland L.

    2010-01-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: 1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); 2) filtering of suspect conformations and outliers using B-factors or other features; 3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); 4) the method used for determining probability densities ranging from simple histograms to modern nonparametric density estimation; and 5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp. PMID:20442867

  10. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    PubMed

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For the well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
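
    A Monte Carlo sketch of preload scatter, assuming the commonly used torque-preload relation T = F * (p/(2*pi) + mu_t*r_t/cos(beta) + mu_h*r_h) in place of the finite element model; the screw geometry, friction ranges, torque scatter and target preload window below are illustrative assumptions, not inputs of the study.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      # Illustrative geometry of a small abutment screw (metres).
      pitch = 0.4e-3             # thread pitch
      r_t = 0.9e-3               # effective thread contact radius
      r_h = 1.2e-3               # effective under-head contact radius
      beta = np.deg2rad(30.0)    # thread half-angle

      # Random inputs: applied torque and friction coefficients (well-lubricated case).
      torque = rng.normal(0.20, 0.01, n)        # N*m (nominal 20 N*cm) with application scatter
      mu_t = rng.uniform(0.12, 0.26, n)         # thread friction coefficient
      mu_h = rng.uniform(0.12, 0.26, n)         # under-head friction coefficient

      preload = torque / (pitch / (2.0 * np.pi) + mu_t * r_t / np.cos(beta) + mu_h * r_h)

      target = (300.0, 450.0)                   # assumed optimal preload window (N)
      p_target = np.mean((preload > target[0]) & (preload < target[1]))
      print(f"preload mean {preload.mean():.0f} N, SD {preload.std():.0f} N")
      print(f"probability of preload inside the target window: {p_target:.2f}")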

  11. Comparative analysis through probability distributions of a data set

    NASA Astrophysics Data System (ADS)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
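
    A minimal sketch of ranking candidate fits with the tests mentioned above, assuming simulated data and SciPy's built-in test routines; note that p-values are only approximate when the distribution parameters are estimated from the same data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      data = rng.gamma(shape=2.0, scale=3.0, size=1000)    # the data set to be characterized

      candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "norm": stats.norm}

      for name, dist in candidates.items():
          params = dist.fit(data)
          ks_stat, ks_p = stats.kstest(data, name, args=params)
          # Chi-squared over ten equiprobable bins of the fitted distribution.
          edges = dist.ppf(np.linspace(0.0, 1.0, 11), *params)
          edges[0] = min(edges[1], data.min()) - 1.0       # keep the outer edges finite
          edges[-1] = max(edges[-2], data.max()) + 1.0
          observed, _ = np.histogram(data, bins=edges)
          expected = np.full(10, len(data) / 10.0)
          chi2_stat, chi2_p = stats.chisquare(observed, expected)
          print(f"{name:8s} KS = {ks_stat:.3f} (p = {ks_p:.3f}), "
                f"chi2 = {chi2_stat:.1f} (p = {chi2_p:.3f})")

      # SciPy's Anderson-Darling test supports a limited set of distributions, e.g. the normal.
      print(stats.anderson(data, dist="norm"))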

  12. Impact of temporal probability in 4D dose calculation for lung tumors.

    PubMed

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation matrix included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can approximate four-dimensional dose computed using the patient-specific respiratory trace.
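
    A minimal sketch of the dose-accumulation step, assuming ten phase-dose arrays already deformed to the reference geometry; the dose values and the non-uniform weighting are illustrative placeholders for a patient-specific respiratory trace.

      import numpy as np

      rng = np.random.default_rng(9)
      n_phases = 10
      # Phase doses already deformed to the reference (breath-hold) geometry, in Gy.
      phase_dose = rng.uniform(1.8, 2.2, size=(n_phases, 64, 64, 32))

      def accumulate(phase_dose, temporal_probability):
          """Weighted sum of phase doses; weights are the fractions of time per phase."""
          w = np.asarray(temporal_probability, dtype=float)
          w = w / w.sum()
          return np.tensordot(w, phase_dose, axes=1)

      uniform_w = np.full(n_phases, 1.0 / n_phases)
      # Illustrative non-uniform weighting (e.g. more time near end-exhale); a patient-specific
      # weighting would be derived from the recorded respiratory trace instead.
      trace_w = np.array([0.06, 0.07, 0.08, 0.10, 0.14, 0.15, 0.14, 0.10, 0.09, 0.07])

      dose_uniform = accumulate(phase_dose, uniform_w)
      dose_trace = accumulate(phase_dose, trace_w)
      print(f"mean absolute dose difference: {np.abs(dose_uniform - dose_trace).mean():.4f} Gy")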

  13. Drilling High Precision Holes in Ti6Al4V Using Rotary Ultrasonic Machining and Uncertainties Underlying Cutting Force, Tool Wear, and Production Inaccuracies.

    PubMed

    Chowdhury, M A K; Sharif Ullah, A M M; Anwar, Saqib

    2017-09-12

    Ti6Al4V alloys are difficult-to-cut materials that have extensive applications in the automotive and aerospace industry. A great deal of effort has been made to develop and improve the machining operations of Ti6Al4V alloys. This paper presents an experimental study that systematically analyzes the effects of the machining conditions (ultrasonic power, feed rate, spindle speed, and tool diameter) on the performance parameters (cutting force, tool wear, overcut error, and cylindricity error) while drilling high precision holes in a workpiece made of Ti6Al4V alloy using rotary ultrasonic machining (RUM). Numerical results were obtained by conducting experiments following a design of experiments procedure. The effects of the machining conditions on each performance parameter have been determined by constructing a set of possibility distributions (i.e., trapezoidal fuzzy numbers) from the experimental data. A possibility distribution is a probability-distribution-neutral representation of uncertainty, and is effective in quantifying the uncertainty underlying physical quantities when only a limited number of data points is available, which is the case here. Lastly, the optimal machining conditions have been identified using these possibility distributions.

  14. Robust DEA under discrete uncertain data: a case study of Iranian electricity distribution companies

    NASA Astrophysics Data System (ADS)

    Hafezalkotob, Ashkan; Haji-Sami, Elham; Omrani, Hashem

    2015-06-01

    Crisp input and output data are fundamentally indispensable in traditional data envelopment analysis (DEA). However, real-world problems often deal with imprecise or ambiguous data. In this paper, we propose a novel robust data envelopment analysis (RDEA) model to investigate the efficiencies of decision-making units (DMUs) when there are discrete uncertain input and output data. The method is based upon the discrete robust optimization approaches proposed by Mulvey et al. (1995), which utilize probable scenarios to capture the effect of ambiguous data in the case study. Our primary concern in this research is evaluating electricity distribution companies under uncertainty about input/output data. To illustrate the ability of the proposed model, a numerical example of 38 Iranian electricity distribution companies is investigated. There is a large amount of ambiguous data about these companies, and some electricity distribution companies may not report clear and accurate statistics to the government. Thus, a robust approach is needed to deal with this uncertainty. The results reveal that the RDEA model is suitable and reliable for target setting based on decision makers' (DMs') preferences when input/output data are uncertain.

  15. Stochastic theory of fatigue corrosion

    NASA Astrophysics Data System (ADS)

    Hu, Haiyun

    1999-10-01

    A stochastic theory of corrosion has been constructed. The stochastic equations are described, giving the transportation corrosion rate and the fluctuation corrosion coefficient. In addition, the pit diameter distribution function, the average pit diameter and the most probable pit diameter, together with other related empirical formulae, have been derived. In order to clarify the effect of stress range on the initiation and growth behaviour of pitting corrosion, round smooth specimens were tested under cyclic loading in 3.5% NaCl solution.

  16. Probabilistic Analysis of Algorithms for NP-Complete Problems

    DTIC Science & Technology

    1989-09-29

    (The abstract of this DTIC record is garbled in the source; only the report documentation page is legible: report AD-A217 880, approved for public release, distribution unlimited. Recoverable abstract fragment: "... efficiently solves P in bounded probability under D. (b) A finds a solution to an instance of P chosen randomly according to D in time bounded by a ...")

  17. What is epistemic value in free energy models of learning and acting? A bounded rationality perspective.

    PubMed

    Ortega, Pedro A; Braun, Daniel A

    2015-01-01

    Free energy models of learning and acting care not only about utility, or extrinsic value, but also about intrinsic value, that is, the information value stemming from probability distributions that represent beliefs or strategies. While these intrinsic values can be interpreted as epistemic values or exploration bonuses under certain conditions, the framework of bounded rationality offers a complementary interpretation in terms of information-processing costs, which we discuss here.

  18. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    NASA Astrophysics Data System (ADS)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with cluster-size-dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for the model, and the derivation of the distributions is discussed.

  19. Net present value probability distributions from decline curve reserves estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, D.E.; Huffman, C.H.; Thompson, R.S.

    1995-12-31

    This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real-data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean (μ) NPV. The second approach, a Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.

  20. Optimal random search for a single hidden target.

    PubMed

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
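
    A minimal Monte Carlo check of the square-root rule is sketched below for a one-dimensional Gaussian target; the tolerance eps, the sample sizes, and the geometric-trials shortcut are my assumptions for illustration, not details from the paper.

        # Monte Carlo sketch of the square-root rule for a 1-D Gaussian target. The searcher
        # finds the target when a search point lands within +/- eps of it, so the number of
        # trials to the first hit is geometric in the per-trial hit probability.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        sigma, eps, n_targets = 1.0, 0.02, 200_000

        def mean_trials(search_sigma):
            targets = rng.normal(0.0, sigma, n_targets)
            p_hit = (norm.cdf(targets + eps, scale=search_sigma)
                     - norm.cdf(targets - eps, scale=search_sigma))
            return rng.geometric(p_hit).mean()

        # Searching from the target distribution itself vs. from its normalized square root,
        # which for a Gaussian target is another Gaussian with standard deviation sigma*sqrt(2).
        print("search std = sigma        :", mean_trials(sigma))
        print("search std = sigma*sqrt(2):", mean_trials(sigma * np.sqrt(2)))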

  1. Errors in the estimation of the variance: implications for multiple-probability fluctuation analysis.

    PubMed

    Saviane, Chiara; Silver, R Angus

    2006-06-15

    Synapses play a crucial role in information processing in the brain. Amplitude fluctuations of synaptic responses can be used to extract information about the mechanisms underlying synaptic transmission and its modulation. In particular, multiple-probability fluctuation analysis can be used to estimate the number of functional release sites, the mean probability of release and the amplitude of the mean quantal response from fits of the relationship between the variance and mean amplitude of postsynaptic responses, recorded at different probabilities. To determine these quantal parameters, calculate their uncertainties and the goodness-of-fit of the model, it is important to weight the contribution of each data point in the fitting procedure. We therefore investigated the errors associated with measuring the variance by determining the best estimators of the variance of the variance and have used simulations of synaptic transmission to test their accuracy and reliability under different experimental conditions. For central synapses, which generally have a low number of release sites, the amplitude distribution of synaptic responses is not normal, thus the use of a theoretical variance of the variance based on the normal assumption is not a good approximation. However, appropriate estimators can be derived for the population and for limited sample sizes using a more general expression that involves higher moments and introducing unbiased estimators based on the h-statistics. Our results are likely to be relevant for various applications of fluctuation analysis when few channels or release sites are present.
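
    The point about the normal-theory variance of the variance can be illustrated with a short simulation; the toy amplitude model (a single release site plus Gaussian recording noise) and its parameter values are assumptions chosen for illustration, not the authors' simulation setup.

        # Variance of the sample variance for a non-normal amplitude distribution, compared
        # with the normal-theory formula 2*sigma^4/(n-1) and the general moment formula
        # mu4/n - sigma^4*(n-3)/(n*(n-1)).
        import numpy as np

        rng = np.random.default_rng(1)
        n, n_rep = 10, 200_000

        def draw_amplitudes(size):
            # Toy "synaptic response": one release site, release probability 0.3, quantal
            # amplitude 1.0, plus Gaussian recording noise (hypothetical values).
            return 1.0 * rng.binomial(1, 0.3, size) + rng.normal(0.0, 0.2, size)

        samples = draw_amplitudes((n_rep, n))
        s2 = samples.var(axis=1, ddof=1)                      # sample variances of size-n sets
        print("empirical Var(s^2)           :", s2.var(ddof=1))

        big = draw_amplitudes(2_000_000)                      # estimate population moments
        sigma2 = big.var()
        mu4 = np.mean((big - big.mean()) ** 4)
        print("normal-theory 2*sigma^4/(n-1):", 2 * sigma2**2 / (n - 1))
        print("general moment formula       :", mu4 / n - sigma2**2 * (n - 3) / (n * (n - 1)))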

  2. A new model of the lunar ejecta cloud

    NASA Astrophysics Data System (ADS)

    Christou, A. A.

    2014-04-01

    Every airless body in the solar system is surrounded by a cloud of ejecta produced by the impact of interplanetary meteoroids on its surface [1]. Such "dust exospheres" have been observed around the Galilean satellites of Jupiter [2, 3]. The prospect of long-term robotic and human operations on the Moon by the US and other countries has rekindled interest in the subject [4]. This interest has culminated in the recent investigation of the Moon's dust exosphere by the LADEE spacecraft [5]. Here a model is presented of a ballistic, collisionless, steady state population of ejecta launched vertically at randomly distributed times and velocities. Assuming a uniform distribution of launch times, I derive closed-form solutions for the probability density functions (pdfs) of the height distribution of particles and the distribution of their speeds in a rest frame both at the surface and at altitude. The treatment is then extended to particle motion with respect to a moving platform such as an orbiting spacecraft. These expressions are compared with numerical simulations under lunar surface gravity where the underlying ejection speed distribution is (a) uniform and (b) a power law. I discuss the predictions of the model, its limitations, and how it can be validated against near-surface and orbital measurements.

  3. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and for defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slopes probability distributions for applications such as in scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point-of-view with NAC DEMs to characterize the slope statistics for the floors and walls for the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was computed over multiple scales. This slope analysis showed that local slope distributions are non-Gaussian for both crater walls and floors. Over larger baselines (~100 meters), crater wall slope probability distributions do approximate Gaussian distributions better, but have long distribution tails. Crater floor probability distributions, however, were always asymmetric (for the baseline scales analyzed) and less affected by baseline scale variations. Accordingly, our results suggest that use of long tailed probability distributions (like Cauchy) and a baseline-dependent multi-scale model can be more effective in describing the slope statistics for lunar topography. References: [1] Moore, H. (1971), JGR, 75(11); [2] Marcus, A. H. (1969), JGR, 74(22); [3] R. J. Pike (1970), U.S. Geological Survey Working Paper; [4] N. C. Costes, J. E. Farmer and E. B. George (1972), NASA Technical Report TR R-401; [5] M. N. Parker and G. L. Tyler (1973), Radio Science, 8(3), 177-184; [6] Alekseev, V. A. et al. (1968), Soviet Astronomy, Vol. 11, p. 860; [7] Burns et al. (2012), Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B4, 483-488; [8] Smith et al. (2010), GRL 37, L18204, DOI: 10.1029/2010GL043751; [9] Wagner R., Robinson, M., Speyerer E., Mahanti, P., LPSC 2013, #2924.

  4. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.

  5. Probability and the changing shape of response distributions for orientation.

    PubMed

    Anderson, Britt

    2014-11-18

    Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.

  6. Vector wind and vector wind shear models 0 to 27 km altitude for Cape Kennedy, Florida, and Vandenberg AFB, California

    NASA Technical Reports Server (NTRS)

    Smith, O. E.

    1976-01-01

    Techniques are presented to derive several statistical wind models. The techniques are derived from the properties of the multivariate normal probability function. Assuming that the winds can be considered as bivariate normally distributed, then (1) the wind components and conditional wind components are univariate normally distributed, (2) the wind speed is Rayleigh distributed, (3) the conditional distribution of wind speed given a wind direction is Rayleigh distributed, and (4) the frequency of wind direction can be derived. All of these distributions are derived from the five sample parameters of wind for the bivariate normal distribution. By further assuming that the winds at two altitudes are quadravariate normally distributed, the vector wind shear is bivariate normally distributed and the modulus of the vector wind shear is Rayleigh distributed. The conditional probability of wind component shears given a wind component is normally distributed. Examples of these and other properties of the multivariate normal probability distribution function as applied to Cape Kennedy, Florida, and Vandenberg AFB, California, wind data samples are given. A technique to develop a synthetic vector wind profile model of interest to aerospace vehicle applications is presented.
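
    The Rayleigh statement holds, strictly, for zero-mean, uncorrelated components with equal variances; the quick numerical check below uses that simplified case (my assumption, with a hypothetical component standard deviation; the report works with the full five-parameter bivariate model).

        # Numerical check: if the two wind components are bivariate normal with zero means,
        # equal variances and zero correlation, the wind speed follows a Rayleigh distribution.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        sigma = 6.0                                  # m/s, hypothetical component std dev
        u = rng.normal(0.0, sigma, 100_000)          # zonal component
        v = rng.normal(0.0, sigma, 100_000)          # meridional component
        speed = np.hypot(u, v)

        # Kolmogorov-Smirnov test against Rayleigh(scale=sigma)
        ks = stats.kstest(speed, stats.rayleigh(scale=sigma).cdf)
        print("KS statistic:", round(ks.statistic, 4), " p-value:", round(ks.pvalue, 3))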

  7. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  8. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases when the process is subcritical, critical, and supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  9. A quadrature based method of moments for nonlinear Fokker-Planck equations

    NASA Astrophysics Data System (ADS)

    Otten, Dustin L.; Vedula, Prakash

    2011-09-01

    Fokker-Planck equations which are nonlinear with respect to their probability densities and occur in many nonequilibrium systems relevant to mean field interaction models, plasmas, fermions and bosons can be challenging to solve numerically. To address some underlying challenges, we propose the application of the direct quadrature based method of moments (DQMOM) for efficient and accurate determination of transient (and stationary) solutions of nonlinear Fokker-Planck equations (NLFPEs). In DQMOM, probability density (or other distribution) functions are represented using a finite collection of Dirac delta functions, characterized by quadrature weights and locations (or abscissas) that are determined based on constraints due to evolution of generalized moments. Three particular examples of nonlinear Fokker-Planck equations considered in this paper include descriptions of: (i) the Shimizu-Yamada model, (ii) the Desai-Zwanzig model (both of which have been developed as models of muscular contraction) and (iii) fermions and bosons. Results based on DQMOM, for the transient and stationary solutions of the nonlinear Fokker-Planck equations, have been found to be in good agreement with other available analytical and numerical approaches. It is also shown that approximate reconstruction of the underlying probability density function from moments obtained from DQMOM can be satisfactorily achieved using a maximum entropy method.

  10. Survival of Norway spruce remains higher in mixed stands under a dryer and warmer climate.

    PubMed

    Neuner, Susanne; Albrecht, Axel; Cullmann, Dominik; Engels, Friedrich; Griess, Verena C; Hahn, W Andreas; Hanewinkel, Marc; Härtl, Fabian; Kölling, Christian; Staupendahl, Kai; Knoke, Thomas

    2015-02-01

    Shifts in tree species distributions caused by climatic change are expected to cause severe losses in the economic value of European forestland. However, this projection disregards potential adaptation options such as tree species conversion, shorter production periods, or establishment of mixed species forests. The effect of tree species mixture has, as yet, not been quantitatively investigated for its potential to mitigate future increases in production risks. For the first time, we use survival time analysis to assess the effects of climate, species mixture and soil condition on survival probabilities for Norway spruce and European beech. Accelerated Failure Time (AFT) models based on an extensive dataset of almost 65,000 trees from the European Forest Damage Survey (FDS)--part of the European-wide Level I monitoring network--predicted a 24% decrease in survival probability for Norway spruce in pure stands at age 120 when unfavorable changes in climate conditions were assumed. Increasing species admixture greatly reduced the negative effects of unfavorable climate conditions, resulting in a decline in survival probabilities of only 7%. We conclude that future studies of forest management under climate change as well as forest policy measures need to take this, as yet unconsidered, strongly advantageous effect of tree species mixture into account. © 2014 John Wiley & Sons Ltd.

  11. Surveillance and Datalink Communication Performance Analysis for Distributed Separation Assurance System Architectures

    NASA Technical Reports Server (NTRS)

    Chung, William W.; Linse, Dennis J.; Alaverdi, Omeed; Ifarraguerri, Carlos; Seifert, Scott C.; Salvano, Dan; Calender, Dale

    2012-01-01

    This study investigates the effects of two technical enablers of the Federal Aviation Administration's Next Generation Air Transportation System (NextGen), Automatic Dependent Surveillance-Broadcast (ADS-B) and digital datalink communication, on overall separation assurance performance under two separation assurance (SA) system architectures: ground-based SA and airborne SA. Datalink performance, such as the successful reception probability of both surveillance and communication messages, and surveillance accuracy are examined under various operational conditions. Required SA performance is evaluated as a function of subsystem performance, using availability, continuity, and integrity metrics to establish overall required separation assurance performance under normal and off-nominal conditions.

  12. Quantum Common Causes and Quantum Causal Models

    NASA Astrophysics Data System (ADS)

    Allen, John-Mark A.; Barrett, Jonathan; Horsman, Dominic C.; Lee, Ciarán M.; Spekkens, Robert W.

    2017-07-01

    Reichenbach's principle asserts that if two observed variables are found to be correlated, then there should be a causal explanation of these correlations. Furthermore, if the explanation is in terms of a common cause, then the conditional probability distribution over the variables given the complete common cause should factorize. The principle is generalized by the formalism of causal models, in which the causal relationships among variables constrain the form of their joint probability distribution. In the quantum case, however, the observed correlations in Bell experiments cannot be explained in the manner Reichenbach's principle would seem to demand. Motivated by this, we introduce a quantum counterpart to the principle. We demonstrate that under the assumption that quantum dynamics is fundamentally unitary, if a quantum channel with input A and outputs B and C is compatible with A being a complete common cause of B and C , then it must factorize in a particular way. Finally, we show how to generalize our quantum version of Reichenbach's principle to a formalism for quantum causal models and provide examples of how the formalism works.

  13. A Measure Approximation for Distributionally Robust PDE-Constrained Optimization Problems

    DOE PAGES

    Kouri, Drew Philip

    2017-12-19

    In numerous applications, scientists and engineers acquire varied forms of data that partially characterize the inputs to an underlying physical system. These data are then used to inform decisions such as controls and designs. Consequently, it is critical that the resulting control or design is robust to the inherent uncertainties associated with the unknown probabilistic characterization of the model inputs. In this work, we consider optimal control and design problems constrained by partial differential equations with uncertain inputs. We do not assume a known probabilistic model for the inputs, but rather we formulate the problem as a distributionally robust optimization problem where the outer minimization problem determines the control or design, while the inner maximization problem determines the worst-case probability measure that matches desired characteristics of the data. We analyze the inner maximization problem in the space of measures and introduce a novel measure approximation technique, based on the approximation of continuous functions, to discretize the unknown probability measure. Finally, we prove consistency of our approximated min-max problem and conclude with numerical results.

  14. Determination of the Changes of Drought Occurrence in Turkey Using Regional Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sibel Saygili, Fatma; Tufan Turp, M.; Kurnaz, M. Levent

    2017-04-01

    As a consequence of the negative impacts of climate change, Turkey, a country in the Mediterranean Basin, is under serious risk of increased drought conditions. This study aims to determine and compare the spatial distributions of climatological drought probabilities for Turkey. For this purpose, by making use of the Regional Climate Model (RegCM4.4) of the Abdus Salam International Centre for Theoretical Physics (ICTP), the outputs of the MPI-ESM-MR global climate model of the Max Planck Institute for Meteorology are downscaled to 50 km for Turkey. To make the future projection over Turkey for the period 2071-2100 with respect to the reference period 1986-2005, the worst-case emission pathway RCP8.5 is used. The Palmer Drought Severity Index (PDSI) values are computed and classified in accordance with the seven classifications of the National Oceanic and Atmospheric Administration (NOAA). Finally, the spatial distribution maps showing the changes in drought probabilities over Turkey are obtained in order to see the impact of climate change on Turkey's drought patterns.

  15. Effect of bow-type initial imperfection on reliability of minimum-weight, stiffened structural panels

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Krishnamurthy, Thiagaraja; Sykes, Nancy P.; Elishakoff, Isaac

    1993-01-01

    Computations were performed to determine the effect of an overall bow-type imperfection on the reliability of structural panels under combined compression and shear loadings. A panel's reliability is the probability that it will perform the intended function - in this case, carry a given load without buckling or exceeding in-plane strain allowables. For a panel loaded in compression, a small initial bow can cause large bending stresses that reduce both the buckling load and the load at which strain allowables are exceeded; hence, the bow reduces the reliability of the panel. In this report, analytical studies on two stiffened panels quantified that effect. The bow is in the shape of a half-sine wave along the length of the panel. The size e of the bow at panel midlength is taken to be the single random variable. Several probability density distributions for e are examined to determine the sensitivity of the reliability to details of the bow statistics. In addition, the effects of quality control are explored with truncated distributions.

  16. Ecological and evolutionary processes at expanding range margins.

    PubMed

    Thomas, C D; Bodsworth, E J; Wilson, R J; Simmons, A D; Davies, Z G; Musche, M; Conradt, L

    2001-05-31

    Many animals are regarded as relatively sedentary and specialized in marginal parts of their geographical distributions. They are expected to be slow at colonizing new habitats. Despite this, the cool margins of many species' distributions have expanded rapidly in association with recent climate warming. We examined four insect species that have expanded their geographical ranges in Britain over the past 20 years. Here we report that two butterfly species have increased the variety of habitat types that they can colonize, and that two bush cricket species show increased fractions of longer-winged (dispersive) individuals in recently founded populations. Both ecological and evolutionary processes are probably responsible for these changes. Increased habitat breadth and dispersal tendencies have resulted in about 3- to 15-fold increases in expansion rates, allowing these insects to cross habitat disjunctions that would have represented major or complete barriers to dispersal before the expansions started. The emergence of dispersive phenotypes will increase the speed at which species invade new environments, and probably underlies the responses of many species to both past and future climate change.

  17. Sources and distribution of aromatic hydrocarbons in a tropical marine protected area estuary under influence of sugarcane cultivation.

    PubMed

    Arruda-Santos, Roxanny Helen de; Schettini, Carlos Augusto França; Yogui, Gilvan Takeshi; Maciel, Daniele Claudino; Zanardi-Lamardo, Eliete

    2018-05-15

    The Goiana estuary is a well-preserved marine protected area (MPA) located on the northeastern coast of Brazil. Despite its current state, human activities in the watershed represent a potential threat to long-term local preservation. Dissolved/dispersed aromatic hydrocarbons and polycyclic aromatic hydrocarbons (PAHs) were investigated in water and sediments across the estuarine salt gradient. Concentrations of aromatic hydrocarbons were low in all samples. According to the results, aromatic hydrocarbons are associated with suspended particulate matter (SPM) carried to the estuary by river waters. An estuarine turbidity maximum (ETM) was identified in the upper estuary, indicating that both sediments and contaminants are trapped prior to occasional export to the adjacent sea. The distribution of PAHs in sediments was associated with organic matter and mud content. Diagnostic ratios indicated pyrolytic processes as the main local source of PAHs, probably associated with sugarcane burning and combustion engines. The low PAH concentrations probably do not cause adverse biological effects to the local biota, although their presence indicates anthropogenic contamination and pressure on the Goiana estuary MPA. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. An analytical model for regular respiratory signals derived from the probability density function of Rayleigh distribution.

    PubMed

    Li, Xin; Li, Ye

    2015-01-01

    Regular respiratory signals (RRSs) acquired with physiological sensing systems (e.g., the life-detection radar system) can be used to locate survivors trapped in debris in disaster rescue, or to predict the breathing motion to allow beam delivery under free breathing conditions in external beam radiotherapy. Among the existing analytical models for RRSs, the harmonic-based random model (HRM) is shown to be the most accurate, which, however, is found to be subject to considerable error if the RRS has a slowly descending end-of-exhale (EOE) phase. The defect of the HRM motivates us to construct a more accurate analytical model for the RRS. In this paper, we derive a new analytical RRS model from the probability density function of the Rayleigh distribution. We evaluate the derived RRS model by using it to fit a real-life RRS in the least-squares sense, and the evaluation shows that our model exhibits lower error and fits the slowly descending EOE phases of the real-life RRS better than the HRM.

  19. Nonparametric density estimation and optimal bandwidth selection for protein unfolding and unbinding data

    NASA Astrophysics Data System (ADS)

    Bura, E.; Zhmurov, A.; Barsegov, V.

    2009-01-01

    Dynamic force spectroscopy and steered molecular simulations have become powerful tools for analyzing the mechanical properties of proteins, and the strength of protein-protein complexes and aggregates. Probability density functions of the unfolding forces and unfolding times for proteins, and rupture forces and bond lifetimes for protein-protein complexes allow quantification of the forced unfolding and unbinding transitions, and mapping the biomolecular free energy landscape. The inference of the unknown probability distribution functions from the experimental and simulated forced unfolding and unbinding data, as well as the assessment of analytically tractable models of the protein unfolding and unbinding requires the use of a bandwidth. The choice of this quantity is typically subjective as it draws heavily on the investigator's intuition and past experience. We describe several approaches for selecting the "optimal bandwidth" for nonparametric density estimators, such as the traditionally used histogram and the more advanced kernel density estimators. The performance of these methods is tested on unimodal and multimodal skewed, long-tailed distributed data, as typically observed in force spectroscopy experiments and in molecular pulling simulations. The results of these studies can serve as a guideline for selecting the optimal bandwidth to resolve the underlying distributions from the forced unfolding and unbinding data for proteins.
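
    As a rough illustration of the bandwidth-selection problem discussed here, the sketch below compares Silverman's rule of thumb with a simple leave-one-out likelihood scan for a Gaussian kernel estimator; the Gumbel-distributed "force" data and the scanned factor grid are assumptions, not the paper's data or its full set of selectors.

        # Two bandwidth choices for a Gaussian kernel density estimate of skewed, long-tailed
        # "unfolding force"-like data: Silverman's rule of thumb vs. a leave-one-out scan.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(3)
        data = rng.gumbel(loc=150.0, scale=25.0, size=300)    # hypothetical forces (pN)

        silverman = gaussian_kde(data, bw_method="silverman")
        print("Silverman bandwidth factor      :", round(silverman.factor, 3))

        def loo_log_likelihood(factor):
            # Leave-one-out log likelihood of a Gaussian KDE with the given bandwidth factor.
            ll = 0.0
            for i in range(len(data)):
                kde = gaussian_kde(np.delete(data, i), bw_method=factor)
                ll += np.log(kde(data[i:i + 1])[0])
            return ll

        factors = np.linspace(0.1, 1.0, 10)
        best = max(factors, key=loo_log_likelihood)
        print("cross-validated bandwidth factor:", round(best, 3))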

  20. The impact of individual-level heterogeneity on estimated infectious disease burden: a simulation study.

    PubMed

    McDonald, Scott A; Devleesschauwer, Brecht; Wallinga, Jacco

    2016-12-08

    Disease burden is not evenly distributed within a population; this uneven distribution can be due to individual heterogeneity in progression rates between disease stages. Composite measures of disease burden that are based on disease progression models, such as the disability-adjusted life year (DALY), are widely used to quantify the current and future burden of infectious diseases. Our goal was to investigate to what extent ignoring the presence of heterogeneity could bias DALY computation. Simulations using individual-based models for hypothetical infectious diseases with short and long natural histories were run assuming either "population-averaged" progression probabilities between disease stages, or progression probabilities that were influenced by an a priori defined individual-level frailty (i.e., heterogeneity in disease risk) distribution, and DALYs were calculated. Under the assumption of heterogeneity in transition rates and increasing frailty with age, the short natural history disease model predicted 14% fewer DALYs compared with the homogenous population assumption. Simulations of a long natural history disease indicated that assuming homogeneity in transition rates when heterogeneity was present could overestimate total DALYs, in the present case by 4% (95% quantile interval: 1-8%). The consequences of ignoring population heterogeneity should be considered when defining transition parameters for natural history models and when interpreting the resulting disease burden estimates.

  1. Climatology and variability of SST frontal activity in Eastern Pacific Ocean over the past decade

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Yuan, Y.

    2016-12-01

    The distribution of sea surface temperature (SST) fronts is derived from the high-resolution MODIS dataset for the Eastern Pacific Ocean from 2003 to 2015. The daily distribution of frontal activity shows detailed features and movement of fronts, as well as discontinuities in front tracks caused by cloud coverage. Monthly frontal probability is calculated to investigate the corresponding climatology and variability. Frontal probability is generally higher along the coast and decreases offshore. Frontal activity can extend a few hundred kilometers near the major capes and in the central Pacific Ocean. The SST gradient associated with fronts changes with latitude, with stronger gradients near mid-latitudes and under major topographic effects near the tropics. Corresponding results from empirical orthogonal functions (EOF) show that the major variability of SST fronts is found in the mid-latitudes and the central Pacific Ocean. The temporal variability captures strong interannual and annual variability in those regions, while intra-annual variability is found to be more important at small scales near major capes and topographic features. The frontal variability is strongly affected by wind stress, upwelling, air-sea interaction, currents, topography, eddy activity, and El Niño, along with other factors. Fronts play an important role in influencing the distribution of nutrients, the activity of fisheries, and the development of ecosystems.

  2. Direct test of the Gaussian auxiliary field ansatz in nonconserved order parameter phase ordering dynamics

    NASA Astrophysics Data System (ADS)

    Yeung, Chuck

    2018-06-01

    The assumption that the local order parameter is related to an underlying spatially smooth auxiliary field, u(r⃗,t), is a common feature in theoretical approaches to non-conserved order parameter phase separation dynamics. In particular, the ansatz that u(r⃗,t) is a Gaussian random field leads to predictions for the decay of the autocorrelation function which are consistent with observations, but distinct from predictions using alternative theoretical approaches. In this paper, the auxiliary field is obtained directly from simulations of the time-dependent Ginzburg-Landau equation in two and three dimensions. The results show that u(r⃗,t) is equivalent to the distance to the nearest interface. In two dimensions, the probability distribution, P(u), is well approximated as Gaussian except for small values of u/L(t), where L(t) is the characteristic length-scale of the patterns. The behavior of P(u) in three dimensions is more complicated; the non-Gaussian region for small u/L(t) is much larger than that in two dimensions but the tails of P(u) begin to approach a Gaussian form at intermediate times. However, at later times, the tails of the probability distribution appear to decay faster than a Gaussian distribution.

  3. Weak gravitational lensing effects on the determination of Ω_m and Ω_Λ from SNeIa

    NASA Astrophysics Data System (ADS)

    Valageas, P.

    2000-02-01

    In this article we present an analytical calculation of the probability distribution of the magnification of distant sources due to weak gravitational lensing from non-linear scales. We use a realistic description of the non-linear density field, which has already been compared with numerical simulations of structure formation within hierarchical scenarios. Then, we can directly express the probability distribution P(μ) of the magnification in terms of the probability distribution of the density contrast realized on non-linear scales (typical of galaxies) where the local slope of the initial linear power-spectrum is n = -2. We recover the behaviour seen by numerical simulations: P(μ) peaks at a value slightly smaller than the mean ⟨μ⟩ = 1 and it shows an extended large-μ tail (as described in another article, our predictions also show a good quantitative agreement with results from N-body simulations for a finite smoothing angle). Then, we study the effects of weak lensing on the derivation of the cosmological parameters from SNeIa. We show that the inaccuracy introduced by weak lensing is not negligible: ΔΩ_m ≳ 0.3 for two observations at z_s = 0.5 and z_s = 1. However, observations can unambiguously discriminate between Ω_m = 0.3 and Ω_m = 1. Moreover, in the case of a low-density universe one can clearly distinguish an open model from a flat cosmology (besides, the error decreases as the number of observed SNeIa increases). Since distant sources are more likely to be "demagnified", the most probable value of the observed density parameter Ω_m is slightly smaller than its actual value. On the other hand, one may obtain some valuable information on the properties of the underlying non-linear density field from the measure of weak lensing distortions.

  4. Mean apparent propagator (MAP) MRI: a novel diffusion imaging method for mapping tissue microstructure.

    PubMed

    Özarslan, Evren; Koay, Cheng Guan; Shepherd, Timothy M; Komlosh, Michal E; İrfanoğlu, M Okan; Pierpaoli, Carlo; Basser, Peter J

    2013-09-01

    Diffusion-weighted magnetic resonance (MR) signals reflect information about underlying tissue microstructure and cytoarchitecture. We propose a quantitative, efficient, and robust mathematical and physical framework for representing diffusion-weighted MR imaging (MRI) data obtained in "q-space," and the corresponding "mean apparent propagator (MAP)" describing molecular displacements in "r-space." We also define and map novel quantitative descriptors of diffusion that can be computed robustly using this MAP-MRI framework. We describe efficient analytical representation of the three-dimensional q-space MR signal in a series expansion of basis functions that accurately describes diffusion in many complex geometries. The lowest order term in this expansion contains a diffusion tensor that characterizes the Gaussian displacement distribution, equivalent to diffusion tensor MRI (DTI). Inclusion of higher order terms enables the reconstruction of the true average propagator whose projection onto the unit "displacement" sphere provides an orientational distribution function (ODF) that contains only the orientational dependence of the diffusion process. The representation characterizes novel features of diffusion anisotropy and the non-Gaussian character of the three-dimensional diffusion process. Other important measures this representation provides include the return-to-the-origin probability (RTOP), and its variants for diffusion in one- and two-dimensions-the return-to-the-plane probability (RTPP), and the return-to-the-axis probability (RTAP), respectively. These zero net displacement probabilities measure the mean compartment (pore) volume and cross-sectional area in distributions of isolated pores irrespective of the pore shape. MAP-MRI represents a new comprehensive framework to model the three-dimensional q-space signal and transform it into diffusion propagators. Experiments on an excised marmoset brain specimen demonstrate that MAP-MRI provides several novel, quantifiable parameters that capture previously obscured intrinsic features of nervous tissue microstructure. This should prove helpful for investigating the functional organization of normal and pathologic nervous tissue. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Delay Analysis and Optimization of Bandwidth Request under Unicast Polling in IEEE 802.16e over Gilbert-Elliot Error Channel

    NASA Astrophysics Data System (ADS)

    Hwang, Eunju; Kim, Kyung Jae; Roijers, Frank; Choi, Bong Dae

    In the centralized polling mode in IEEE 802.16e, a base station (BS) polls mobile stations (MSs) for bandwidth reservation in one of three polling modes: unicast, multicast, or broadcast polling. In unicast polling, the BS polls each individual MS to allow it to transmit a bandwidth request packet. This paper presents an analytical model for the unicast polling of bandwidth requests in IEEE 802.16e networks over the Gilbert-Elliot error channel. We derive the probability distribution for the delay of bandwidth requests due to wireless transmission errors and find the loss probability of request packets due to finite retransmission attempts. By using the delay distribution and the loss probability, we optimize the number of polling slots within a frame and the maximum retransmission number while satisfying QoS on the total loss probability, which combines two losses: packet loss due to exceeding the maximum number of retransmissions and delay outage loss due to the maximum tolerable delay bound. In addition, we obtain the utilization of polling slots, which is defined as the ratio of the number of polling slots used for the MS's successful transmissions to the total number of polling slots used by the MS over a long run time. Analysis results are shown to match well with simulation results. Numerical results give examples of the optimal number of polling slots within a frame and the optimal maximum retransmission number depending on delay bounds, the number of MSs, and the channel conditions.

  6. Modelling the economic impact of three lameness causing diseases using herd and cow level evidence.

    PubMed

    Ettema, Jehan; Østergaard, Søren; Kristensen, Anders Ringgaard

    2010-06-01

    Diseases of the cow's hooves, interdigital skin and legs are highly prevalent and of large economic impact in modern dairy farming. In order to support farmers' decisions on preventing and treating lameness and its underlying causes, decision support models can be used to predict the economic profitability of such actions. An existing approach of modelling lameness as one health disorder in a dynamic, stochastic and mechanistic simulation model has been improved in two ways. First of all, three underlying diseases causing lameness were modelled: digital dermatitis, interdigital hyperplasia and claw horn diseases. Secondly, the existing simulation model was set up so that it uses hyper-distributions describing the disease risk of the three lameness-causing diseases. By combining information on herd-level risk factors with the prevalence of lameness or the prevalence of underlying diseases among cows, marginal posterior probability distributions for disease prevalence in the specific herd are created in a Bayesian network. Random draws from these distributions are used by the simulation model to describe disease risk. Hereby, field data on prevalence are used systematically and uncertainty around herd-specific risk is represented. Besides the fact that the estimated profitability of halving disease risk depended on the hyper-distributions used, the estimates differed for herds with different levels of disease risk and reproductive efficiency. (c) 2010 Elsevier B.V. All rights reserved.

  7. Does Breast Cancer Drive the Building of Survival Probability Models among States? An Assessment of Goodness of Fit for Patient Data from SEER Registries

    PubMed

    Khan, Hafiz; Saxena, Anshul; Perisetti, Abhilash; Rafiq, Aamrin; Gabbidon, Kemesha; Mende, Sarah; Lyuksyutova, Maria; Quesada, Kandi; Blakely, Summre; Torres, Tiffany; Afesse, Mahlet

    2016-12-01

    Background: Breast cancer is a worldwide public health concern and is the most prevalent type of cancer in women in the United States. This study concerned the best fit of statistical probability models on the basis of survival times for nine state cancer registries: California, Connecticut, Georgia, Hawaii, Iowa, Michigan, New Mexico, Utah, and Washington. Materials and Methods: A probability random sampling method was applied to select and extract records of 2,000 breast cancer patients from the Surveillance Epidemiology and End Results (SEER) database for each of the nine state cancer registries used in this study. EasyFit software was utilized to identify the best probability models by using goodness of fit tests, and to estimate parameters for various statistical probability distributions that fit survival data. Results: Statistical analysis for the summary of statistics is reported for each of the states for the years 1973 to 2012. Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared goodness of fit test values were used for survival data, the highest values of goodness of fit statistics being considered indicative of the best fit survival model for each state. Conclusions: It was found that California, Connecticut, Georgia, Iowa, New Mexico, and Washington followed the Burr probability distribution, while the Dagum probability distribution gave the best fit for Michigan and Utah, and Hawaii followed the Gamma probability distribution. These findings highlight differences between states through selected sociodemographic variables and also demonstrate probability modeling differences in breast cancer survival times. The results of this study can be used to guide healthcare providers and researchers for further investigations into social and environmental factors in order to reduce the occurrence of and mortality due to breast cancer. Creative Commons Attribution License
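
    The model-screening step described above can be mimicked as in the sketch below: fit several candidate families to survival times and compare goodness-of-fit statistics. The data are simulated stand-ins rather than SEER records, and scipy's mielke family is used here as a Dagum-type candidate.

        # Fit candidate distributions to (synthetic) survival times and compare
        # Kolmogorov-Smirnov statistics; smaller KS values indicate a closer fit here.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        survival_months = rng.gamma(shape=1.8, scale=40.0, size=2000)   # synthetic stand-in data

        candidates = {
            "gamma": stats.gamma,
            "burr12 (Burr)": stats.burr12,
            "mielke (Dagum-type)": stats.mielke,
        }
        for name, dist in candidates.items():
            params = dist.fit(survival_months, floc=0)       # fix location at zero
            ks = stats.kstest(survival_months, dist(*params).cdf)
            print(f"{name:20s} KS = {ks.statistic:.4f}")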

  8. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
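
    A toy simulation of the central claim is sketched below (my construction, not the authors' exact plasticity rule): Hebbian co-occurrence counting with presynaptic (row) normalization recovers the forward transition matrix of the generating Markov chain.

        import numpy as np

        rng = np.random.default_rng(5)
        P = np.array([[0.1, 0.6, 0.3],        # ground-truth forward transition matrix
                      [0.5, 0.2, 0.3],
                      [0.3, 0.3, 0.4]])

        T, n = 50_000, P.shape[0]
        states = np.empty(T, dtype=int)       # generate a long state sequence from the chain
        states[0] = 0
        for t in range(1, T):
            states[t] = rng.choice(n, p=P[states[t - 1]])

        W = np.zeros((n, n))                  # Hebbian co-occurrence of (pre = s_t, post = s_{t+1})
        for pre, post in zip(states[:-1], states[1:]):
            W[pre, post] += 1.0

        W_pre = W / W.sum(axis=1, keepdims=True)   # presynaptic (row) normalization
        print(np.round(W_pre, 3))                  # approximates the forward probabilities P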

  9. The probability of misassociation between neighboring targets

    NASA Astrophysics Data System (ADS)

    Areta, Javier A.; Bar-Shalom, Yaakov; Rothrock, Ronald

    2008-04-01

    This paper presents procedures to calculate the probability that the measurement originating from an extraneous target will be (mis)associated with a target of interest for the cases of Nearest Neighbor and Global association. It is shown that these misassociation probabilities depend, under certain assumptions, on a particular - covariance weighted - norm of the difference between the targets' predicted measurements. For the Nearest Neighbor association, the exact solution, obtained for the case of equal innovation covariances, is based on a noncentral chi-square distribution. An approximate solution is also presented for the case of unequal innovation covariances. For the Global case an approximation is presented for the case of "similar" innovation covariances. In the general case of unequal innovation covariances where this approximation fails, an exact method based on the inversion of the characteristic function is presented. The theoretical results, confirmed by Monte Carlo simulations, quantify the benefit of Global vs. Nearest Neighbor association. These results are applied to problems of single sensor as well as centralized fusion architecture multiple sensor tracking.
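
    The misassociation event can be illustrated with a simplified Monte Carlo sketch for the equal-innovation-covariance case; the two-dimensional measurement space, the covariance matrix, and the reduction to a single gate around the target of interest are assumptions made for illustration rather than the paper's exact setup.

        # Estimate the probability that the neighboring target's measurement falls closer (in
        # Mahalanobis distance) to the predicted measurement of the target of interest than the
        # target's own measurement, as a function of the covariance-weighted separation.
        import numpy as np

        rng = np.random.default_rng(6)
        S = np.array([[2.0, 0.5],                 # common innovation covariance (assumed)
                      [0.5, 1.0]])
        L = np.linalg.cholesky(S)
        Sinv = np.linalg.inv(S)
        n_mc = 200_000

        def misassociation_probability(delta):
            """delta: difference between the two targets' predicted measurements."""
            z_own = rng.normal(size=(n_mc, 2)) @ L.T            # measurement of the target of interest
            z_other = delta + rng.normal(size=(n_mc, 2)) @ L.T  # measurement of the neighboring target
            d_own = np.einsum("ij,jk,ik->i", z_own, Sinv, z_own)
            d_other = np.einsum("ij,jk,ik->i", z_other, Sinv, z_other)
            return np.mean(d_other < d_own)       # nearest-neighbor rule picks the wrong measurement

        for sep in (0.5, 1.0, 2.0, 4.0):
            delta = sep * (L @ np.array([1.0, 0.0]))   # covariance-weighted separation equal to sep
            print(f"weighted separation {sep}: P(misassociation) ~ {misassociation_probability(delta):.3f}")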

  10. Uncertainty in estimates of the number of extraterrestrial civilizations

    NASA Technical Reports Server (NTRS)

    Sturrock, P. A.

    1980-01-01

    An estimate of the number N of communicative civilizations is made by means of Drake's formula, which involves the combination of several quantities, each of which is to some extent uncertain. It is shown that the uncertainty in any quantity may be represented by a probability distribution function, even if that quantity is itself a probability. The uncertainty of current estimates of N derives principally from uncertainty in estimates of the lifetime of advanced civilizations. It is argued that this is due primarily to uncertainty concerning the existence of a Galactic Federation, which is in turn contingent upon uncertainty about whether the limitations of present-day physics are absolute or (in the event that there exists a yet undiscovered hyperphysics) transient. It is further argued that it is advantageous to consider explicitly these underlying assumptions in order to compare the probable numbers of civilizations operating radio beacons, permitting radio leakage, dispatching probes for radio surveillance, or dispatching vehicles for manned surveillance.

  11. Are there common mathematical structures in economics and physics?

    NASA Astrophysics Data System (ADS)

    Mimkes, Jürgen

    2016-12-01

    Economics is a field that looks into the future. We may know a few things ahead (ex ante), but most things we only know afterwards (ex post). How can we work in a field where much of the important information is missing? Mathematics gives two answers: 1. Probability theory leads to microeconomics: the Lagrange function optimizes utility under constraints of economic terms (like costs). The utility function is the entropy, the logarithm of probability. The optimal result is given by a probability distribution and an integrating factor. 2. Calculus leads to macroeconomics: in economics we have two production factors, capital and labour. This requires two-dimensional calculus with exact and non-exact differentials, which represent the "ex ante" and "ex post" terms of economics. An integrating factor turns a non-exact term (like income) into an exact term (entropy, the natural production function). The integrating factor is the same as in microeconomics and turns the non-exact field of economics into an exact physical science.
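
    The optimization alluded to in point 1 is the standard maximum-entropy calculation; the worked equations below are my rendering (with a generic cost c_i and mean-cost constraint c̄), showing how the Lagrange multiplier plays the role of the integrating factor.

        % Standard maximum-entropy calculation (my rendering of the abstract's point 1):
        % maximize entropy subject to a mean-cost constraint and normalization.
        \begin{align*}
          \mathcal{L} &= -\sum_i p_i \ln p_i
                         - \lambda\Big(\sum_i p_i c_i - \bar{c}\Big)
                         - \mu\Big(\sum_i p_i - 1\Big), \\
          \frac{\partial \mathcal{L}}{\partial p_i} &= -\ln p_i - 1 - \lambda c_i - \mu = 0
          \quad\Longrightarrow\quad
          p_i = \frac{e^{-\lambda c_i}}{Z}, \qquad Z = \sum_j e^{-\lambda c_j},
        \end{align*}
        % so the optimal distribution is exponential in the cost c_i, and lambda (fixed by the
        % constraint on the mean cost) acts as the integrating factor.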

  12. Infinite capacity multi-server queue with second optional service channel

    NASA Astrophysics Data System (ADS)

    Ke, Jau-Chuan; Wu, Chia-Huang; Pearn, Wen Lea

    2013-02-01

    This paper deals with an infinite-capacity multi-server queueing system with a second optional service (SOS) channel. The inter-arrival times of arriving customers, the service times of the first essential service (FES) and the SOS channel are all exponentially distributed. A customer may leave the system after the FES channel with probability (1-θ), or may immediately require a SOS at the completion of the FES with probability θ (0 ≤ θ ≤ 1). The formulae for computing the rate matrix and stationary probabilities are derived by means of a matrix-analytical approach. A cost model is developed to determine the optimal values of the number of servers and the two service rates simultaneously, at the minimal total expected cost per unit time. The quasi-Newton method is employed to deal with the optimization problem. Under optimal operating conditions, numerical results are provided in which several system performance measures are calculated based on assumed numerical values of the system parameters.

  13. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  14. A method to deconvolve stellar rotational velocities II. The probability distribution function via Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia

    2016-10-01

    Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin I, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, Lucy estimation lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin I data directly without the need for any convergence criteria.
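
    The core numerical step (discretize the Fredholm integral of the first kind, then solve the Tikhonov-regularized normal equations) can be sketched as below; the Gaussian smoothing kernel stands in for the actual v sin I projection kernel, and the grid, noise level, and regularization parameter are illustrative assumptions.

        # Discretize y = K f and recover f via Tikhonov regularization:
        #     f_lambda = argmin ||K f - y||^2 + lambda ||f||^2 = (K^T K + lambda I)^{-1} K^T y
        import numpy as np

        rng = np.random.default_rng(7)
        n = 200
        v = np.linspace(0.0, 1.0, n)
        dv = v[1] - v[0]

        K = dv * np.exp(-0.5 * ((v[:, None] - v[None, :]) / 0.05) ** 2)   # discretized kernel (stand-in)
        f_true = np.exp(-0.5 * ((v - 0.4) / 0.08) ** 2)                   # "true" velocity distribution
        y = K @ f_true + rng.normal(0.0, 1e-3, n)                         # noisy observed data

        lam = 1e-4
        f_hat = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y)
        print("relative reconstruction error:", np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true))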

  15. Nuclear Forensics Analysis with Missing and Uncertain Data

    DOE PAGES

    Langan, Roisin T.; Archibald, Richard K.; Lamberti, Vincent

    2015-10-05

    We have applied a new imputation-based method for analyzing incomplete data, called Monte Carlo Bayesian Database Generation (MCBDG), to the Spent Fuel Isotopic Composition (SFCOMPO) database, in which about 60% of the entries are missing. The method estimates missing values of a property from a probability distribution created from the existing data for that property, and then generates multiple instances of the completed database for training a machine learning algorithm. Uncertainty in the data is represented by an empirical or an assumed error distribution. The method makes few assumptions about the underlying data and compares favorably against results obtained by replacing missing information with constant values.
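
    A toy sketch in the spirit of the imputation step described above (not the authors' MCBDG implementation): each missing value of a property is drawn from the empirical distribution of the observed values for that property, and several completed copies of the table are generated for downstream training. The table, column names, and number of instances are all assumptions.

      # Hypothetical sketch: fill missing entries by sampling from the empirical
      # distribution of each column, and generate multiple completed instances.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)

      # Toy table with missing values (stand-in for an isotopic-composition record).
      df = pd.DataFrame({
          "burnup":     [10.2, np.nan, 35.1, 22.4, np.nan],
          "enrichment": [3.2, 4.1, np.nan, 3.6, 4.5],
      })

      def complete_once(table, rng):
          filled = table.copy()
          for col in filled.columns:
              observed = filled[col].dropna().to_numpy()
              mask = filled[col].isna()
              # Draw each missing value from the observed values of the column.
              filled.loc[mask, col] = rng.choice(observed, size=mask.sum())
          return filled

      # Generate several completed instances of the database.
      instances = [complete_once(df, rng) for _ in range(10)]
      print(instances[0])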

  16. Performance analysis of dual-hop optical wireless communication systems over k-distribution turbulence channel with pointing error

    NASA Astrophysics Data System (ADS)

    Mishra, Neha; Sriram Kumar, D.; Jha, Pranav Kumar

    2017-06-01

    In this paper, we investigate the performance of dual-hop free space optical (FSO) communication systems under strong atmospheric turbulence combined with misalignment effects (pointing error). We consider a relay-assisted link using the decode-and-forward (DF) relaying protocol between source and destination, with the assumption that channel state information is available at both the transmitting and receiving terminals. The atmospheric turbulence channels are modeled by the k-distribution with pointing-error impairment. Exact closed-form expressions are derived for the outage probability and bit error rate and are illustrated through numerical plots. BER results are further compared for different modulation schemes.
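
    A hedged Monte Carlo sketch of the outage probability of a dual-hop DF link over k-distributed turbulence, using the standard gamma-exponential compound representation of the k-distribution and assumed values for the channel parameter, average SNR, and threshold; pointing error is omitted for brevity, and the paper's closed-form expressions are not reproduced.

      # Hypothetical Monte Carlo estimate of dual-hop DF outage probability
      # over k-distributed turbulence (pointing error omitted for simplicity).
      import numpy as np

      rng = np.random.default_rng(2)
      N = 1_000_000
      ALPHA = 2.0          # k-distribution channel parameter (assumed)
      AVG_SNR_DB = 15.0    # average electrical SNR per hop (assumed)
      TH_SNR_DB = 5.0      # outage threshold (assumed)

      def k_distributed_irradiance(alpha, size, rng):
          """Unit-mean k-distributed irradiance as a gamma-exponential compound."""
          mean = rng.gamma(shape=alpha, scale=1.0 / alpha, size=size)
          return rng.exponential(scale=mean)

      avg_snr = 10 ** (AVG_SNR_DB / 10)
      th_snr = 10 ** (TH_SNR_DB / 10)

      # Intensity-modulated link: electrical SNR taken to scale with the
      # square of the irradiance (assumption for this sketch).
      snr1 = avg_snr * k_distributed_irradiance(ALPHA, N, rng) ** 2
      snr2 = avg_snr * k_distributed_irradiance(ALPHA, N, rng) ** 2

      # Decode-and-forward: the end-to-end link is in outage if either hop is.
      outage = (np.minimum(snr1, snr2) < th_snr).mean()
      print("estimated outage probability:", outage)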

  17. The investigation of the lateral interaction effects on traffic flow behavior under open boundaries

    NASA Astrophysics Data System (ADS)

    Bouadi, M.; Jetto, K.; Benyoussef, A.; El Kenz, A.

    2017-11-01

    In this paper, a traffic flow system with open boundaries is studied, taking into account lateral interaction in the presence of spatial defects. For a random distribution of defects, if the vehicle velocities are weakly correlated, the traffic phases can be predicted by considering the corresponding inflow and outflow functions. Conversely, if the vehicle velocities are strongly correlated, a phase segregation appears in the bulk of the system, which induces the appearance of the maximum-current phase. This velocity correlation depends mainly on the defect densities and the probabilities of lateral deceleration. However, for a compact distribution of defects, the traffic phases can be predicted from the inflow at the entrance of the system, the inflow entering the defect zone, and the outflow function.
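
    A toy open-boundary cellular-automaton sketch in the spirit of this setting: a single-lane TASEP with a compact zone of slow sites, not the authors' two-lane lateral-interaction model. The injection, extraction, and hopping probabilities are assumed values, and the time-averaged density profile is measured to see how the defect zone reshapes the bulk.

      # Hypothetical toy model: single-lane TASEP with open boundaries and a
      # compact zone of slow sites (defects); not the paper's two-lane model.
      import numpy as np

      rng = np.random.default_rng(3)
      L = 200                       # number of sites
      ALPHA, BETA = 0.6, 0.6        # injection / extraction probabilities (assumed)
      P_FREE, P_DEFECT = 1.0, 0.4   # hopping probabilities (assumed)
      DEFECT = slice(90, 110)       # compact defect zone

      hop = np.full(L, P_FREE)
      hop[DEFECT] = P_DEFECT

      road = np.zeros(L, dtype=int)
      density = np.zeros(L)
      steps, warmup = 200_000, 50_000

      for t in range(steps):
          # Random-sequential update: pick one bond (or a boundary) per step.
          i = rng.integers(-1, L)
          if i == -1:                                    # injection at site 0
              if road[0] == 0 and rng.random() < ALPHA:
                  road[0] = 1
          elif i == L - 1:                               # extraction at the last site
              if road[L - 1] == 1 and rng.random() < BETA:
                  road[L - 1] = 0
          else:                                          # bulk hop i -> i+1
              if road[i] == 1 and road[i + 1] == 0 and rng.random() < hop[i]:
                  road[i], road[i + 1] = 0, 1
          if t >= warmup:
              density += road

      density /= steps - warmup
      print("mean density before / inside / after the defect zone:",
            density[:90].mean(), density[DEFECT].mean(), density[110:].mean())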

  18. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the occurrence probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of each fire scenario, several uncertainties are considered in the risk analysis. When calculating the onset time of untenable conditions, a range of design fires with different growth rates is considered, so that the uncertainty in the onset time can be characterized by a probability distribution. When calculating occupant evacuation time, the occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and of the onset time of untenable conditions, and fire risk to life safety can be evaluated from the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and the assessment results are compared with fire statistics.
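
    A simplified Monte Carlo sketch of the consequence step described above (not the paper's event-tree/Markov-chain machinery): the onset time of untenable conditions and the occupant evacuation time are drawn from assumed probability distributions, and the probability that evacuation is not completed in time is estimated from the samples. All distributions and parameters are illustrative assumptions.

      # Hypothetical sketch: probability that occupants fail to evacuate before
      # untenable conditions, with assumed distributions for both times.
      import numpy as np

      rng = np.random.default_rng(4)
      N = 500_000

      # Onset time of untenable conditions (s): mixture over assumed design fires
      # (slow / medium / fast growth rates) with assumed scenario weights.
      onset_means = np.array([600.0, 420.0, 300.0])
      weights = np.array([0.3, 0.5, 0.2])
      growth = rng.choice(len(onset_means), size=N, p=weights)
      onset = rng.normal(loc=onset_means[growth], scale=60.0)

      # Evacuation time (s): lognormal pre-movement time plus a fixed movement time.
      premovement = rng.lognormal(mean=np.log(120.0), sigma=0.5, size=N)
      movement = 150.0
      evacuation = premovement + movement

      # Consequence measure: probability that evacuation exceeds the onset time.
      p_fail = (evacuation > onset).mean()
      print("P(evacuation not completed before untenable conditions):", p_fail)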

  19. Brook trout distributional response to unconventional oil and gas development: Landscape context matters

    USGS Publications Warehouse

    Merriam, Eric R.; Petty, J. Todd; Maloney, Kelly O.; Young, John A.; Faulkner, Stephen; Slonecker, Terry; Milheim, Lesley E.; Hailegiorgis, Atesmachew; Niles, Jonathan M.

    2018-01-01

    We conducted a large-scale assessment of unconventional oil and gas (UOG) development effects on brook trout (Salvelinus fontinalis) distribution. We compiled 2231 brook trout collection records from the Upper Susquehanna River Watershed, USA. We used boosted regression tree (BRT) analysis to predict occurrence probability at the 1:24,000 stream-segment scale as a function of natural and anthropogenic landscape and climatic attributes. We then evaluated the importance of landscape context (i.e., pre-existing natural habitat quality and anthropogenic degradation) in modulating the effects of UOG on brook trout distribution under UOG development scenarios. BRT made use of 5 anthropogenic (28% relative influence) and 7 natural (72% relative influence) variables to model occurrence with a high degree of accuracy [Area Under the Receiver Operating Curve (AUC) = 0.85 and cross-validated AUC = 0.81]. UOG development impacted 11% (n = 2784) of streams and resulted in a loss of predicted occurrence in 126 (4%). Most streams impacted by UOG had unsuitable underlying natural habitat quality (n = 1220; 44%). Brook trout were predicted to be absent from an additional 26% (n = 733) of streams due to pre-existing non-UOG land uses (i.e., agriculture, residential and commercial development, or historic mining). Streams with a predicted and observed (via existing pre- and post-disturbance fish sampling records) loss of occurrence due to UOG tended to have intermediate natural habitat quality and/or intermediate levels of non-UOG stress. Simulated development of permitted but undeveloped UOG wells (n = 943) resulted in a loss of predicted occurrence in 27 additional streams. Loss of occurrence was strongly dependent upon landscape context, suggesting effects of current and future UOG development are likely most relevant in streams near the probability threshold due to pre-existing habitat degradation.
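
    A small illustrative sketch of the modeling step (boosted classification trees predicting occurrence probability, scored by AUC and cross-validated AUC) using scikit-learn on synthetic data; it is not the authors' BRT workflow, covariate set, or tuning, and all variable names and settings are assumptions.

      # Hypothetical sketch: gradient-boosted trees predicting presence/absence
      # from landscape-style covariates, scored by AUC and cross-validated AUC.
      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(5)
      n = 2000

      # Synthetic stand-ins for natural and anthropogenic stream-segment attributes.
      X = rng.normal(size=(n, 6))
      logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2]
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)  # occurrence

      model = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                         max_depth=3, random_state=0)
      model.fit(X, y)

      auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
      cv_auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
      print("training AUC:", round(auc, 3), "cross-validated AUC:", round(cv_auc, 3))
      print("relative importances:", np.round(model.feature_importances_, 3))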

  20. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lifetimes of many items and has a simple statistical form; its defining characteristic is a constant hazard rate, and it is a special case of the Weibull family (shape parameter equal to one). In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and present the corresponding analytic methods. The cases considered are limited to models with independent causes of failure, and a non-informative prior distribution is used in the analysis. We describe the likelihood function, the posterior function, and the resulting point, interval, hazard function, and reliability estimates. The net probability of failure when only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
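
    A small numerical sketch of the quantities named above for two independent exponential causes of failure, using a Jeffreys-type non-informative prior (proportional to 1/λ) so that each cause-specific rate has a gamma posterior; the failure counts, exposure time, and time horizon are assumptions, not the paper's example.

      # Hypothetical sketch: Bayesian exponential competing-risks quantities for
      # two independent causes, with a non-informative (1/lambda) prior.
      import numpy as np

      rng = np.random.default_rng(6)

      # Assumed data: each unit fails from exactly one cause; record the cause
      # counts and the total time at risk accumulated by all units.
      d = np.array([12, 5])        # failures attributed to cause 1 and cause 2
      T = 480.0                    # total exposure time (sum of all lifetimes)
      t = 30.0                     # time horizon for the probabilities below

      # With prior pi(lambda_j) proportional to 1/lambda_j, the posterior of each
      # cause-specific rate is Gamma(shape = d_j, rate = T).
      M = 200_000
      lam = rng.gamma(shape=d, scale=1.0 / T, size=(M, 2))
      lam_tot = lam.sum(axis=1)

      # Net probability: failure from cause j by time t if it were the only risk.
      net = 1.0 - np.exp(-lam * t)

      # Crude probability: failure from cause j by time t with both risks acting.
      crude = (lam / lam_tot[:, None]) * (1.0 - np.exp(-lam_tot * t))[:, None]

      for j in range(2):
          print(f"cause {j + 1}: posterior mean net = {net[:, j].mean():.3f}, "
                f"posterior mean crude = {crude[:, j].mean():.3f}")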
