Sample records for cumulative probability functions

  1. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
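The discrepancy between the empirical CDF and the specified CDF that these classical tests measure is easy to compute directly. Below is a minimal self-contained sketch of the Kolmogorov-Smirnov statistic (illustrative code, not the complementary tests proposed in the paper):

```python
import math
import random

def ks_statistic(draws, cdf):
    """Kolmogorov-Smirnov statistic: the largest gap between the empirical
    CDF of the draws and the specified model CDF."""
    xs = sorted(draws)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # The empirical CDF jumps from i/n to (i+1)/n at each sorted draw.
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(1000)]
d = ks_statistic(sample, std_normal_cdf)  # small when the null holds
```

Kuiper's variant replaces the two-sided maximum with the sum of the largest positive and largest negative deviations, which makes the statistic invariant under cyclic shifts of the data.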

  2. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  3. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  4. Resonances in the cumulative reaction probability for a model electronically nonadiabatic reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, J.; Bowman, J.M.

    1996-05-01

    The cumulative reaction probability, flux-flux correlation function, and rate constant are calculated for a model, two-state, electronically nonadiabatic reaction, given by Shin and Light [S. Shin and J. C. Light, J. Chem. Phys. 101, 2836 (1994)]. We apply straightforward generalizations of the flux matrix/absorbing boundary condition approach of Miller and co-workers to obtain these quantities. The upper adiabatic electronic potential supports bound states, and these manifest themselves as "recrossing" resonances in the cumulative reaction probability, at total energies above the barrier to reaction on the lower adiabatic potential. At energies below the barrier, the cumulative reaction probability for the coupled system is shifted to higher energies relative to the one obtained for the ground state potential. This is due to the effect of an additional effective barrier caused by the nuclear kinetic operator acting on the ground state, adiabatic electronic wave function, as discussed earlier by Shin and Light. Calculations are reported for five sets of electronically nonadiabatic coupling parameters. © 1996 American Institute of Physics.

  5. Probability and Statistics in Sensor Performance Modeling

    DTIC Science & Technology

    2010-12-01

    language software program is called Environmental Awareness for Sensor and Emitter Employment. Some important numerical issues in the implementation...Statistical analysis for measuring sensor performance...complementary cumulative distribution function; cdf, cumulative distribution function; DST, decision-support tool; EASEE, Environmental Awareness of

  6. Exact probability distribution function for the volatility of cumulative production

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  7. On Interpreting and Extracting Information from the Cumulative Distribution Function Curve: A New Perspective with Applications

    ERIC Educational Resources Information Center

    Balasooriya, Uditha; Li, Jackie; Low, Chan Kee

    2012-01-01

    For any density function (or probability function), there always corresponds a "cumulative distribution function" (cdf). It is a well-known mathematical fact that the cdf is more general than the density function, in the sense that for a given distribution the former may exist without the existence of the latter. Nevertheless, while the…

  8. CUMPOIS- CUMULATIVE POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The Cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or fewer events (i.e., the cumulative probability) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum, giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was developed in 1988.
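The rescaling strategy described above can be sketched compactly. This is a hypothetical Python reconstruction of the idea, not the original C source; the 1e250 trigger and the power-of-ten scale factor are illustrative choices:

```python
import math

def cumpois(lam, n):
    """P(X <= n) for X ~ Poisson(lam).

    The exp(-lam) factor is kept outside the sum in log form, and the
    partial sum is rescaled whenever it threatens to overflow; the
    accumulated scale is folded back in at the end.
    """
    log_scale = -lam      # the exp(-lam) factor, tracked as a log
    term = 1.0            # lam**0 / 0!
    total = 1.0
    for i in range(1, n + 1):
        term *= lam / i   # lam**i / i! from the previous term
        total += term
        if total > 1e250:  # rescale before the partial sum overflows
            total *= 1e-250
            term *= 1e-250
            log_scale += 250.0 * math.log(10.0)
    return total * math.exp(log_scale)
```

The same routine yields chi-square CDFs with even degrees of freedom via the standard identity P(chi2_{2k} <= x) = 1 - cumpois(x/2, k-1).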

  9. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  10. Cumulative Poisson Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for X (sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.

  11. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  12. Goal-Oriented Probability Density Function Methods for Uncertainty Quantification

    DTIC Science & Technology

    2015-12-11

    approximations or data-driven approaches. We investigated the accuracy of analytical techniques based on Kubo-Van Kampen operator cumulant expansions for...analytical techniques based on Kubo-Van Kampen operator cumulant expansions for Langevin equations driven by fractional Brownian motion and other noises

  13. Simple gain probability functions for large reflector antennas of JPL/NASA

    NASA Technical Reports Server (NTRS)

    Jamnejad, V.

    2003-01-01

    Simple models for the patterns of the Deep Space Network antennas, as well as their cumulative gain probability and probability density functions, are developed. These are needed for the study and evaluation of interference from unwanted sources, such as the emerging terrestrial High Density Fixed Service system, with the Ka-band receiving antenna systems at the Goldstone Station of the Deep Space Network.

  14. Decision making generalized by a cumulative probability weighting function

    NASA Astrophysics Data System (ADS)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already well established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and it is supported by phenomenological models.

  15. On the Use of a Cumulative Distribution as a Utility Function in Educational or Employment Selection.

    DTIC Science & Technology

    1981-02-01

    monotonic increasing function of true ability or performance score. A cumulative probability function is then very convenient for describing one's...possible outcomes such as test scores, grade-point averages or other common outcome variables. Utility is usually a monotonic increasing function of true ...r(θ) is negative for θ < μ and positive for θ > μ, U(θ) is risk-prone for low θ values and risk-averse for high θ values. This property is true for

  16. About the cumulants of periodic signals

    NASA Astrophysics Data System (ADS)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies cumulants of time series. Although these functions originate in probability theory, they are commonly used as features of deterministic signals, and their classical properties are examined in this modified framework. We show that the additivity of cumulants, which is ensured for independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.

  17. Ordinal probability effect measures for group comparisons in multinomial cumulative link models.

    PubMed

    Agresti, Alan; Kateri, Maria

    2017-03-01

    We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/√2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/√2)/[1+exp(β/√2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.
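The probit-link measure can be checked numerically from the latent-variable argument: with independent standard normal latent errors in the two groups, the difference of the errors has standard deviation √2, so the superiority probability is Φ(β/√2). A small sketch (function names are illustrative, not from the paper):

```python
import math

def probit_superiority(beta):
    """Ordinal superiority under a probit link: Phi(beta / sqrt(2)).

    Phi(z) = 0.5 * (1 + erf(z / sqrt(2))), so with z = beta / sqrt(2)
    the argument of erf reduces to beta / 2.
    """
    return 0.5 * (1.0 + math.erf(beta / 2.0))

def loglog_superiority(beta):
    """Ordinal superiority under a log-log link: exp(beta) / (1 + exp(beta))."""
    return math.exp(beta) / (1.0 + math.exp(beta))
```

Both measures equal 1/2 at β = 0 (no group effect) and increase monotonically in β.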

  18. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
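For small cases, the exact distance distribution that the package computes analytically can be cross-checked by brute-force enumeration of all placements. This is an illustrative sketch, not the package's implementation (which scales to large n):

```python
from collections import Counter
from itertools import combinations

def distance_pmf(n, s):
    """Exact pmf of the distance between consecutive successes when s
    successes occupy s of n positions uniformly at random (enumeration)."""
    counts = Counter()
    total = 0
    for pos in combinations(range(n), s):
        for a, b in zip(pos, pos[1:]):
            counts[b - a] += 1
            total += 1
    return {d: c / total for d, c in sorted(counts.items())}
```

For n = 3 and s = 2 the three placements give consecutive distances 1, 2, 1, so the pmf is {1: 2/3, 2: 1/3}.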

  19. Comparing the ISO-recommended and the cumulative data-reduction algorithms in S-on-1 laser damage test by a reverse approach method

    NASA Astrophysics Data System (ADS)

    Zorila, Alexandru; Stratan, Aurel; Nemes, George

    2018-01-01

    We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.

  20. Meta-analysis for aggregated survival data with competing risks: a parametric approach using cumulative incidence functions.

    PubMed

    Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido

    2016-09-01

    Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
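Under the constant cause-specific hazards assumed for data retrieval, the CIF has a closed form: CIF_j(t) = (a_j / a)(1 - exp(-a t)), where a is the all-cause hazard (the sum of the cause-specific hazards). The sketch below uses this standard identity with illustrative names; it is not the authors' pooling procedure:

```python
import math

def cif(t, alphas, j):
    """Cumulative incidence of cause j at time t under constant
    cause-specific hazards `alphas`; the all-cause hazard is their sum."""
    a = sum(alphas)
    return (alphas[j] / a) * (1.0 - math.exp(-a * t))
```

The CIFs of all causes sum to the all-cause failure probability 1 - exp(-a t), and a CIF ratio between treatment arms follows by evaluating cif for each arm's hazards.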

  1. A Seakeeping Performance and Affordability Tradeoff Study for the Coast Guard Offshore Patrol Cutter

    DTIC Science & Technology

    2016-06-01

    Index Polar Plot for Sea State 4, All Headings Are Relative to the Wave Motion and Velocity is Given in Meters per Second...Figure 15. Probability and Cumulative Density Functions of Annual Sea State Occurrences in the Open Ocean, North Pacific...criteria at a given sea state. Probability distribution functions are available that describe the likelihood that an operational area will experience

  2. Derivation of Hunt equation for suspension distribution using Shannon entropy theory

    NASA Astrophysics Data System (ADS)

    Kundu, Snehasis

    2017-12-01

    In this study, the Hunt equation for computing suspension concentration in sediment-laden flows is derived using Shannon entropy theory. Considering the inverse of the void ratio as a random variable and using the principle of maximum entropy, the probability density function and cumulative distribution function of suspension concentration are derived. A new and more general cumulative distribution function for the flow domain is proposed, which includes several other specific models of the CDF reported in the literature. This general form of the cumulative distribution function also helps to derive the Rouse equation. The entropy-based approach helps to estimate model parameters using suspension data of sediment concentration, which shows the advantage of using entropy theory. Finally, model parameters in the entropy-based model are also expressed as functions of the Rouse number to establish a link between the parameters of the deterministic and probabilistic approaches.

  3. On the optimal identification of tag sets in time-constrained RFID configurations.

    PubMed

    Vales-Alonso, Javier; Bueno-Delgado, María Victoria; Egea-López, Esteban; Alcaraz, Juan José; Pérez-Mañogil, Juan Manuel

    2011-01-01

    In Radio Frequency Identification facilities, the identification delay of a set of tags is mainly caused by the random access nature of the reading protocol, yielding a random identification time for the set of tags. In this paper, the cumulative distribution function of the identification time is evaluated using a discrete time Markov chain for single-set time-constrained passive RFID systems, namely those where a single group of tags is assumed to be in the reading area, and only for a bounded time (the sojourn time) before leaving. In these scenarios some tags in a set may leave the reader coverage area unidentified. The probability of this event is obtained from the cumulative distribution function of the identification time as a function of the sojourn time. This result provides a suitable criterion to minimize the probability of losing tags. In addition, an identification strategy based on splitting the set of tags into smaller subsets is also considered. Results demonstrate that there are optimal splitting configurations that reduce the overall identification time while keeping the same probability of losing tags.
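The CDF of an identification (absorption) time in a discrete time Markov chain can be computed generically by propagating the state distribution and reading off the absorbing-state mass. The toy chain below is illustrative, not the paper's RFID protocol model:

```python
def absorption_cdf(P, start, absorbing, t_max):
    """P(T <= t) for t = 1..t_max, where T is the first-passage time into
    the absorbing state of a discrete-time Markov chain with row-stochastic
    transition matrix P."""
    n = len(P)
    v = list(start)
    cdf = []
    for _ in range(t_max):
        # One step: v <- v P; the mass in the absorbing state is P(T <= t).
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
        cdf.append(v[absorbing])
    return cdf

# Toy example: a tag is identified in each slot with probability 0.5.
P = [[0.5, 0.5],
     [0.0, 1.0]]  # state 1 (identified) is absorbing
cdf = absorption_cdf(P, [1.0, 0.0], 1, 3)  # [0.5, 0.75, 0.875]
```

The geometric toy case recovers P(T <= t) = 1 - 0.5**t; a real reader model would use many transient states for the tag-counting protocol.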

  4. Atmospheric Teleconnections From Cumulants

    NASA Astrophysics Data System (ADS)

    Sabou, F.; Kaspi, Y.; Marston, B.; Schneider, T.

    2011-12-01

    Multi-point cumulants of fields such as vorticity provide a way to visualize atmospheric teleconnections, complementing other approaches such as the method of empirical orthogonal functions (EOFs). We calculate equal-time two-point cumulants of the vorticity from NCEP reanalysis data during the period 1980 -- 2010 and from direct numerical simulation (DNS) using an idealized dry general circulation model (GCM) (Schneider and Walker, 2006). Extratropical correlations seen in the NCEP data are qualitatively reproduced by the model. Three- and four-point cumulants accumulated from DNS quantify departures of the probability distribution function from a normal distribution, shedding light on the efficacy of direct statistical simulation (DSS) of atmosphere dynamics by cumulant expansions (Marston, Conover, and Schneider, 2008; Marston 2011). Lagged-time two-point cumulants between temperature gradients and eddy kinetic energy (EKE), accumulated by DNS of an idealized moist aquaplanet GCM (O'Gorman and Schneider, 2008), reveal dynamics of storm tracks. Regions of enhanced baroclinicity (as found along the eastern boundary of continents) lead to a local enhancement of EKE and a suppression of EKE further downstream as the storm track self-destructs (Kaspi and Schneider, 2011).

  5. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.

  6. Cumulative detection probabilities and range accuracy of a pulsed Geiger-mode avalanche photodiode laser ranging system

    NASA Astrophysics Data System (ADS)

    Luo, Hanjun; Ouyang, Zhengbiao; Liu, Qiang; Chen, Zhiliang; Lu, Hualan

    2017-10-01

    Cumulative pulses detection with an appropriate cumulative pulse number and threshold can improve the detection performance of a pulsed laser ranging system with a GM-APD. In this paper, based on Poisson statistics and the multi-pulse cumulative process, the cumulative detection probabilities and the factors that influence them are investigated. With the normalized probability distribution of each time bin, a theoretical model of the range accuracy and precision is established, and the factors limiting the range accuracy and precision are discussed. The results show that cumulative pulses detection can produce a higher target detection probability and a lower false alarm probability. However, for a heavy noise level and extremely weak echo intensity, the false alarm suppression performance of cumulative pulses detection deteriorates quickly. The range accuracy and precision are another important parameter evaluating the detection performance. Echo intensity and pulse width are the main factors influencing the range accuracy and precision: higher accuracy and precision are obtained with stronger echo intensity and narrower echo pulse width. For a 5-ns echo pulse width, when the echo intensity is larger than 10, a range accuracy and precision better than 7.5 cm can be achieved.
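The Poisson/binomial skeleton of cumulative pulses detection can be sketched as follows. The names and the simple at-least-one-electron firing model are assumptions for illustration; the paper's model additionally accounts for the time-bin structure of the returns:

```python
import math

def fire_probability(mean_electrons):
    """A Geiger-mode APD fires if at least one primary electron is
    generated; with Poisson statistics this is 1 - exp(-mean)."""
    return 1.0 - math.exp(-mean_electrons)

def cumulative_detection(p_fire, n_pulses, threshold):
    """P(at least `threshold` firings in `n_pulses` independent pulses):
    the binomial tail used as the cumulative detection criterion."""
    return sum(math.comb(n_pulses, i) * p_fire**i * (1.0 - p_fire)**(n_pulses - i)
               for i in range(threshold, n_pulses + 1))
```

Raising the threshold suppresses false alarms (noise firings are rare in any single pulse), while accumulating more pulses raises the target detection probability, which is the trade-off described above.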

  7. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Evidence of scaling of void probability in nucleus-nucleus interactions at few GeV energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Dipak; Biswas, Biswanath; Deb, Argha

    1997-11-01

    The rapidity gap probability in the ²⁴Mg-AgBr interaction at 4.5 GeV/c per nucleon has been studied in detail. The data reveal scaling behavior of the void probability in the central rapidity domain, which confirms the validity of the linked-pair approximation for the N-particle cumulant correlation functions. This scaling behavior appears to be similar to the void probability in the Perseus-Pisces supercluster region of galaxies. © 1997 The American Physical Society

  9. Decision analysis with cumulative prospect theory.

    PubMed

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
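The core operation, applying a weighting function to cumulative rather than individual probabilities, can be sketched for outcomes ranked from best to worst (a generic cumulative-prospect-theory sketch, not the authors' Markov model):

```python
def cpt_decision_weights(probs, w):
    """Decision weights for outcome probabilities ranked best to worst:
    pi_i = w(p_1 + ... + p_i) - w(p_1 + ... + p_{i-1})."""
    weights = []
    cum = 0.0
    prev = 0.0  # w(0) = 0 for any weighting function
    for p in probs:
        cum += p
        cur = w(cum)
        weights.append(cur - prev)
        prev = cur
    return weights
```

With the identity weighting the decision weights reduce to the original probabilities; any nonlinear w redistributes weight across the ranked outcomes while the weights still sum to w(1) = 1.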

  10. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    PubMed Central

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
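Two of the candidate forms have simple closed expressions, sketched here with their conventional parameterizations (parameter names are standard in the literature, not taken from this paper):

```python
import math

def prelec2(p, alpha, beta):
    """Prelec's two-parameter weighting function:
    w(p) = exp(-beta * (-ln p) ** alpha), for 0 < p <= 1."""
    return math.exp(-beta * (-math.log(p)) ** alpha)

def linear_in_log_odds(p, gamma, delta):
    """Linear-in-log-odds weighting function:
    w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma)."""
    num = delta * p ** gamma
    return num / (num + (1.0 - p) ** gamma)
```

Both reduce to the identity when their parameters equal 1, and with curvature parameters below 1 both produce the inverse-S shape (overweighting small probabilities) that makes the forms hard to tell apart empirically.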

  11. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are expressed in terms of cumulative probability distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  12. Large-deviation theory for diluted Wishart random matrices

    NASA Astrophysics Data System (ADS)

    Castillo, Isaac Pérez; Metz, Fernando L.

    2018-03-01

    Wishart random matrices with a sparse or diluted structure are ubiquitous in the processing of large datasets, with applications in physics, biology, and economy. In this work, we develop a theory for the eigenvalue fluctuations of diluted Wishart random matrices based on the replica approach of disordered systems. We derive an analytical expression for the cumulant generating function of the number of eigenvalues I_N(x) smaller than x ∈ R^+, from which all cumulants of I_N(x) and the rate function Ψ_x(k) controlling its large-deviation probability Prob[I_N(x) = kN] ≍ e^{-N Ψ_x(k)} follow. Explicit results for the mean value and the variance of I_N(x), its rate function, and its third cumulant are discussed and thoroughly compared to numerical diagonalization, showing very good agreement. The present work establishes the theoretical framework put forward in a recent letter [Phys. Rev. Lett. 117, 104101 (2016), 10.1103/PhysRevLett.117.104101] as an exact and compelling approach to deal with eigenvalue fluctuations of sparse random matrices.

  13. Probability of failure prediction for step-stress fatigue under sine or random stress

    NASA Technical Reports Server (NTRS)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.

  14. Computing approximate random Delta v magnitude probability densities. [for spacecraft trajectory correction

    NASA Technical Reports Server (NTRS)

    Chadwick, C.

    1984-01-01

    This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta v vector. The TCM Delta v vector is modeled as a three-component Cartesian vector, each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta v, (2) points of the probability density function of the magnitude of Delta v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta v. The approximations are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
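The algorithm itself is not reproduced in the abstract; a minimal Monte Carlo sketch of the quantities it approximates (function names, sample size, and seed are assumptions of this sketch) might look like:

```python
import bisect
import math
import random
import statistics

def delta_v_magnitude_stats(sigmas, n_samples=100_000, seed=1):
    # Monte Carlo estimates for |dv| when dv has independent zero-mean Gaussian
    # components with standard deviations sigmas = (sx, sy, sz).
    # Returns (mean, std, sorted magnitudes for empirical CDF lookups).
    rng = random.Random(seed)
    mags = sorted(
        math.sqrt(sum(rng.gauss(0.0, s) ** 2 for s in sigmas))
        for _ in range(n_samples)
    )
    return statistics.fmean(mags), statistics.stdev(mags), mags

def empirical_cdf(mags, x):
    # P(|dv| <= x) from the sorted magnitude sample.
    return bisect.bisect_right(mags, x) / len(mags)
```

For equal standard deviations the magnitude follows a Maxwell distribution, which gives a convenient sanity check on the estimates.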

  15. A review of contemporary methods for the presentation of scientific uncertainty.

    PubMed

    Makinson, K A; Hamby, D M; Edwards, J A

    2012-12-01

    Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.

  16. Cumulative risk of false positive test in relation to breast symptoms in mammography screening: a historical prospective cohort study.

    PubMed

    Singh, Deependra; Pitkäniemi, Janne; Malila, Nea; Anttila, Ahti

    2016-09-01

    Mammography has been found effective as the primary screening test for breast cancer. We estimated the cumulative probability of false positive screening test results with respect to symptom history reported at screen. A historical prospective cohort study was done using individual screening data from 413,611 women aged 50-69 years with 2,627,256 invitations for mammography screening between 1992 and 2012 in Finland. Symptoms (lump, retraction, and secretion) were reported at 56,805 visits, and 48,873 visits resulted in a false positive mammography result. Generalized linear models were used to estimate the probability of at least one false positive test and true positive at screening visits. The estimates were compared between women with and without a symptom history. The estimated cumulative probabilities were 18 and 6 % for false positive and true positive results, respectively. In women with a history of a lump, the cumulative probabilities of false positive test and true positive were 45 and 16 %, respectively, compared to 17 and 5 % with no reported lump. In women with a history of any given symptom, the cumulative probabilities of false positive test and true positive were 38 and 13 %, respectively. Likewise, women with a history of a 'lump and retraction' had the cumulative false positive probability of 56 %. The study showed higher cumulative risk of false positive tests and more cancers detected in women who reported symptoms compared to women who did not report symptoms at screen. The risk varies substantially, depending on symptom types and characteristics. Information on breast symptoms influences the balance of absolute benefits and harms of screening.
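The study's estimates come from generalized linear models fitted to screening histories; the crudest analogue, assuming a constant per-visit false-positive probability and independence between visits (assumptions the study does not make), is the complement rule:

```python
def cumulative_false_positive_prob(p_per_screen, n_screens):
    # P(at least one false positive over n screens), assuming a constant
    # per-screen probability and independence between screens.
    return 1.0 - (1.0 - p_per_screen) ** n_screens
```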

  17. The Cumulative Probability of Arrest by Age 28 Years in the United States by Disability Status, Race/Ethnicity, and Gender.

    PubMed

    McCauley, Erin J

    2017-12-01

    To estimate the cumulative probability (c) of arrest by age 28 years in the United States by disability status, race/ethnicity, and gender. I estimated cumulative probabilities through birth cohort life tables with data from the National Longitudinal Survey of Youth, 1997. Estimates demonstrated that those with disabilities have a higher cumulative probability of arrest (c = 42.65) than those without (c = 29.68). The risk was disproportionately spread across races/ethnicities, with Blacks with disabilities experiencing the highest cumulative probability of arrest (c = 55.17) and Whites without disabilities experiencing the lowest (c = 27.55). Racial/ethnic differences existed by gender as well. There was a similar distribution of disability types across race/ethnicity, suggesting that the racial/ethnic differences in arrest may stem from racial/ethnic inequalities as opposed to differential distribution of disability types. The experience of arrest for those with disabilities was higher than expected. Police officers should understand how disabilities may affect compliance and other behaviors, and likewise how implicit bias and structural racism may affect reactions and actions of officers and the systems they work within in ways that create inequities.

  18. Detection of non-Gaussian fluctuations in a quantum point contact.

    PubMed

    Gershon, G; Bomze, Yu; Sukhorukov, E V; Reznikov, M

    2008-07-04

    An experimental study of current fluctuations through a tunable transmission barrier, a quantum point contact, is reported. We measure the probability distribution function of transmitted charge with precision sufficient to extract the first three cumulants. To obtain the intrinsic quantities, corresponding to voltage-biased barrier, we employ a procedure that accounts for the response of the external circuit and the amplifier. The third cumulant, obtained with a high precision, is found to agree with the prediction for the statistics of transport in the non-Poissonian regime.
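For the first three orders, cumulants coincide with the mean, variance, and third central moment, so a naive sample estimate is straightforward; the helper below is an illustration, not the paper's circuit-corrected procedure:

```python
def first_three_cumulants(xs):
    # Naive sample estimates: kappa1 = mean, kappa2 = variance (biased),
    # kappa3 = third central moment (biased).
    n = len(xs)
    k1 = sum(xs) / n
    k2 = sum((x - k1) ** 2 for x in xs) / n
    k3 = sum((x - k1) ** 3 for x in xs) / n
    return k1, k2, k3
```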

  19. Detection of Non-Gaussian Fluctuations in a Quantum Point Contact

    NASA Astrophysics Data System (ADS)

    Gershon, G.; Bomze, Yu.; Sukhorukov, E. V.; Reznikov, M.

    2008-07-01

    An experimental study of current fluctuations through a tunable transmission barrier, a quantum point contact, is reported. We measure the probability distribution function of transmitted charge with precision sufficient to extract the first three cumulants. To obtain the intrinsic quantities, corresponding to voltage-biased barrier, we employ a procedure that accounts for the response of the external circuit and the amplifier. The third cumulant, obtained with a high precision, is found to agree with the prediction for the statistics of transport in the non-Poissonian regime.

  20. Functional aging in pilots : an examination of a mathematical model based on medical data on general aviation pilots.

    DOT National Transportation Integrated Search

    1982-06-01

    The purpose of this study was to apply mathematical procedures to the Federal Aviation Administration (FAA) pilot medical data to examine the feasibility of devising a linear numbering system such that (1) the cumulative probability distribution func...

  1. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method, subset simulation, is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
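Subset simulation itself is beyond a short sketch, but the interval (random-set) view of the failure-probability bounds can be illustrated with plain Monte Carlo; everything here, including the monotonicity assumption on the limit state, is a simplification of the paper's method:

```python
import random

def failure_prob_bounds(limit_state, sample_interval, n=20_000, seed=0):
    # Plain Monte Carlo bounds on P(g(x) <= 0) when each input draw is an
    # interval [lo, hi] (a one-dimensional random set). Assumes g is monotone
    # on each interval, so its extremes occur at the endpoints.
    rng = random.Random(seed)
    n_low = n_up = 0
    for _ in range(n):
        lo, hi = sample_interval(rng)
        g_lo, g_hi = limit_state(lo), limit_state(hi)
        if max(g_lo, g_hi) <= 0.0:
            n_low += 1  # whole interval fails: counts toward the lower bound
        if min(g_lo, g_hi) <= 0.0:
            n_up += 1   # some point of the interval fails: counts toward the upper bound
    return n_low / n, n_up / n
```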

  2. Double inverse-weighted estimation of cumulative treatment effects under nonproportional hazards and dependent censoring.

    PubMed

    Schaubel, Douglas E; Wei, Guanghui

    2011-03-01

    In medical studies of time-to-event data, nonproportional hazards and dependent censoring are very common issues when estimating the treatment effect. A traditional method for dealing with time-dependent treatment effects is to model the time-dependence parametrically. Limitations of this approach include the difficulty to verify the correctness of the specified functional form and the fact that, in the presence of a treatment effect that varies over time, investigators are usually interested in the cumulative as opposed to instantaneous treatment effect. In many applications, censoring time is not independent of event time. Therefore, we propose methods for estimating the cumulative treatment effect in the presence of nonproportional hazards and dependent censoring. Three measures are proposed, including the ratio of cumulative hazards, relative risk, and difference in restricted mean lifetime. For each measure, we propose a double inverse-weighted estimator, constructed by first using inverse probability of treatment weighting (IPTW) to balance the treatment-specific covariate distributions, then using inverse probability of censoring weighting (IPCW) to overcome the dependent censoring. The proposed estimators are shown to be consistent and asymptotically normal. We study their finite-sample properties through simulation. The proposed methods are used to compare kidney wait-list mortality by race. © 2010, The International Biometric Society.

  3. Trivariate characteristics of intensity fluctuations for heavily saturated optical systems.

    PubMed

    Das, Biman; Drake, Eli; Jack, John

    2004-02-01

    Trivariate cumulants of intensity fluctuations have been computed starting from a trivariate intensity probability distribution function, which rests on the assumption that the variation of intensity has a maximum entropy distribution with the constraint that the total intensity is constant. The assumption holds for optical systems such as a thin, long, mirrorless gas laser amplifier where under heavy gain saturation the total output approaches a constant intensity, although intensity of any mode fluctuates rapidly over the average intensity. The relations between trivariate cumulants and central moments that were needed for the computation of trivariate cumulants were derived. The results of the computation show that the cumulants have characteristic values that depend on the number of interacting modes in the system. The cumulant values approach zero when the number of modes is infinite, as expected. The results will be useful for comparison with the experimental trivariate statistics of heavily saturated optical systems such as the output from a thin, long, bidirectional gas laser amplifier.

  4. Performance of mixed RF/FSO systems in exponentiated Weibull distributed channels

    NASA Astrophysics Data System (ADS)

    Zhao, Jing; Zhao, Shang-Hong; Zhao, Wei-Hu; Liu, Yun; Li, Xuan

    2017-12-01

    This paper presents the performance of an asymmetric mixed radio frequency (RF)/free-space optical (FSO) system with the amplify-and-forward relaying scheme. The RF link undergoes Nakagami-m fading, and the Exponentiated Weibull distribution is adopted for the FSO component. Mathematical formulas for the cumulative distribution function (CDF), probability density function (PDF), and moment generating function (MGF) of the equivalent signal-to-noise ratio (SNR) are derived. From the end-to-end statistical characteristics, new analytical expressions for the outage probability are obtained. For various modulation techniques, we derive the average bit error rate (BER) in terms of the Meijer G-function. Evaluations and simulations of the system performance are provided, and the aperture-averaging effect is discussed as well.

  5. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative Poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is, the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n), it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user-specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
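The C source is not reproduced here, but the core idea, Newton iteration on the cumulative Poisson sum, can be sketched in Python; the closed-form derivative step and the general-n handling are assumptions beyond the n=0 initial condition the abstract describes:

```python
import math

def poisson_cdf(lam, n):
    # P(N <= n) for N ~ Poisson(lam), summed term by term.
    term = total = math.exp(-lam)
    for k in range(1, n + 1):
        term *= lam / k
        total += term
    return total

def poisson_lambda(n, q, lam0=1.0, eps=1e-12, max_iter=200):
    # Newton's method for the lam solving P(N <= n) = q, using
    # d/dlam P(N <= n) = -exp(-lam) * lam**n / n!.
    lam = lam0
    for _ in range(max_iter):
        f = poisson_cdf(lam, n) - q
        if abs(f) < eps:
            break
        lam -= f / (-math.exp(-lam) * lam ** n / math.factorial(n))
        lam = max(lam, 1e-12)  # keep the iterate in the valid domain
    return lam
```

For n = 0 this reduces to inverting q = exp(-lambda), which is the initial-value condition mentioned above.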

  6. Probability of stress-corrosion fracture under random loading

    NASA Technical Reports Server (NTRS)

    Yang, J. N.

    1974-01-01

    The mathematical formulation is based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under stationary random loadings, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.
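The extended damage law itself is not given in the abstract; the classical Palmgren-Miner hypothesis it builds on is just a sum of cycle ratios:

```python
def miner_damage(cycles_applied, cycles_to_failure):
    # Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i);
    # failure is predicted when D >= 1.
    return sum(n / N for n, N in zip(cycles_applied, cycles_to_failure))
```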

  7. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.

  8. Deep Learning Role in Early Diagnosis of Prostate Cancer

    PubMed Central

    Reda, Islam; Khalil, Ashraf; Elmogy, Mohammed; Abou El-Fetouh, Ahmed; Shalaby, Ahmed; Abou El-Ghar, Mohamed; Elmaghraby, Adel; Ghazal, Mohammed; El-Baz, Ayman

    2018-01-01

    The objective of this work is to develop a computer-aided diagnostic system for early diagnosis of prostate cancer. The presented system integrates both clinical biomarkers (prostate-specific antigen) and extracted features from diffusion-weighted magnetic resonance imaging collected at multiple b values. The presented system performs 3 major processing steps. First, prostate delineation using a hybrid approach that combines a level-set model with nonnegative matrix factorization. Second, estimation and normalization of diffusion parameters, which are the apparent diffusion coefficients of the delineated prostate volumes at different b values followed by refinement of those apparent diffusion coefficients using a generalized Gaussian Markov random field model. Then, construction of the cumulative distribution functions of the processed apparent diffusion coefficients at multiple b values. In parallel, a K-nearest neighbor classifier is employed to transform the prostate-specific antigen results into diagnostic probabilities. Finally, those prostate-specific antigen–based probabilities are integrated with the initial diagnostic probabilities obtained using stacked nonnegativity constraint sparse autoencoders that employ apparent diffusion coefficient–cumulative distribution functions for better diagnostic accuracy. Experiments conducted on 18 diffusion-weighted magnetic resonance imaging data sets achieved 94.4% diagnosis accuracy (sensitivity = 88.9% and specificity = 100%), which indicates the promising results of the presented computer-aided diagnostic system. PMID:29804518

  9. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
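Both advocated statistics are simple functionals of the sorted absolute errors; the helper below is an illustration (the names and the quantile convention are choices of this sketch, not the paper's):

```python
import bisect
import math

def ecdf_error_stats(errors, threshold, confidence=0.95):
    # (1) Empirical probability that a new calculation has absolute error
    #     below `threshold`.
    # (2) Error magnitude not exceeded at the given confidence level
    #     (an empirical quantile of |error|).
    abs_err = sorted(abs(e) for e in errors)
    n = len(abs_err)
    p_below = bisect.bisect_left(abs_err, threshold) / n
    q_index = min(n - 1, math.ceil(confidence * n) - 1)
    return p_below, abs_err[q_index]
```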

  10. Rapidity window dependences of higher order cumulants and diffusion master equation

    NASA Astrophysics Data System (ADS)

    Kitazawa, Masakiyo

    2015-10-01

    We study the rapidity window dependences of higher order cumulants of conserved charges observed in relativistic heavy ion collisions. The time evolution and the rapidity window dependence of the non-Gaussian fluctuations are described by the diffusion master equation. Analytic formulas for the time evolution of cumulants in a rapidity window are obtained for arbitrary initial conditions. We argue that the rapidity window dependences of the non-Gaussian cumulants have characteristic structures reflecting the non-equilibrium property of fluctuations, which can be observed in relativistic heavy ion collisions with the present detectors. It is argued that a variety of information on the thermal and transport properties of the hot medium can be revealed experimentally by the study of the rapidity window dependences, especially by the combined use of the higher order cumulants. Formulas of higher order cumulants for a probability distribution composed of sub-probabilities, which are useful for various studies of non-Gaussian cumulants, are also presented.

  11. Melanoma-specific mortality and competing mortality in patients with non-metastatic malignant melanoma: a population-based analysis.

    PubMed

    Shen, Weidong; Sakamoto, Naoko; Yang, Limin

    2016-07-07

    The objectives of this study were to evaluate and model the probability of melanoma-specific death and competing causes of death for patients with melanoma by competing risk analysis, and to build competing risk nomograms to provide individualized and accurate predictive tools. Melanoma data were obtained from the Surveillance Epidemiology and End Results program. All patients diagnosed with primary non-metastatic melanoma during the years 2004-2007 were potentially eligible for inclusion. The cumulative incidence function (CIF) was used to describe the probability of melanoma mortality and competing risk mortality. We used Gray's test to compare differences in CIF between groups. The proportional subdistribution hazard approach by Fine and Gray was used to model CIF. We built competing risk nomograms based on the models that we developed. The 5-year cumulative incidence of melanoma death was 7.1 %, and the cumulative incidence of other causes of death was 7.4 %. We identified that variables associated with an elevated probability of melanoma-specific mortality included older age, male sex, thick melanoma, ulcerated cancer, and positive lymph nodes. The nomograms were well calibrated. C-indexes were 0.85 and 0.83 for nomograms predicting the probability of melanoma mortality and competing risk mortality, which suggests good discriminative ability. This large study cohort enabled us to build a reliable competing risk model and nomogram for predicting melanoma prognosis. Model performance proved to be good. This individualized predictive tool can be used in clinical practice to help treatment-related decision making.
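A bare-bones nonparametric CIF computation (an Aalen-Johansen-style estimator; the event code 0 meaning censored is an assumption of this sketch) can be written as:

```python
def cumulative_incidence(times, events, cause=1):
    # Nonparametric cumulative incidence for one cause under competing risks.
    # events: 0 = censored, positive integers = cause codes.
    # Returns (time, CIF) points at each distinct observed time.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall survival just before the current time
    cif = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = removed = 0
        while i < len(data) and data[i][0] == t:
            ev = data[i][1]
            d_cause += (ev == cause)
            d_all += (ev != 0)
            removed += 1
            i += 1
        cif += surv * d_cause / n_at_risk
        surv *= 1.0 - d_all / n_at_risk
        n_at_risk -= removed
        curve.append((t, cif))
    return curve
```

With no censoring, the cause-specific CIFs at the last time point sum to one, which mirrors the decomposition of overall mortality into melanoma-specific and competing deaths above.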

  12. Vacuum quantum stress tensor fluctuations: A diagonalization approach

    NASA Astrophysics Data System (ADS)

    Schiappacasse, Enrico D.; Fewster, Christopher J.; Ford, L. H.

    2018-01-01

    Large vacuum fluctuations of a quantum stress tensor can be described by the asymptotic behavior of its probability distribution. Here we focus on stress tensor operators which have been averaged with a sampling function in time. The Minkowski vacuum state is not an eigenstate of the time-averaged operator, but can be expanded in terms of its eigenstates. We calculate the probability distribution and the cumulative probability distribution for obtaining a given value in a measurement of the time-averaged operator taken in the vacuum state. In these calculations, we study a specific operator that contributes to the stress-energy tensor of a massless scalar field in Minkowski spacetime, namely, the normal ordered square of the time derivative of the field. We analyze the rate of decrease of the tail of the probability distribution for different temporal sampling functions, such as compactly supported functions and the Lorentzian function. We find that the tails decrease relatively slowly, as exponentials of fractional powers, in agreement with previous work using the moments of the distribution. Our results lend additional support to the conclusion that large vacuum stress tensor fluctuations are more probable than large thermal fluctuations, and may have observable effects.

  13. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k components operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
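The reliability quantity CUMBIN computes reduces to a binomial upper-tail sum; a direct Python sketch (not the program's C source) is:

```python
import math

def k_out_of_n_reliability(n, k, p):
    # P(at least k of n independent components operate), each with
    # reliability p: the binomial upper-tail sum.
    return sum(math.comb(n, j) * p ** j * (1.0 - p) ** (n - j)
               for j in range(k, n + 1))
```

For k = 1 this reduces to 1 - (1 - p)^n, and for k = n to p^n.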

  14. Modeling pore corrosion in normally open gold- plated copper connectors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battaile, Corbett Chandler; Moffat, Harry K.; Sun, Amy Cha-Tien

    2008-09-01

    The goal of this study is to model the electrical response of gold plated copper electrical contacts exposed to a mixed flowing gas stream consisting of air containing 10 ppb H{sub 2}S at 30 C and a relative humidity of 70%. This environment accelerates the attack normally observed in a light industrial environment (essentially a simplified version of the Battelle Class 2 environment). Corrosion rates were quantified by measuring the corrosion site density, size distribution, and the macroscopic electrical resistance of the aged surface as a function of exposure time. A pore corrosion numerical model was used to predict both the growth of copper sulfide corrosion product which blooms through defects in the gold layer and the resulting electrical contact resistance of the aged surface. Assumptions about the distribution of defects in the noble metal plating and the mechanism for how corrosion blooms affect electrical contact resistance were needed to complete the numerical model. Comparisons are made to the experimentally observed number density of corrosion sites, the size distribution of corrosion product blooms, and the cumulative probability distribution of the electrical contact resistance. Experimentally, the bloom site density increases as a function of time, whereas the bloom size distribution remains relatively independent of time. These two effects are included in the numerical model by adding a corrosion initiation probability proportional to the surface area along with a probability for bloom-growth extinction proportional to the corrosion product bloom volume. The cumulative probability distribution of electrical resistance becomes skewed as exposure time increases. While the electrical contact resistance increases as a function of time for a fraction of the bloom population, the median value remains relatively unchanged. In order to model this behavior, the resistance calculated for large blooms has been weighted more heavily.

  15. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  16. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  17. Probability of stress-corrosion fracture under random loading.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

    A method is developed for predicting the probability of stress-corrosion fracture of structures under random loadings. The formulation is based on the cumulative damage hypothesis and the experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and the variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy. It is shown that, under stationary random loadings, the standard deviation of the cumulative damage increases in proportion to the square root of time, while the coefficient of variation (dispersion) decreases in inverse proportion to the square root of time. Numerical examples are worked out to illustrate the general results.
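
    The square-root-of-time scaling can be checked with a toy Monte Carlo in which per-cycle damage increments are i.i.d. (a sketch only; the exponential increment distribution is an arbitrary stand-in, not the paper's stress-corrosion model):

```python
import random
import statistics

random.seed(4)

def cumulative_damage(n_cycles):
    """Sum of i.i.d. per-cycle damage increments (toy stationary loading)."""
    return sum(random.expovariate(1.0) for _ in range(n_cycles))

runs = 2000
d100 = [cumulative_damage(100) for _ in range(runs)]
d400 = [cumulative_damage(400) for _ in range(runs)]

# Standard deviation grows like sqrt(t): quadrupling t doubles the sd.
sd_ratio = statistics.stdev(d400) / statistics.stdev(d100)

# Coefficient of variation shrinks like 1/sqrt(t).
cv100 = statistics.stdev(d100) / statistics.mean(d100)
cv400 = statistics.stdev(d400) / statistics.mean(d400)
```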

  18. On Connected Diagrams and Cumulants of Erdős-Rényi Matrix Models

    NASA Astrophysics Data System (ADS)

    Khorunzhiy, O.

    2008-08-01

    Considering the adjacency matrices of n-vertex graphs and the related graph Laplacians, we introduce two families of discrete matrix models, both constructed with the help of the Erdős-Rényi ensemble of random graphs. The corresponding matrix sums represent the characteristic functions of the average number of walks and closed walks over the random graph. These sums can be considered as discrete analogues of the matrix integrals of random matrix theory. We study the diagram structure of the cumulant expansions of the logarithms of these matrix sums and analyze the limiting expressions as n → ∞ in the cases of constant and vanishing edge probabilities.

  19. Higher-order cumulants and spectral kurtosis for early detection of subterranean termites

    NASA Astrophysics Data System (ADS)

    de la Rosa, Juan José González; Moreno Muñoz, Antonio

    2008-02-01

    This paper deals with termite detection in unfavorable SNR scenarios via signal processing using higher-order statistics. The results could be extrapolated to all impulse-like insect emissions; the application is non-destructive termite detection. Fourth-order cumulants in the time and frequency domains enhance the detection and complete the characterization of termite emissions, which are non-Gaussian in essence. Sliding higher-order cumulants pinpoint distinctive time instances, complementing the sliding variance, which reveals only power excesses in the signal, even for low-amplitude impulses. The spectral kurtosis reveals the non-Gaussian characteristics (the peakedness of the probability density function) associated with these non-stationary measurements, especially in the near-ultrasound frequency band. Well-contrasted estimators have been used to compute the higher-order statistics. The findings are illustrated via graphical examples.

  20. Cumulative probability of neodymium: YAG laser posterior capsulotomy after phacoemulsification.

    PubMed

    Ando, Hiroshi; Ando, Nobuyo; Oshika, Tetsuro

    2003-11-01

    To retrospectively analyze the cumulative probability of neodymium:YAG (Nd:YAG) laser posterior capsulotomy after phacoemulsification and to evaluate the risk factors. Ando Eye Clinic, Kanagawa, Japan. In 3997 eyes that had phacoemulsification with an intact continuous curvilinear capsulorhexis, the cumulative probability of posterior capsulotomy was computed by Kaplan-Meier survival analysis and risk factors were analyzed using the Cox proportional hazards regression model. The variables tested were sex; age; type of cataract; preoperative best corrected visual acuity (BCVA); presence of diabetes mellitus, diabetic retinopathy, or retinitis pigmentosa; type of intraocular lens (IOL); and the year the operation was performed. The IOLs were categorized as 3-piece poly(methyl methacrylate) (PMMA), 1-piece PMMA, 3-piece silicone, and acrylic foldable. The cumulative probability of capsulotomy after cataract surgery was 1.95%, 18.50%, and 32.70% at 1, 3, and 5 years, respectively. Positive risk factors included a better preoperative BCVA (P =.0005; risk ratio [RR], 1.7; 95% confidence interval [CI], 1.3-2.5) and the presence of retinitis pigmentosa (P<.0001; RR, 6.6; 95% CI, 3.7-11.6). Women had a significantly greater probability of Nd:YAG laser posterior capsulotomy (P =.016; RR, 1.4; 95% CI, 1.1-1.8). The type of IOL was significantly related to the probability of Nd:YAG laser capsulotomy, with the foldable acrylic IOL having a significantly lower probability of capsulotomy. The 1-piece PMMA IOL had a significantly higher risk than 3-piece PMMA and 3-piece silicone IOLs. The probability of Nd:YAG laser capsulotomy was higher in women, in eyes with a better preoperative BCVA, and in patients with retinitis pigmentosa. The foldable acrylic IOL had a significantly lower probability of capsulotomy.
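
    The Kaplan-Meier estimate used above can be sketched in a few lines of plain Python (illustrative data, not the study's; times are in arbitrary units and `event = 0` marks a censored observation):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the cumulative probability of an event.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. capsulotomy) occurred, 0 if censored
    Returns (time, cumulative event probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        c = sum(1 for tt, _ in data if tt == t)   # everyone leaving at t
        if d > 0:
            survival *= 1.0 - d / n_at_risk       # product-limit update
            steps.append((t, 1.0 - survival))     # cumulative probability
        n_at_risk -= c
        while i < len(data) and data[i][0] == t:  # skip ties at t
            i += 1
    return steps

steps = kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0])
```

    The censored subject at time 2 leaves the risk set without contributing an event, which is exactly what distinguishes this estimate from a naive cumulative fraction.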

  1. The influence of deployment stress and life stress on Post-Traumatic Stress Disorder (PTSD) diagnosis among military personnel.

    PubMed

    Brownlow, Janeese A; Zitnik, Gerard A; McLean, Carmen P; Gehrman, Philip R

    2018-05-08

    There is increasing recognition that traumatic stress encountered throughout life, including those prior to military service, can put individuals at increased risk for developing Posttraumatic Stress Disorder (PTSD). The purpose of this study was to examine the association of both traumatic stress encountered during deployment, and traumatic stress over one's lifetime on probable PTSD diagnosis. Probable PTSD diagnosis was compared between military personnel deployed in Operation Iraqi Freedom/Operation Enduring Freedom (OIF/OEF; N = 21,499) and those who have recently enlisted (N = 55,814), using data obtained from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Probable PTSD diagnosis was assessed using the PTSD Checklist. The effect of exposure to multiple types (i.e. diversity) of traumatic stress and the total quantity (i.e. cumulative) of traumatic stress on probable PTSD diagnosis was also compared. Military personnel who had been deployed experienced higher rates of PTSD symptoms than new soldiers. Diversity of lifetime traumatic stress predicted probable PTSD diagnosis in both groups, whereas cumulative lifetime traumatic stress only predicted probable PTSD for those who had been deployed. For deployed soldiers, having been exposed to various types of traumatic stress during deployment predicted probable PTSD diagnosis, but cumulative deployment-related traumatic stress did not. Similarly, the total quantity of traumatic stress (i.e. cumulative lifetime traumatic stress) did not predict probable PTSD diagnosis among new soldiers. Together, traumatic stress over one's lifetime is a predictor of probable PTSD for veterans, as much as traumatic stress encountered during war. Clinicians treating military personnel with PTSD should be aware of the impact of traumatic stress beyond what occurs during war. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. On Orbital Elements of Extrasolar Planetary Candidates and Spectroscopic Binaries

    NASA Technical Reports Server (NTRS)

    Stepinski, T. F.; Black, D. C.

    2001-01-01

    We estimate probability densities of orbital elements, periods, and eccentricities for the population of extrasolar planetary candidates (EPC) and, separately, for the population of spectroscopic binaries (SB) with solar-type primaries. We construct empirical cumulative distribution functions (CDFs) in order to infer probability distribution functions (PDFs) for orbital periods and eccentricities. We also derive a joint probability density for period-eccentricity pairs in each population. Comparison of the respective distributions reveals that in all cases the EPC and SB populations are, in the context of orbital elements, indistinguishable from each other to a high degree of statistical significance. Probability densities of orbital periods in both populations have a P^(-1) functional form, whereas the PDFs of eccentricities can best be characterized as a Gaussian with a mean of about 0.35 and standard deviation of about 0.2, turning into a flat distribution at small values of eccentricity. These remarkable similarities between EPC and SB must be taken into account by theories aimed at explaining the origin of extrasolar planetary candidates, and constitute an important clue as to their ultimate nature.

  3. Investigation of the relation between the return periods of major drought characteristics using copula functions

    NASA Astrophysics Data System (ADS)

    Hüsami Afşar, Mehdi; Unal Şorman, Ali; Tugrul Yilmaz, Mustafa

    2016-04-01

    Different drought characteristics (e.g. duration, average severity, and average areal extent) often have a monotonic relation: an increase in the magnitude of one is typically accompanied by a similar increase in the others. Hence it is viable to establish a relationship between drought characteristics with the goal of predicting one from the others. Copula functions, which relate different variables through their joint and conditional cumulative probability distributions, are often used to statistically model drought characteristics. In this study, bivariate and trivariate joint probabilities of these characteristics are obtained over Ankara (Turkey) between 1960 and 2013. Copula-based return-period estimation shows that joint probabilities of duration, average severity, and average areal extent can be estimated satisfactorily. Among the copula families investigated in this study, the elliptical family (i.e. the normal and Student's t copula functions) resulted in the lowest root mean square error. This study was supported by TUBITAK fund #114Y676.
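
    While the paper fits parametric copulas, the underlying return-period idea can be illustrated nonparametrically: the "and" return period of a joint extreme is the reciprocal of its joint exceedance probability. A sketch on synthetic correlated duration/severity data (all numbers hypothetical, not the Ankara record):

```python
import random

def joint_return_period(pairs, u, v, interarrival=1.0):
    """'And' return period: mean time between droughts with duration > u
    AND severity > v, from the empirical joint exceedance probability."""
    p_exceed = sum(1 for x, y in pairs if x > u and y > v) / len(pairs)
    return interarrival / p_exceed

random.seed(2)
pairs = []
for _ in range(5000):
    z = random.gauss(0.0, 1.0)                        # shared driver -> correlation
    duration = 3.0 + z + 0.3 * random.gauss(0.0, 1.0)
    severity = 1.0 + 0.8 * z + 0.3 * random.gauss(0.0, 1.0)
    pairs.append((duration, severity))

T = joint_return_period(pairs, u=4.0, v=1.8)
```

    A fitted copula replaces the empirical joint CDF here, which is what allows extrapolation to return periods longer than the observed record.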

  4. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source-term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
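
    The CCDF construction itself is simple once consequence samples exist: tabulate the fraction of Monte Carlo trials whose consequence meets or exceeds each level. A toy sketch (the accident probability and lognormal consequence model are invented for illustration, not the Ulysses source terms):

```python
import random

def ccdf(samples, x):
    """Complementary cumulative distribution: P(consequence >= x)."""
    return sum(1 for s in samples if s >= x) / len(samples)

random.seed(3)
# Toy model: an accident occurs with small probability per trial; given an
# accident, the (hypothetical) health-effect consequence is lognormal.
consequences = []
for _ in range(100_000):
    if random.random() < 0.01:                  # assumed accident probability
        consequences.append(random.lognormvariate(0.0, 1.0))
    else:
        consequences.append(0.0)                # no accident, no consequence

p_any = ccdf(consequences, 1e-9)                # P(any nonzero consequence)
```

    Plotting `ccdf` over a grid of consequence levels yields exactly the curves the report presents per mission phase.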

  5. A Markov chain technique for determining the acquisition behavior of a digital tracking loop

    NASA Technical Reports Server (NTRS)

    Chadwick, H. D.

    1972-01-01

    An iterative procedure is presented for determining the acquisition behavior of discrete or digital implementations of a tracking loop. The technique is based on the theory of Markov chains and provides the cumulative probability of acquisition in the loop as a function of time in the presence of noise and a given set of initial condition probabilities. A digital second-order tracking loop to be used in the Viking command receiver for continuous tracking of the command subcarrier phase was analyzed using this technique, and the results agree closely with experimental data.
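
    The iterative procedure amounts to propagating a state-probability vector through the chain's transition matrix and reading off the mass in the absorbing "locked" state at each step. A minimal sketch (the toy 3-state matrix below is hypothetical, not the Viking loop model):

```python
def acquisition_curve(P, init, locked, steps):
    """Iterate a Markov chain and return the cumulative probability of
    being in the absorbing 'locked' state after each step."""
    n = len(P)
    p = list(init)
    curve = []
    for _ in range(steps):
        # one step of p <- p @ P, written out explicitly
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
        curve.append(p[locked])
    return curve

# Toy 3-state loop: 0 = far from lock, 1 = near lock, 2 = locked (absorbing)
P = [
    [0.6, 0.4, 0.0],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
]
curve = acquisition_curve(P, init=[1.0, 0.0, 0.0], locked=2, steps=50)
```

    Because the locked state is absorbing, the curve is monotone and gives the cumulative probability of acquisition as a function of time directly.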

  6. Asymptotic behavior of the daily increment distribution of the IPC, the mexican stock market index

    NASA Astrophysics Data System (ADS)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.

    2005-02-01

    In this work, a statistical analysis of the distribution of daily fluctuations of the IPC, the Mexican Stock Market Index, is presented. A sample of the IPC covering the 13-year period 04/19/1990 - 08/21/2003 was analyzed and the cumulative probability distribution of its daily logarithmic variations studied. Results showed that the cumulative distribution function for extreme variations can be described by a Pareto-Lévy model with shape parameters α = 3.634 ± 0.272 and α = 3.540 ± 0.278 for its positive and negative tails, respectively. This result is consistent with previous studies, where it has been found that 2.5 < α < 4 for other financial markets worldwide.
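
    A standard way to estimate such a tail exponent is the Hill estimator over the k largest order statistics. A sketch on synthetic Pareto data with a known α (not the IPC series; k is chosen arbitrarily here, whereas in practice it is a delicate tuning choice):

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the tail exponent alpha from the k largest values."""
    s = sorted(sample, reverse=True)
    logs = [math.log(x) for x in s[:k + 1]]
    # 1/alpha_hat is the mean log-excess over the k-th order statistic
    return k / sum(logs[i] - logs[k] for i in range(k))

random.seed(1)
alpha_true = 3.5
# Inverse-transform sampling: X = U^(-1/alpha) has P(X > x) = x^(-alpha)
sample = [random.random() ** (-1.0 / alpha_true) for _ in range(20_000)]
alpha_hat = hill_estimator(sample, k=500)
```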

  7. Survival analysis for the missing censoring indicator model using kernel density estimation techniques

    PubMed Central

    Subramanian, Sundarraman

    2008-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented. PMID:18953423

  8. Survival analysis for the missing censoring indicator model using kernel density estimation techniques.

    PubMed

    Subramanian, Sundarraman

    2006-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented.
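
    The kernel estimate of the conditional non-missingness probability is, in spirit, a Nadaraya-Watson smoother of the missingness indicators. A minimal sketch with a Gaussian kernel and a fabricated missingness pattern (bandwidth selection, which the paper analyzes via the mean squared error, is simply fixed here):

```python
import math

def nw_probability(x0, xs, deltas, h):
    """Nadaraya-Watson kernel estimate of P(indicator observed | X = x0)."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * d for wi, d in zip(w, deltas)) / sum(w)

# Fabricated data: the censoring indicator is observed only when the
# covariate is below 0.5, so the conditional probability drops there.
xs = [i / 99 for i in range(100)]
deltas = [1 if x < 0.5 else 0 for x in xs]

p_low = nw_probability(0.1, xs, deltas, h=0.05)
p_high = nw_probability(0.9, xs, deltas, h=0.05)
```

    The inverse of this estimated probability then serves as the weight attached to each complete observation in the cumulative hazard estimator.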

  9. Embolic Strokes of Undetermined Source in the Athens Stroke Registry: An Outcome Analysis.

    PubMed

    Ntaios, George; Papavasileiou, Vasileios; Milionis, Haralampos; Makaritsis, Konstantinos; Vemmou, Anastasia; Koroboki, Eleni; Manios, Efstathios; Spengos, Konstantinos; Michel, Patrik; Vemmos, Konstantinos

    2015-08-01

    Information about outcomes in Embolic Stroke of Undetermined Source (ESUS) patients is unavailable. This study provides a detailed analysis of outcomes of a large ESUS population. Data set was derived from the Athens Stroke Registry. ESUS was defined according to the Cryptogenic Stroke/ESUS International Working Group criteria. End points were mortality, stroke recurrence, functional outcome, and a composite cardiovascular end point comprising recurrent stroke, myocardial infarction, aortic aneurysm rupture, systemic embolism, or sudden cardiac death. We performed Kaplan-Meier analyses to estimate cumulative probabilities of outcomes by stroke type and Cox-regression to investigate whether stroke type was outcome predictor. 2731 patients were followed-up for a mean of 30.5±24.1months. There were 73 (26.5%) deaths, 60 (21.8%) recurrences, and 78 (28.4%) composite cardiovascular end points in the 275 ESUS patients. The cumulative probability of survival in ESUS was 65.6% (95% confidence intervals [CI], 58.9%-72.2%), significantly higher compared with cardioembolic stroke (38.8%, 95% CI, 34.9%-42.7%). The cumulative probability of stroke recurrence in ESUS was 29.0% (95% CI, 22.3%-35.7%), similar to cardioembolic strokes (26.8%, 95% CI, 22.1%-31.5%), but significantly higher compared with all types of noncardioembolic stroke. One hundred seventy-two (62.5%) ESUS patients had favorable functional outcome compared with 280 (32.2%) in cardioembolic and 303 (60.9%) in large-artery atherosclerotic. ESUS patients had similar risk of composite cardiovascular end point as all other stroke types, with the exception of lacunar strokes, which had significantly lower risk (adjusted hazard ratio, 0.70 [95% CI, 0.52-0.94]). Long-term mortality risk in ESUS is lower compared with cardioembolic strokes, despite similar rates of recurrence and composite cardiovascular end point. Recurrent stroke risk is higher in ESUS than in noncardioembolic strokes. 
© 2015 American Heart Association, Inc.

  10. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    PubMed

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. 
The preload is predominantly affected by the applied torque and the coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.

  11. Applicability of AgMERRA Forcing Dataset to Fill Gaps in Historical in-situ Meteorological Data

    NASA Astrophysics Data System (ADS)

    Bannayan, M.; Lashkari, A.; Zare, H.; Asadi, S.; Salehnia, N.

    2015-12-01

    Integrated assessment studies of food production systems use crop models to simulate the effects of climate and socio-economic changes on food security. Climate forcing data is one of the key inputs of crop models. This study evaluated the performance of the AgMERRA climate forcing dataset in filling gaps in historical in-situ meteorological data for different climatic regions of Iran. The AgMERRA dataset was compared with the in-situ observational dataset of daily maximum and minimum temperature and precipitation during the 1980-2010 period via root mean square error (RMSE), mean absolute error (MAE), and mean bias error (MBE) for 17 stations in four climatic regions: humid and moderate; cold; dry and arid; and hot and humid. Moreover, the probability distribution function and cumulative distribution function were compared between model and observed data. The measures of agreement demonstrated that the errors in the model data are small for all stations. Except for stations located in cold regions, the model data showed under-prediction of daily maximum temperature and precipitation; however, the bias was not significant. In addition, the probability distribution function and cumulative distribution function showed the same trend between model and observed data for all stations. Therefore, the AgMERRA dataset is reliable for filling gaps in historical observations in different climatic regions of Iran, and it could also serve as a basis for future climate scenarios.
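
    The three agreement measures are straightforward to compute; a sketch with fabricated station/gridded temperature pairs (the values are placeholders, not AgMERRA data; a negative MBE indicates under-prediction by the model):

```python
import math

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def mae(model, obs):
    return sum(abs(m - o) for m, o in zip(model, obs)) / len(obs)

def mbe(model, obs):
    # sign convention: model minus observation, so negative = under-prediction
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

obs   = [30.1, 31.4, 29.8, 33.0, 28.5]   # hypothetical station Tmax (deg C)
model = [29.6, 31.0, 29.9, 32.1, 28.0]   # matching gridded values
```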

  12. Detection of the nipple in automated 3D breast ultrasound using coronal slab-average-projection and cumulative probability map

    NASA Astrophysics Data System (ADS)

    Kim, Hannah; Hong, Helen

    2014-03-01

    We propose an automatic method for nipple detection on 3D automated breast ultrasound (3D ABUS) images using coronal slab-average-projection and a cumulative probability map. First, to identify coronal images that show a clear distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. Then, a coronal slab-average-projection image is reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering the nipple-areola region is detected using the Hough ellipse transform in the coronal slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu's thresholding is applied to the elliptical ROI and a cumulative probability map in the elliptical ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.

  13. Computing thermal Wigner densities with the phase integration method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beutier, J.; Borgis, D.; Vuilleumier, R.

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.

  14. Computing thermal Wigner densities with the phase integration method.

    PubMed

    Beutier, J; Borgis, D; Vuilleumier, R; Bonella, S

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.

  15. Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barajas-Solano, David A.; Tartakovsky, Alexandre M.

    We present a cumulative density function (CDF) method for the probabilistic analysis of $d$-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a $(d+1)$-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.

  16. Stochastic Models of Emerging Infectious Disease Transmission on Adaptive Random Networks

    PubMed Central

    Pipatsart, Navavat; Triampo, Wannapong

    2017-01-01

    We presented adaptive random network models to describe human behavioral change during epidemics and performed stochastic simulations of SIR (susceptible-infectious-recovered) epidemic models on adaptive random networks. The interplay between infectious disease dynamics and network adaptation dynamics was investigated with regard to disease transmission and the cumulative number of infection cases. We found that the cumulative number of cases decreased with increasing network adaptation probability but increased with increasing disease transmission probability. The topological changes of the adaptive random networks were able to reduce the cumulative number of infections and also to delay the epidemic peak. Our results also suggest the existence of a critical value for the ratio of the disease transmission and adaptation probabilities below which an epidemic cannot occur. PMID:29075314
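
    The qualitative effect, adaptation cutting the cumulative case count, can be reproduced with a crude discrete-time simulation in which susceptible nodes sever contacts with infectious neighbours with probability w per step. Edge removal here stands in for the paper's rewiring rule, and all parameter values are illustrative:

```python
import random

def sir_adaptive(n, k, beta, gamma, w, steps, seed):
    """Discrete-time SIR on a random network; each S-I edge is cut with
    probability w per step (a toy stand-in for adaptive rewiring)."""
    random.seed(seed)
    edges = set()
    while len(edges) < n * k // 2:                 # random graph, mean degree ~ k
        a, b = random.randrange(n), random.randrange(n)
        if a != b:
            edges.add((min(a, b), max(a, b)))
    state = ["S"] * n
    for i in range(3):                             # three index cases
        state[i] = "I"
    cum_cases = 3
    for _ in range(steps):
        infections = []
        for (a, b) in list(edges):
            if {state[a], state[b]} == {"S", "I"}:
                if random.random() < w:
                    edges.discard((a, b))          # susceptible cuts the contact
                elif random.random() < beta:
                    infections.append(a if state[a] == "S" else b)
        recoveries = [i for i in range(n)
                      if state[i] == "I" and random.random() < gamma]
        for s in infections:
            if state[s] == "S":                    # synchronous update
                state[s] = "I"
                cum_cases += 1
        for r in recoveries:
            state[r] = "R"
    return cum_cases

no_adapt = sum(sir_adaptive(300, 6, 0.5, 0.1, w=0.0, steps=100, seed=s) for s in range(5))
adapt    = sum(sir_adaptive(300, 6, 0.5, 0.1, w=0.9, steps=100, seed=s) for s in range(5))
```

    With a high cutting probability the effective reproduction number drops below one and the outbreak fizzles, mirroring the critical transmission/adaptation ratio the paper reports.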

  17. Multivariate η-μ fading distribution with arbitrary correlation model

    NASA Astrophysics Data System (ADS)

    Ghareeb, Ibrahim; Atiani, Amani

    2018-03-01

    An extensive analysis of the multivariate η-μ distribution with arbitrary correlation is presented, where novel analytical expressions for the multivariate probability density function, cumulative distribution function, and moment generating function (MGF) of arbitrarily correlated and not necessarily identically distributed η-μ power random variables are derived. This paper also provides an exact-form expression for the MGF of the instantaneous signal-to-noise ratio at the combiner output in a diversity reception system with maximal-ratio combining and post-detection equal-gain combining operating in slow, frequency-nonselective, arbitrarily correlated, not necessarily identically distributed η-μ fading channels. The average bit error probability of differentially detected quadrature phase shift keying signals with a post-detection diversity reception system over arbitrarily correlated η-μ fading channels with not necessarily identical fading parameters is determined using the MGF-based approach. The effect of fading correlation between diversity branches, fading severity parameters, and diversity level is studied.

  18. Statistical properties of two sine waves in Gaussian noise.

    NASA Technical Reports Server (NTRS)

    Esposito, R.; Wilson, L. R.

    1973-01-01

    A detailed study is presented of some statistical properties of a stochastic process that consists of the sum of two sine waves of unknown relative phase and a normal process. Since none of the statistics investigated seem to yield a closed-form expression, all the derivations are cast in a form that is particularly suitable for machine computation. Specifically, results are presented for the probability density function (pdf) of the envelope and the instantaneous value, the moments of these distributions, and the relative cumulative density function (cdf).

  19. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
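
    The k-out-of-n inversion NEWTONP performs can be sketched as follows; the program itself uses Newton's method, while this illustration substitutes bisection for brevity, and the function names are hypothetical:

    ```python
    from math import comb

    def reliability(p, k, n):
        # k-out-of-n system reliability: at least k of the n components work,
        # each independently with probability p
        return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

    def solve_p(V, k, n, tol=1e-12):
        # invert the (monotone increasing) reliability function by bisection
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if reliability(mid, k, n) < V:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0
    ```

    For example, a 2-out-of-3 system with component probability p = 0.9 has reliability 0.972, and solve_p recovers p = 0.9 from that target.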

  20. An improved probabilistic approach for linking progenitor and descendant galaxy populations using comoving number density

    NASA Astrophysics Data System (ADS)

    Wellons, Sarah; Torrey, Paul

    2017-06-01

    Galaxy populations at different cosmic epochs are often linked by cumulative comoving number density in observational studies. Many theoretical works, however, have shown that the cumulative number densities of tracked galaxy populations not only evolve in bulk, but also spread out over time. We present a method for linking progenitor and descendant galaxy populations which takes both of these effects into account. We define probability distribution functions that capture the evolution and dispersion of galaxy populations in number density space, and use these functions to assign galaxies at redshift zf probabilities of being progenitors/descendants of a galaxy population at another redshift z0. These probabilities are used as weights for calculating distributions of physical progenitor/descendant properties such as stellar mass, star formation rate or velocity dispersion. We demonstrate that this probabilistic method provides more accurate predictions for the evolution of physical properties than the assumption of either a constant number density or an evolving number density in a bin of fixed width by comparing predictions against galaxy populations directly tracked through a cosmological simulation. We find that the constant number density method performs least well at recovering galaxy properties, the evolving number density method slightly better, and the probabilistic method best of all. The improvement is present for predictions of stellar mass as well as inferred quantities such as star formation rate and velocity dispersion. We demonstrate that this method can also be applied robustly and easily to observational data, and provide a code package for doing so.

  1. Newton/Poisson-Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.

    1990-01-01

    NEWTPOIS is one of two computer programs that make calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) can be used independently of one another. NEWTPOIS determines the Poisson parameter for a given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for χ² distributions with even degrees of freedom. It is used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. The program is written in C.
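
    A sketch of the corresponding Poisson inversion, again with bisection standing in for the program's Newton iteration, and with invented names:

    ```python
    from math import exp

    def poisson_cdf(k, lam):
        # P(X <= k) for X ~ Poisson(lam), summing terms by the stable
        # recurrence term_i = term_{i-1} * lam / i
        term = total = exp(-lam)
        for i in range(1, k + 1):
            term *= lam / i
            total += term
        return total

    def poisson_param(k, target, hi=1e6, tol=1e-10):
        # the Poisson parameter lam with P(X <= k; lam) = target;
        # the CDF is strictly decreasing in lam, so bisection applies
        lo = 0.0
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if poisson_cdf(k, mid) > target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0
    ```

    As the abstract notes, the result doubles as a gamma percentile: poisson_param(k - 1, 1 - q) is the q-quantile of a gamma distribution with integer shape k; for k = 1 this is the exponential median ln 2 ≈ 0.6931.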

  2. Application of Maxent Multivariate Analysis to Define Climate-Change Effects on Species Distributions and Changes

    DTIC Science & Technology

    2014-09-01

    approaches. Ecological Modelling Volume 200, Issues 1–2, 10, pp 1–19. Buhlmann, Kurt A., Thomas S.B. Akre, John B. Iverson, Deno Karapatakis, Russell A. ...statistical multivariate analysis to define the current and projected future range probability for species of interest to Army land managers. A software...15 Figure 4. RCW omission rate and predicted area as a function of the cumulative threshold

  3. Fire frequency, area burned, and severity: A quantitative approach to defining a normal fire year

    USGS Publications Warehouse

    Lutz, J.A.; Key, C.H.; Kolden, C.A.; Kane, J.T.; van Wagtendonk, J.W.

    2011-01-01

    Fire frequency, area burned, and fire severity are important attributes of a fire regime, but few studies have quantified the interrelationships among them in evaluating a fire year. Although area burned is often used to summarize a fire season, burned area may not be well correlated with either the number or ecological effect of fires. Using the Landsat data archive, we examined all 148 wildland fires (prescribed fires and wildfires) >40 ha from 1984 through 2009 for the portion of the Sierra Nevada centered on Yosemite National Park, California, USA. We calculated mean fire frequency and mean annual area burned from a combination of field- and satellite-derived data. We used the continuous probability distribution of the differenced Normalized Burn Ratio (dNBR) values to describe fire severity. For fires >40 ha, fire frequency, annual area burned, and cumulative severity were consistent in only 13 of 26 years (50%), but all pair-wise comparisons among these fire regime attributes were significant. Borrowing from long-established practice in climate science, we defined "fire normals" to be the 26 year means of fire frequency, annual area burned, and the area under the cumulative probability distribution of dNBR. Fire severity normals were significantly lower when they were aggregated by year compared to aggregation by area. Cumulative severity distributions for each year were best modeled with Weibull functions (all 26 years, r2 ≥ 0.99; P < 0.001). Explicit modeling of the cumulative severity distributions may allow more comprehensive modeling of climate-severity and area-severity relationships. Together, the three metrics of number of fires, size of fires, and severity of fires provide land managers with a more comprehensive summary of a given fire year than any single metric.

  4. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    PubMed

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. 
We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
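
    The recommended cumulative-probability fit can be sketched for interval-censored data. The study fitted lognormal, gamma and Weibull distributions; this self-contained sketch instead uses an exponential model (so only the standard library is needed) and maximizes the interval-censored likelihood, in which each observation contributes F(upper) - F(lower):

    ```python
    from math import exp, log

    def interval_nll(rate, intervals):
        # negative log-likelihood of interval-censored exponential data:
        # each interval (lo, hi) contributes log(F(hi) - F(lo)),
        # with F(t) = 1 - exp(-rate * t)
        nll = 0.0
        for lo, hi in intervals:
            nll -= log(exp(-rate * lo) - exp(-rate * hi))
        return nll

    def fit_rate(intervals, lo=1e-6, hi=100.0, iters=200):
        # ternary search on the unimodal negative log-likelihood
        for _ in range(iters):
            m1 = lo + (hi - lo) / 3.0
            m2 = hi - (hi - lo) / 3.0
            if interval_nll(m1, intervals) < interval_nll(m2, intervals):
                hi = m2
            else:
                lo = m1
        return (lo + hi) / 2.0
    ```

    For instance, if 632 of 1000 propagules are recovered in (0, 1] and the remaining 368 after t = 1, the maximum-likelihood rate is ln(1000/368) ≈ 1.0.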

  5. Economic decision-making compared with an equivalent motor task.

    PubMed

    Wu, Shih-Wei; Delgado, Mauricio R; Maloney, Laurence T

    2009-04-14

    There is considerable evidence that human economic decision-making deviates from the predictions of expected utility theory (EUT) and that human performance conforms to EUT in many perceptual and motor decision tasks. It is possible that these results reflect a real difference in decision-making in the 2 domains but it is also possible that the observed discrepancy simply reflects typical differences in experimental design. We developed a motor task that is mathematically equivalent to choosing between lotteries and used it to compare how the same subject chose between classical economic lotteries and the same lotteries presented in equivalent motor form. In experiment 1, we found that subjects are more risk seeking in deciding between motor lotteries. In experiment 2, we used cumulative prospect theory to model choice and separately estimated the probability weighting functions and the value functions for each subject carrying out each task. We found no patterned differences in how subjects represented outcome value in the motor and the classical tasks. However, the probability weighting functions for motor and classical tasks were markedly and significantly different. Those for the classical task showed a typical tendency to overweight small probabilities and underweight large probabilities, and those for the motor task showed the opposite pattern of probability distortion. This outcome also accounts for the increased risk-seeking observed in the motor tasks of experiment 1. We conclude that the same subject distorts probability, but not value, differently in making identical decisions in motor and classical form.
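
    The probability weighting function of cumulative prospect theory is commonly parameterized with the one-parameter Tversky-Kahneman form (a standard choice, not necessarily the parameterization fitted in this study):

    ```python
    def tk_weight(p, gamma):
        # one-parameter probability weighting function:
        # w(p) = p**g / (p**g + (1 - p)**g)**(1/g)
        num = p ** gamma
        return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)
    ```

    With gamma < 1 (estimates near 0.61 are frequently cited for classical lotteries), small probabilities are overweighted and large ones underweighted, i.e. w(0.01) > 0.01 while w(0.9) < 0.9; the motor task above showed the opposite pattern of distortion.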

  6. Transport of polar and non-polar solvents through a carbon nanotube

    NASA Astrophysics Data System (ADS)

    Chopra, Manish; Phatak, Rohan; Choudhury, N.

    2013-02-01

    Transport of water through narrow pores is important in chemistry, biology and materials science. In this work, we employ atomistic molecular dynamics (MD) simulations to carry out a comparative study of the transport of a polar and a non-polar solvent through a carbon nanotube (CNT). The flow of water as well as methane through the nanotube is estimated in terms of the number of translocation events and compared. Transport events occurred in bursts of unidirectional translocation pulses in both cases. Probability density and cumulative probability distribution functions are obtained for the translocated particles and for the particles coming out from the same side, with respect to the time they spent in the nanochannel.

  7. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
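
    The transformation used here can be sketched with the inverse-CDF method: push uniform draws through the target distribution's quantile function. The Weibull case (one of the five distributions listed) is shown with illustrative names:

    ```python
    import random
    from math import log

    def weibull_inverse(u, shape, scale):
        # inverse-CDF transform: a Uniform(0,1) draw u maps to the Weibull
        # quantile scale * (-ln(1 - u))**(1/shape)
        return scale * (-log(1.0 - u)) ** (1.0 / shape)

    def weibull_history(n, shape, scale, seed=0):
        # a discrete random load history with Weibull-distributed load levels
        rng = random.Random(seed)
        return [weibull_inverse(rng.random(), shape, scale) for _ in range(n)]
    ```

    The same recipe covers the exponential distribution (a Weibull with shape 1); the discrete Poisson and binomial cases use a search over the cumulative distribution instead of a closed-form quantile.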

  8. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  9. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing type structures is described. The code performs a complete probabilistic analysis for composites taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDF) and probability density function (PDF) of the fatigue life of a wing type composite structure under different hygrothermal environments subjected to the random pressure. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.

  10. The statistical theory of the fracture of fragile bodies. Part 2: The integral equation method

    NASA Technical Reports Server (NTRS)

    Kittl, P.

    1984-01-01

    It is demonstrated how with the aid of a bending test, the Weibull fracture risk function can be determined - without postulating its analytical form - by resolving an integral equation. The respective solutions for rectangular and circular section beams are given. In the first case the function is expressed as an algorithm and in the second, in the form of series. Taking into account that the cumulative fracture probability appearing in the solution to the integral equation must be continuous and monotonically increasing, any case of fabrication or selection of samples can be treated.

  11. Cumulants, free cumulants and half-shuffles

    PubMed Central

    Ebrahimi-Fard, Kurusch; Patras, Frédéric

    2015-01-01

    Free cumulants were introduced as the proper analogue of classical cumulants in the theory of free probability. There is a mix of similarities and differences, when one considers the two families of cumulants. Whereas the combinatorics of classical cumulants is well expressed in terms of set partitions, that of free cumulants is described and often introduced in terms of non-crossing set partitions. The formal series approach to classical and free cumulants also largely differs. The purpose of this study is to put forward a different approach to these phenomena. Namely, we show that cumulants, whether classical or free, can be understood in terms of the algebra and combinatorics underlying commutative as well as non-commutative (half-)shuffles and (half-) unshuffles. As a corollary, cumulants and free cumulants can be characterized through linear fixed point equations. We study the exponential solutions of these linear fixed point equations, which display well the commutative, respectively non-commutative, character of classical and free cumulants. PMID:27547078
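
    In the classical case, the moment-cumulant relationship can be written as a linear recursion, in the same spirit as the linear fixed-point characterization mentioned above (a sketch with invented names; the inputs are raw moments):

    ```python
    from math import comb

    def cumulants_from_moments(m):
        # classical cumulants from raw moments m = [m1, m2, ...] via the
        # recursion kappa_n = m_n - sum_{k=1}^{n-1} C(n-1, k-1) kappa_k m_{n-k}
        kappa = []
        for n in range(1, len(m) + 1):
            k_n = m[n - 1]
            for k in range(1, n):
                k_n -= comb(n - 1, k - 1) * kappa[k - 1] * m[n - k - 1]
            kappa.append(k_n)
        return kappa
    ```

    For a standard Gaussian (moments 0, 1, 0, 3) every cumulant beyond the second vanishes, while for a Poisson distribution with mean 1 (moments 1, 2, 5, 15) every cumulant equals 1.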

  12. Electrofishing capture probability of smallmouth bass in streams

    USGS Publications Warehouse

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
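
    The adjustment described (abundance = number sampled / capture probability) can be sketched under the simplifying assumption of an equal per-pass capture probability q; the study itself modeled capture probability with logistic regression on fish length, pass number and habitat variables:

    ```python
    def cumulative_capture_prob(q, passes):
        # probability of capture in at least one of `passes` electrofishing
        # passes, assuming an equal per-pass capture probability q
        return 1.0 - (1.0 - q) ** passes

    def abundance_estimate(count, q, passes):
        # divide the number of individuals sampled by the cumulative
        # capture probability
        return count / cumulative_capture_prob(q, passes)
    ```

    For example, with q = 0.5 and three passes the cumulative capture probability is 0.875, so a catch of 70 fish yields an abundance estimate of 80.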

  13. Dropout rates in medical students at one school before and after the installation of admission tests in Austria.

    PubMed

    Reibnegger, Gilbert; Caluba, Hans-Christian; Ithaler, Daniel; Manhal, Simone; Neges, Heide Maria; Smolle, Josef

    2011-08-01

    Admission to medical studies in Austria since academic year 2005-2006 has been regulated by admission tests. At the Medical University of Graz, an admission test focusing on secondary-school-level knowledge in natural sciences has been used for this purpose. The impact of this important change on dropout rates of female versus male students and older versus younger students is reported. All 2,860 students admitted to the human medicine diploma program at the Medical University of Graz from academic years 2002-2003 to 2008-2009 were included. Nonparametric and semiparametric survival analysis techniques were employed to compare the cumulative probability of dropout between demographic groups. The cumulative probability of dropout was significantly reduced in students selected by the active admission procedure versus those admitted openly (P < .0001). The relative hazard ratio of selected versus openly admitted students was only 0.145 (95% CI, 0.106-0.198). Among openly admitted students, but not for selected ones, the cumulative probabilities of dropout were higher for females (P < .0001) and for older students (P < .0001). Generally, dropout hazard is highest during the second year of study. The introduction of admission testing significantly decreased the cumulative probability of dropout. In openly admitted students a significantly higher risk of dropout was found in female students and in older students, whereas no such effects can be detected after admission testing. Future research should focus on the sex dependence, with the aim of improving success rates among female applicants on the admission tests.
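
    The cumulative probability of dropout used here is the complement of a survival curve. A minimal nonparametric sketch (a Kaplan-Meier estimator; illustrative, and far simpler than the authors' semiparametric analysis):

    ```python
    def km_curve(times, events):
        # Kaplan-Meier survival estimate; events: 1 = dropout observed,
        # 0 = censored. Returns (time, survival) pairs at event times.
        data = sorted(zip(times, events))
        at_risk, surv, curve = len(data), 1.0, []
        i = 0
        while i < len(data):
            t = data[i][0]
            d = sum(1 for tt, e in data if tt == t and e == 1)
            if d > 0:
                surv *= (at_risk - d) / at_risk
                curve.append((t, surv))
            removed = sum(1 for tt, _ in data if tt == t)
            at_risk -= removed
            i += removed
        return curve

    def cumulative_dropout(times, events):
        # cumulative probability of dropout = 1 - S(t) at the last event time
        curve = km_curve(times, events)
        return 1.0 - curve[-1][1] if curve else 0.0
    ```

    With event times [1, 2, 2, 3] and one censored observation at t = 2, the survival curve steps through 0.75, 0.5 and 0.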

  14. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.

  15. Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.

    1997-01-01

    The properties of ceramic matrix composites (CMC's) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMC's. In this process two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte-Carlo simulation technique; and the agreement between the two solutions is excellent, as shown via select examples.

  16. Sino-implant (II) - a levonorgestrel-releasing two-rod implant: systematic review of the randomized controlled trials

    PubMed Central

    Steiner, Markus J.; Lopez, Laureen M.; Grimes, David A.; Cheng, Linan; Shelton, Jim; Trussell, James; Farley, Timothy M.M.; Dorflinger, Laneta

    2013-01-01

    Background Sino-implant (II) is a subdermal contraceptive implant manufactured in China. This two-rod levonorgestrel-releasing implant has the same amount of active ingredient (150 mg levonorgestrel) and mechanism of action as the widely available contraceptive implant Jadelle. We examined randomized controlled trials of Sino-implant (II) for effectiveness and side effects. Study design We searched electronic databases for studies of Sino-implant (II), and then restricted our review to randomized controlled trials. The primary outcome of this review was pregnancy. Results Four randomized trials with a total of 15,943 women assigned to Sino-implant (II) had first-year probabilities of pregnancy ranging from 0.0% to 0.1%. Cumulative probabilities of pregnancy during the four years of the product's approved duration of use were 0.9% and 1.06% in the two trials that presented data for four-year use. Five-year cumulative probabilities of pregnancy ranged from 0.7% to 2.1%. In one trial, the cumulative probability of pregnancy more than doubled during the fifth year (from 0.9% to 2.1%), which may be why the implant is approved for four years of use in China. Five-year cumulative probabilities of discontinuation due to menstrual problems ranged from 12.5% to 15.5% for Sino-implant (II). Conclusions Sino-implant (II) is one of the most effective contraceptives available today. These available clinical data, combined with independent laboratory testing, and the knowledge that 7 million women have used this method since 1994, support the safety and effectiveness of Sino-implant (II). The lower cost of Sino-implant (II) compared with other subdermal implants could improve access to implants in resource-constrained settings. PMID:20159174
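
    The cumulative probabilities quoted combine per-year conditional probabilities of pregnancy; the arithmetic can be sketched as follows (hypothetical inputs):

    ```python
    def cumulative_failure(annual_probs):
        # cumulative probability of pregnancy after successive years,
        # built from per-year conditional probabilities
        survival = 1.0
        for p in annual_probs:
            survival *= 1.0 - p
        return 1.0 - survival
    ```

    A fifth-year conditional probability much larger than the earlier ones produces exactly the kind of jump reported (from 0.9% at four years to 2.1% at five).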

  17. Rain attenuation measurements: Variability and data quality assessment

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.

    1989-01-01

    Year to year variations in the cumulative distributions of rain rate or rain attenuation are evident in any of the published measurements for a single propagation path that span a period of several years of observation. These variations must be described by models for the prediction of rain attenuation statistics. Now that a large measurement data base has been assembled by the International Radio Consultative Committee, the information needed to assess variability is available. On the basis of 252 sample cumulative distribution functions for the occurrence of attenuation by rain, the expected year to year variation in attenuation at a fixed probability level in the 0.1 to 0.001 percent of a year range is estimated to be 27 percent. The expected deviation from an attenuation model prediction for a single year of observations is estimated to exceed 33 percent when any of the available global rain climate models are employed to estimate the rain rate statistics. The probability distribution for the variation in attenuation or rain rate at a fixed fraction of a year is lognormal. The lognormal behavior of the variate was used to compile the statistics for variability.

  18. Statistics of cosmic density profiles from perturbation theory

    NASA Astrophysics Data System (ADS)

    Bernardeau, Francis; Pichon, Christophe; Codis, Sandrine

    2014-11-01

    The joint probability distribution function (PDF) of the density within multiple concentric spherical cells is considered. It is shown how its cumulant generating function can be obtained at tree order in perturbation theory as the Legendre transform of a function directly built in terms of the initial moments. In the context of the upcoming generation of large-scale structure surveys, it is conjectured that this result correctly models such a function for finite values of the variance. Detailed consequences of this assumption are explored. In particular the corresponding one-cell density probability distribution at finite variance is computed for realistic power spectra, taking into account its scale variation. It is found to be in agreement with Λ-cold dark matter simulations at the few percent level for a wide range of density values and parameters. Related explicit analytic expansions at the low and high density tails are given. The conditional (at fixed density) and marginal probability of the slope (the density difference between adjacent cells) and its fluctuations are also computed from the two-cell joint PDF; these also compare very well to simulations. It is emphasized that this could prove useful when studying the statistical properties of voids as it can serve as a statistical indicator to test gravity models and/or probe key cosmological parameters.

  19. Effects of ultraviolet radiation and contaminant-related stressors on arctic freshwater ecosystems.

    PubMed

    Wrona, Frederick J; Prowse, Terry D; Reist, James D; Hobbie, John E; Lévesque, Lucie M J; Macdonald, Robie W; Vincent, Warwick F

    2006-11-01

    Climate change is likely to act as a multiple stressor, leading to cumulative and/or synergistic impacts on aquatic systems. Projected increases in temperature and corresponding alterations in precipitation regimes will enhance contaminant influxes to aquatic systems, and independently increase the susceptibility of aquatic organisms to contaminant exposure and effects. The consequences for the biota will in most cases be additive (cumulative) and multiplicative (synergistic). The overall result will be higher contaminant loads and biomagnification in aquatic ecosystems. Changes in stratospheric ozone and corresponding ultraviolet radiation regimes are also expected to produce cumulative and/or synergistic effects on aquatic ecosystem structure and function. Reduced ice cover is likely to have a much greater effect on underwater UV radiation exposure than the projected levels of stratospheric ozone depletion. A major increase in UV radiation levels will cause enhanced damage to organisms (biomolecular, cellular, and physiological damage, and alterations in species composition). Allocations of energy and resources by aquatic biota to UV radiation protection will increase, probably decreasing trophic-level productivity. Elemental fluxes will increase via photochemical pathways.

  20. A Skill Score of Trajectory Model Evaluation Using Reinitialized Series of Normalized Cumulative Lagrangian Separation

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Weisberg, R. H.

    2017-12-01

    The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as a continental shelf and its adjacent deep ocean. A skill score is proposed based on the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths. The new metric correctly indicates the relative performance of the Global HYCOM in simulating the strong currents of the Gulf of Mexico Loop Current and the weaker currents of the West Florida Shelf in the eastern Gulf of Mexico. In contrast, the Lagrangian separation distance alone gives a misleading result. Also, the observed drifter position series can be used to reinitialize the trajectory model and evaluate its performance along the observed trajectory, not just at the drifter end position. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function may be estimated.
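
    Assuming the skill score takes the form ss = max(0, 1 - c/n), with c the cumulative separation distance normalized by the cumulative length of the observed trajectory segments and n a tolerance threshold, a sketch (illustrative names):

    ```python
    def skill_score(separations, lengths, n=1.0):
        # c: cumulative separation distance divided by the cumulative length
        # of the observed trajectory segments (one entry per reinitialized
        # segment); skill ss = 1 - c/n, clipped at zero
        c = sum(separations) / sum(lengths)
        return max(0.0, 1.0 - c / n)
    ```

    With separations [1, 2] against segment lengths [10, 10], c = 0.15 and ss = 0.85 for n = 1; a model whose cumulative separation exceeds n times the trajectory length scores zero.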

  1. The large-scale correlations of multicell densities and profiles: implications for cosmic variance estimates

    NASA Astrophysics Data System (ADS)

    Codis, Sandrine; Bernardeau, Francis; Pichon, Christophe

    2016-08-01

    In order to quantify the error budget in the measured probability distribution functions of cell densities, the two-point statistics of cosmic densities in concentric spheres is investigated. Bias functions are introduced as the ratio of their two-point correlation function to the two-point correlation of the underlying dark matter distribution. They describe how cell densities are spatially correlated. They are computed here via the so-called large deviation principle in the quasi-linear regime. Their large-separation limit is presented and successfully compared to simulations for density and density slopes: this regime is shown to be reached rapidly, allowing sub-percent precision for a wide range of densities and variances. The corresponding asymptotic limit provides an estimate of the cosmic variance of standard concentric cell statistics applied to finite surveys. More generally, no assumption on the separation is required for some specific moments of the two-point statistics, for instance when predicting the generating function of cumulants containing any powers of concentric densities in one location and one power of density at some arbitrary distance from the rest. This exact `one external leg' cumulant generating function is used in particular to probe the rate of convergence of the large-separation approximation.

  2. Generalized Arcsine Laws for Fractional Brownian Motion

    NASA Astrophysics Data System (ADS)

    Sadhu, Tridib; Delorme, Mathieu; Wiese, Kay Jörg

    2018-01-01

    The three arcsine laws for Brownian motion are a cornerstone of extreme-value statistics. For a Brownian motion B_t starting from the origin, and evolving during time T, one considers the following three observables: (i) the duration t_+ the process is positive, (ii) the time t_last the process last visits the origin, and (iii) the time t_max when it achieves its maximum (or minimum). All three observables have the same cumulative probability distribution expressed as an arcsine function, thus the name arcsine laws. We show how these laws change for fractional Brownian motion X_t, a non-Markovian Gaussian process indexed by the Hurst exponent H. It generalizes standard Brownian motion (i.e., H = 1/2). We obtain the three probabilities using a perturbative expansion in ε = H - 1/2. While all three probabilities are different, this distinction can only be made at second order in ε. Our results are confirmed to high precision by extensive numerical simulations.
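
    For standard Brownian motion (H = 1/2), the common cumulative distribution referred to above is P(t_+/T <= u) = (2/pi) arcsin(sqrt(u)). A quick numerical sketch, using a simple symmetric random walk as a stand-in for B_t (an illustration, not the paper's perturbative method):

```python
import math
import random

def arcsine_cdf(t, T):
    """P(t_+ <= t): cumulative probability for the time spent positive."""
    return 2.0 / math.pi * math.asin(math.sqrt(t / T))

# Crude check against a simple symmetric random walk
random.seed(0)
T, trials, hits = 1000, 2000, 0
for _ in range(trials):
    x, pos = 0.0, 0
    for _ in range(T):
        x += random.choice((-1.0, 1.0))
        pos += x > 0
    hits += (pos / T) <= 0.25

print(round(arcsine_cdf(0.25, 1.0), 3))  # → 0.333
print(round(hits / trials, 2))           # close to the value above
```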

  3. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle-derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
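
    As a hedged illustration of what a cumulative launch probability looks like, one can treat each countdown as an independent Bernoulli trial with the per-attempt success rate implied by the abstract's totals (135 launches in 250 attempts). The paper's empirical analysis is richer than this geometric simplification:

```python
launches, attempts = 135, 250
p = launches / attempts                  # ≈ 0.54 success per attempt

def cum_launch_prob(k, p=p):
    """Cumulative probability of having launched within k attempts."""
    return 1.0 - (1.0 - p) ** k

for k in (1, 2, 3, 4):
    print(k, round(cum_launch_prob(k), 3))
```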

  4. Conditioning from an information processing perspective.

    PubMed

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
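
    The two limiting forms described above can be written out directly; the rate, latency, and Weber fraction below are illustrative values, not figures from the article:

```python
import math

def exponential_cdf(t, rate):
    """Maximal uncertainty: the US is equally likely at any moment."""
    return 1.0 - math.exp(-rate * t)

def normal_cdf(t, latency, weber=0.15):
    """Attainable certainty: cumulative normal centred on the CS-US latency,
    with standard deviation equal to the Weber fraction times the latency."""
    sd = weber * latency
    return 0.5 * (1.0 + math.erf((t - latency) / (sd * math.sqrt(2.0))))

print(round(exponential_cdf(10.0, 0.1), 3))  # → 0.632
print(round(normal_cdf(10.0, 10.0), 2))      # → 0.5
```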

  5. Environmental assessment: geothermal energy geopressure subprogram. DOE Sweet Lake No. 1, Cameron Parish, Louisiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-02-01

    The following are described: the proposed action; existing environment; probable impacts, direct and indirect; probable cumulative and long-term environmental impacts; accidents; coordination with federal, state, and local agencies; and alternatives. (MHR)

  6. Back in the saddle: large-deviation statistics of the cosmic log-density field

    NASA Astrophysics Data System (ADS)

    Uhlemann, C.; Codis, S.; Pichon, C.; Bernardeau, F.; Reimberg, P.

    2016-08-01

    We present a first principle approach to obtain analytical predictions for spherically averaged cosmic densities in the mildly non-linear regime that go well beyond what is usually achieved by standard perturbation theory. A large deviation principle allows us to compute the leading order cumulants of average densities in concentric cells. In this symmetry, the spherical collapse model leads to cumulant generating functions that are robust for finite variances and free of critical points when logarithmic density transformations are implemented. They yield in turn accurate density probability distribution functions (PDFs) from a straightforward saddle-point approximation valid for all density values. Based on this easy-to-implement modification, explicit analytic formulas for the evaluation of the one- and two-cell PDF are provided. The theoretical predictions obtained for the PDFs are accurate to a few per cent compared to the numerical integration, regardless of the density under consideration and in excellent agreement with N-body simulations for a wide range of densities. This formalism should prove valuable for accurately probing the quasi-linear scales of low-redshift surveys for arbitrary primordial power spectra.

  7. Adjacent-Categories Mokken Models for Rater-Mediated Assessments

    PubMed Central

    Wind, Stefanie A.

    2016-01-01

    Molenaar extended Mokken’s original probabilistic-nonparametric scaling models for use with polytomous data. These polytomous extensions of Mokken’s original scaling procedure have facilitated the use of Mokken scale analysis as an approach to exploring fundamental measurement properties across a variety of domains in which polytomous ratings are used, including rater-mediated educational assessments. Because their underlying item step response functions (i.e., category response functions) are defined using cumulative probabilities, polytomous Mokken models can be classified as cumulative models based on the classifications of polytomous item response theory models proposed by several scholars. In order to permit a closer conceptual alignment with educational performance assessments, this study presents an adjacent-categories variation on the polytomous monotone homogeneity and double monotonicity models. Data from a large-scale rater-mediated writing assessment are used to illustrate the adjacent-categories approach, and results are compared with the original formulations. Major findings suggest that the adjacent-categories models provide additional diagnostic information related to individual raters’ use of rating scale categories that is not observed under the original formulation. Implications are discussed in terms of methods for evaluating rating quality. PMID:29795916
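
    The distinction between the two formulations is easy to see on a single polytomous rating. With hypothetical category probabilities (not the study's data), the cumulative item step probabilities are P(X >= k), while the adjacent-categories formulation conditions on the two neighboring categories:

```python
probs = [0.10, 0.25, 0.40, 0.25]   # hypothetical P(X = k), categories k = 0..3

# Cumulative (Mokken-style) item step probabilities P(X >= k), k = 1..3
cumulative = [sum(probs[k:]) for k in range(1, len(probs))]

# Adjacent-categories probabilities P(X = k | X in {k-1, k}), k = 1..3
adjacent = [probs[k] / (probs[k - 1] + probs[k]) for k in range(1, len(probs))]

print([round(p, 3) for p in cumulative])  # → [0.9, 0.65, 0.25]
print([round(p, 3) for p in adjacent])    # → [0.714, 0.615, 0.385]
```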

  8. Martian Cratering 7: The Role of Impact Gardening

    NASA Astrophysics Data System (ADS)

    Hartmann, William K.; Anguita, Jorge; de la Casa, Miguel A.; Berman, Daniel C.; Ryan, Eileen V.

    2001-01-01

    Viking-era researchers concluded that impact craters of diameter D < 50 m were absent on Mars, and thus impact gardening was considered negligible in establishing decameter-scale surface properties. This paper documents martian crater populations down to diameter D ~ 11 m and probably less on Mars, requiring a certain degree of impact gardening. Applying lunar data, we calculate cumulative gardening depth as a function of total cratering. Stratigraphic units exposed since Noachian times would have experienced tens to hundreds of meters of gardening. Early Amazonian/late Hesperian sites, such as the first three landing sites, experienced cumulative gardening on the order of 3-14 m, a conclusion that may conflict with some landing site interpretations. Martian surfaces with less than a percent or so of lunar mare crater densities have negligible impact gardening because of a probable cutoff of hypervelocity impact cratering below D ~ 1 m, due to Mars' atmosphere. Unlike lunar regolith, martian regolith has been affected, and fines removed, by many processes. Deflation may have been a factor in leaving widespread boulder fields and associated dune fields, observed by the first three landers. Ancient regolith provided a porous medium for water storage, subsurface transport, and massive permafrost formation. Older regolith was probably cemented by evaporites and permafrost, may contain interbedded sediments and lavas, and may have been brecciated by later impacts. Growing evidence suggests recent water mobility, and the existence of duricrust at Viking and Pathfinder sites demonstrates the cementing process. These results affect lander/rover searches for intact ancient deposits. The upper tens of meters of exposed Noachian units cannot survive today in a pristine state. Intact Noachian deposits might best be found in cliffside strata, or in recently exhumed regions. The hematite-rich areas found in Terra Meridiani by the Mars Global Surveyor are probably examples of the latter.

  9. On the distribution of a product of N Gaussian random variables

    NASA Astrophysics Data System (ADS)

    Stojanac, Željka; Suess, Daniel; Kliesch, Martin

    2017-08-01

    The product of Gaussian random variables appears naturally in many applications in probability theory and statistics. It has been known that the distribution of a product of N such variables can be expressed in terms of a Meijer G-function. Here, we compute a similar representation for the corresponding cumulative distribution function (CDF) and provide a power-log series expansion of the CDF based on the theory of the more general Fox H-functions. Numerical computations show that for small values of the argument the CDF of products of Gaussians is well approximated by the lowest orders of this expansion. Analogous results are also shown for the absolute value as well as the square of such products of N Gaussian random variables. For the latter two settings, we also compute the moment generating functions in terms of Meijer G-functions.
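
    The paper's results are analytic (Meijer G- and Fox H-function representations); as a rough numerical cross-check, the same CDF can be estimated by Monte Carlo. A sketch, not the authors' method:

```python
import random

def product_cdf_mc(x, n, trials=200_000, seed=1):
    """Monte Carlo estimate of P(Z_1 * ... * Z_n <= x), Z_i ~ N(0, 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        prod = 1.0
        for _ in range(n):
            prod *= rng.gauss(0.0, 1.0)
        hits += prod <= x
    return hits / trials

# For n = 1 this reproduces the standard normal CDF; for any n the
# product's distribution is symmetric about 0, so the CDF at 0 is 1/2.
print(round(product_cdf_mc(0.0, 3), 2))  # ≈ 0.5
```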

  10. Prognostic Factors in Severe Chagasic Heart Failure

    PubMed Central

    Costa, Sandra de Araújo; Rassi, Salvador; Freitas, Elis Marra da Madeira; Gutierrez, Natália da Silva; Boaventura, Fabiana Miranda; Sampaio, Larissa Pereira da Costa; Silva, João Bastista Masson

    2017-01-01

    Background Prognostic factors are extensively studied in heart failure; however, their role in severe Chagasic heart failure has not been established. Objectives To identify the association of clinical and laboratory factors with the prognosis of severe Chagasic heart failure, as well as the association of these factors with mortality and survival in a 7.5-year follow-up. Methods 60 patients with severe Chagasic heart failure were evaluated regarding the following variables: age, blood pressure, ejection fraction, serum sodium, creatinine, 6-minute walk test, non-sustained ventricular tachycardia, QRS width, indexed left atrial volume, and functional class. Results 53 (88.3%) patients died during follow-up, and 7 (11.7%) remained alive. Cumulative overall survival probability was approximately 11%. Non-sustained ventricular tachycardia (HR = 2.11; 95% CI: 1.04 - 4.31; p<0.05) and indexed left atrial volume ≥ 72 mL/m2 (HR = 3.51; 95% CI: 1.63 - 7.52; p<0.05) were the only variables that remained as independent predictors of mortality. Conclusions The presence of non-sustained ventricular tachycardia on Holter and indexed left atrial volume > 72 mL/m2 are independent predictors of mortality in severe Chagasic heart failure, with cumulative survival probability of only 11% in 7.5 years. PMID:28443956

  11. Pregnancy after tubal sterilization with silicone rubber band and spring clip application.

    PubMed

    Peterson, H B; Xia, Z; Wilcox, L S; Tylor, L R; Trussell, J

    2001-02-01

    To determine risk factors for pregnancy after tubal sterilization with silicone rubber bands or spring clips. A total of 3329 women sterilized using silicone rubber bands and 1595 women sterilized using spring clips were followed for up to 14 years as part of a prospective cohort study conducted in medical centers in nine US cities. We assessed the risk of pregnancy by cumulative life-table probabilities and proportional hazards analysis. The risk of pregnancy for women who had silicone rubber band application differed by location of band application and study site. The 10-year cumulative probabilities of pregnancy varied from a low of 0.0 per 1000 procedures at one study site to a high of 42.5 per 1000 procedures in the four combined sites in which fewer than 100 procedures per site were performed. The risk of pregnancy for women who had spring clip application varied by location of clip application, study site, race or ethnicity, tubal disease, and history of abdominal or pelvic surgery. The probabilities across study sites ranged from 7.1 per 1000 procedures at 10 years to 78.0 per 1000 procedures at 5 years (follow-up was limited to 5 years at that site). The 10-year cumulative probability of pregnancy after silicone rubber band and spring clip application is low but varies substantially by both clinical and demographic characteristics.
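
    Cumulative life-table probabilities of this kind combine conditional per-interval risks. A minimal sketch with hypothetical yearly counts (not data from the study):

```python
# Hypothetical yearly counts, not data from the study
events  = [2, 1, 1, 0, 1]             # pregnancies observed in years 1..5
at_risk = [1000, 950, 900, 840, 800]  # women at risk entering each year

surv = 1.0
for d, n in zip(events, at_risk):
    surv *= 1.0 - d / n               # conditional probability of no pregnancy
cum_prob = 1.0 - surv                 # cumulative probability of pregnancy

print(round(cum_prob * 1000, 1))      # → 5.4 (cumulative rate per 1000)
```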

  12. Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Singhal, S. N.; Chamis, C. C.

    1996-01-01

    This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes, from damage initiation to unstable propagation and global structure collapse, were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.

  13. Crystal fractionation in the SNC meteorites: Implications for sample selection

    NASA Technical Reports Server (NTRS)

    Treiman, Allan H.

    1988-01-01

    Almost all rock types in the SNC meteorites are cumulates, products of magma differentiation by crystal fractionation (addition or removal of crystals). If the SNC meteorites are from the surface of Mars or its near subsurface, then most of the igneous units on Mars are differentiated. Basaltic units probably experienced minor to moderate differentiation, but ultrabasic units probably experienced extreme differentiation. Products of this differentiation may include Fe-rich gabbro, pyroxenite, peridotite (and thus serpentine), and possibly massive sulfides. The SNC meteorites include ten lithologies (three in EETA79001), eight of which are crystal cumulates. The other lithologies, EETA79001 A and B, are subophitic basalts.

  14. A full-angle Monte-Carlo scattering technique including cumulative and single-event Rutherford scattering in plasmas [Theory of cumulative large-angle collisions in plasmas]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Higginson, Drew P.

    Here, we describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3–0.7; the upper limit corresponds to Coulomb logarithm of 20–2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.
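
    A toy sketch of the two-region idea, with an illustrative Gaussian core and a theta^-3 single-event tail (the cutoff, width, and tail weight below are assumptions, not the paper's calibrated values):

```python
import math
import random

def sample_angle(rng, theta0=0.05, theta_c=0.15, tail_frac=0.1):
    """Draw a scattering angle from a Gaussian core plus a theta^-3 tail."""
    if rng.random() > tail_frac:
        # cumulative small-angle part: many collisions -> Gaussian-like pdf
        return min(abs(rng.gauss(0.0, theta0)), theta_c)
    # single-event part: inverse-CDF sample of ~ theta^-3 on [theta_c, pi]
    u = rng.random()
    inv2 = theta_c ** -2 - u * (theta_c ** -2 - math.pi ** -2)
    return 1.0 / math.sqrt(inv2)

rng = random.Random(0)
angles = [sample_angle(rng) for _ in range(100_000)]
print(round(sum(a > 0.15 for a in angles) / len(angles), 2))  # ≈ tail_frac
```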

  15. Larval Pacific herring, Clupea pallasii (Valenciennes), are highly susceptible to viral haemorrhagic septicaemia and survivors are partially protected after their metamorphosis to juveniles

    USGS Publications Warehouse

    Hershberger, P.K.; Gregg, J.; Pacheco, C.; Winton, J.; Richard, J.; Traxler, G.

    2007-01-01

    Pacific herring were susceptible to waterborne challenge with viral haemorrhagic septicaemia virus (VHSV) throughout their early life history stages, with significantly greater cumulative mortalities occurring among VHSV-exposed groups of 9-, 44-, 54- and 76-day-old larvae than among respective control groups. Similarly, among 89-day- to 1-year-old and 1+ year old post-metamorphosed juveniles, cumulative mortality was significantly greater in VHSV-challenged groups than in respective control groups. Larval exposure to VHSV conferred partial protection to the survivors after their metamorphosis to juveniles, as shown by significantly lower cumulative mortalities among juvenile groups that survived a VHS epidemic as larvae than among groups that were previously naïve to VHSV. Magnitude of the protection, measured as relative per cent survival, was a direct function of larval age at first exposure and was probably a reflection of gradual developmental onset of immunocompetence. These results indicate the potential for easily overlooked VHS epizootics among wild larvae in regions where the virus is endemic and emphasize the importance of early life history stages of marine fish in influencing the ecological disease processes. © 2007 The Authors.

  16. A full-angle Monte-Carlo scattering technique including cumulative and single-event Rutherford scattering in plasmas [Theory of cumulative large-angle collisions in plasmas]

    DOE PAGES

    Higginson, Drew P.

    2017-08-12

    Here, we describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3–0.7; the upper limit corresponds to Coulomb logarithm of 20–2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.

  17. Application of Probabilistic Methods for the Determination of an Economically Robust HSCT Configuration

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.; Bandte, Oliver; Schrage, Daniel P.

    1996-01-01

    This paper outlines an approach for the determination of economically viable robust design solutions using the High Speed Civil Transport (HSCT) as a case study. Furthermore, the paper states the advantages of a probability-based aircraft design over the traditional point-design approach. It also proposes a new methodology called Robust Design Simulation (RDS) which treats customer satisfaction as the ultimate design objective. RDS is based on a probabilistic approach to aerospace systems design, which views the chosen objective as a distribution function introduced by so-called noise or uncertainty variables. Since the designer has no control over these variables, a variability distribution is defined for each one of them. The cumulative effect of all these distributions causes the overall variability of the objective function. For cases where the selected objective function depends heavily on these noise variables, it may be desirable to obtain a design solution that minimizes this dependence. The paper outlines a step-by-step approach on how to achieve such a solution for the HSCT case study and introduces an evaluation criterion which guarantees the highest customer satisfaction. This customer satisfaction is expressed by the probability of achieving objective function values less than a desired target value.
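
    The evaluation criterion described above, the probability of the objective falling below a target, can be estimated by propagating the noise-variable distributions through the objective by Monte Carlo. A sketch with a hypothetical stand-in objective, not the HSCT economics model:

```python
import random

def prob_below_target(objective, noise_draw, target, trials=50_000, seed=11):
    """Estimate P(objective(noise) < target) by Monte Carlo."""
    rng = random.Random(seed)
    hits = sum(objective(*noise_draw(rng)) < target for _ in range(trials))
    return hits / trials

# Hypothetical objective: cost = base * fuel-price factor * demand factor
objective = lambda fuel, demand: 100.0 * fuel * demand
noise = lambda rng: (rng.gauss(1.0, 0.1), rng.uniform(0.8, 1.2))
print(prob_below_target(objective, noise, target=110.0))
```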

  18. New paradoxes of risky decision making.

    PubMed

    Birnbaum, Michael H

    2008-04-01

    During the last 25 years, prospect theory and its successor, cumulative prospect theory, replaced expected utility as the dominant descriptive theories of risky decision making. Although these models account for the original Allais paradoxes, 11 new paradoxes show where prospect theories lead to self-contradiction or systematic false predictions. The new findings are consistent with and, in several cases, were predicted in advance by simple "configural weight" models in which probability-consequence branches are weighted by a function that depends on branch probability and ranks of consequences on discrete branches. Although they have some similarities to later models called "rank-dependent utility," configural weight models do not satisfy coalescing, the assumption that branches leading to the same consequence can be combined by adding their probabilities. Nor do they satisfy cancellation, the "independence" assumption that branches common to both alternatives can be removed. The transfer of attention exchange model, with parameters estimated from previous data, correctly predicts results with all 11 new paradoxes. Apparently, people do not frame choices as prospects but, instead, as trees with branches.

  19. Generalized Arcsine Laws for Fractional Brownian Motion.

    PubMed

    Sadhu, Tridib; Delorme, Mathieu; Wiese, Kay Jörg

    2018-01-26

    The three arcsine laws for Brownian motion are a cornerstone of extreme-value statistics. For a Brownian B_{t} starting from the origin, and evolving during time T, one considers the following three observables: (i) the duration t_{+} the process is positive, (ii) the time t_{last} the process last visits the origin, and (iii) the time t_{max} when it achieves its maximum (or minimum). All three observables have the same cumulative probability distribution expressed as an arcsine function, thus the name arcsine laws. We show how these laws change for fractional Brownian motion X_{t}, a non-Markovian Gaussian process indexed by the Hurst exponent H. It generalizes standard Brownian motion (i.e., H=1/2). We obtain the three probabilities using a perturbative expansion in ϵ=H-1/2. While all three probabilities are different, this distinction can only be made at second order in ϵ. Our results are confirmed to high precision by extensive numerical simulations.

  20. Multi-beam transmitter geometries for free-space optical communications

    NASA Astrophysics Data System (ADS)

    Tellez, Jason A.; Schmidt, Jason D.

    2010-02-01

    Free-space optical communications systems provide the opportunity to take advantage of higher data transfer rates and lower probability of intercept compared to radio-frequency communications. However, propagation through atmospheric turbulence, such as for airborne laser communication over long paths, results in intensity variations at the receiver and a corresponding degradation in bit error rate (BER) performance. Previous literature has shown that two transmitters, when separated sufficiently, can effectively average out the intensity varying effects of the atmospheric turbulence at the receiver. This research explores the impacts of adding more transmitters and the marginal reduction in the probability of signal fades while minimizing the overall transmitter footprint, an important design factor when considering an airborne communications system. Analytical results for the cumulative distribution function are obtained for tilt-only results, while wave-optics simulations are used to simulate the effects of scintillation. These models show that the probability of signal fade is reduced as the number of transmitters is increased.
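
    The averaging argument can be illustrated with a hedged Monte Carlo sketch: model each transmitter's received intensity as lognormal (a common weak-turbulence assumption), treat sufficiently separated apertures as independent, and count a fade when the summed intensity drops below a threshold. Parameter values are illustrative only:

```python
import math
import random

def fade_probability(n_tx, sigma=0.5, threshold_db=-3.0, trials=100_000, seed=7):
    """Probability that total received power drops threshold_db below its mean."""
    rng = random.Random(seed)
    threshold = n_tx * 10 ** (threshold_db / 10)   # mean intensity per beam is 1
    fades = 0
    for _ in range(trials):
        total = sum(math.exp(rng.gauss(-sigma ** 2 / 2, sigma))
                    for _ in range(n_tx))
        fades += total < threshold
    return fades / trials

for n in (1, 2, 4):
    print(n, fade_probability(n))   # fade probability falls as beams are added
```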

  1. Experiences of Intimate Partner and Neighborhood Violence and Their Association With Mental Health in Pregnant Women.

    PubMed

    Barcelona de Mendoza, Veronica; Harville, Emily W; Savage, Jane; Giarratano, Gloria

    2018-03-01

    Both intimate partner violence and neighborhood crime have been associated with worse mental health outcomes, but less is known about cumulative effects. This association was studied in a sample of pregnant women who were enrolled in a study of disaster exposure, prenatal care, and mental and physical health outcomes between 2010 and 2012. Women were interviewed about their exposure to intimate partner violence and perceptions of neighborhood safety, crime, and disorder. Main study outcomes included symptoms of poor mental health, including depression, pregnancy-specific anxiety (PA), and posttraumatic stress disorder (PTSD). Logistic regression was used to examine predictors of mental health with adjustment for confounders. Women who experienced high levels of intimate partner violence and perceived neighborhood violence had increased odds of probable depression in individual models. Weighted high cumulative (intimate partner and neighborhood) experiences of violence were also associated with increased odds of having probable depression when compared with those with low violence. Weighted high cumulative violence was also associated with increased odds of PTSD. This study provides additional evidence that cumulative exposure to violence is associated with poorer mental health in pregnant women.

  2. Structural bias in the sentencing of felony defendants.

    PubMed

    Sutton, John R

    2013-09-01

    As incarceration rates have risen in the US, so has the overrepresentation of African Americans and Latinos among prison inmates. Whether and to what degree these disparities are due to bias in the criminal courts remains a contentious issue. This article pursues two lines of argument toward a structural account of bias in the criminal law, focusing on (1) cumulative disadvantages that may accrue over successive stages of the criminal justice process, and (2) the contexts of racial disadvantage in which courts are embedded. These arguments are tested using case-level data on male defendants charged with felony crimes in urban US counties in 2000. Multilevel binary and ordinal logit models are used to estimate contextual effects on pretrial detention, guilty pleas, and sentence severity, and cumulative effects are estimated as conditional probabilities that are allowed to vary by race across all three outcomes. Results yield strong, but qualified, evidence of cumulative disadvantage accruing to black and Latino defendants, but do not support the contextual hypotheses. When the cumulative effects of bias are taken into account, the estimated probability of the average African American or Latino felon going to prison is 26% higher than that of the average Anglo. Copyright © 2013 Elsevier Inc. All rights reserved.
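
    The compounding at the heart of cumulative disadvantage can be illustrated with hypothetical stage probabilities (not estimates from the article): modest per-stage disparities multiply into a large gap in the probability of imprisonment.

```python
# Hypothetical conditional probabilities at three successive stages
p_group_a = [0.40, 0.60, 0.50]   # pretrial detention, guilty plea, prison
p_group_b = [0.46, 0.64, 0.55]   # slightly higher at every stage

prob_a = prob_b = 1.0
for pa, pb in zip(p_group_a, p_group_b):
    prob_a *= pa
    prob_b *= pb

print(round(prob_a, 3), round(prob_b, 3))          # → 0.12 0.162
print(f"relative gap: {prob_b / prob_a - 1:.0%}")  # → relative gap: 35%
```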

  3. Distributed Immune Systems for Wireless Network Information Assurance

    DTIC Science & Technology

    2010-04-26

    ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the ... using cumulative sum (CUSUM) and Girshik-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ... the more complicated problems, in particular those where no clear mean can be established. We developed algorithms based on the sequential probability

  4. N -tag probability law of the symmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Poncet, Alexis; Bénichou, Olivier; Démery, Vincent; Oshanin, Gleb

    2018-06-01

    The symmetric exclusion process (SEP), in which particles hop symmetrically on a discrete line with hard-core constraints, is a paradigmatic model of subdiffusion in confined systems. This anomalous behavior is a direct consequence of strong spatial correlations induced by the requirement that the particles cannot overtake each other. Even if this fact has been recognized qualitatively for a long time, up to now there has been no full quantitative determination of these correlations. Here we study the joint probability distribution of an arbitrary number of tagged particles in the SEP. We determine analytically its large-time limit for an arbitrary density of particles, and its full dynamics in the high-density limit. In this limit, we obtain the time-dependent large deviation function of the problem and unveil a universal scaling form shared by the cumulants.
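
    The subdiffusive behavior described above can be seen in a minimal Monte Carlo sketch of the SEP on a ring (an illustration only; the paper's results are analytic):

```python
import random

def sep_msd(n_sites=100, n_part=50, attempts=2000, trials=300, seed=3):
    """Mean squared displacement of a tagged SEP particle on a ring."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        occ = [True] * n_part + [False] * (n_sites - n_part)
        rng.shuffle(occ)
        tag = occ.index(True)          # tag one particle
        x = 0                          # its signed net displacement
        for _ in range(attempts):
            i = rng.randrange(n_sites)
            d = rng.choice((-1, 1))
            j = (i + d) % n_sites
            if occ[i] and not occ[j]:  # hop only if the target site is empty
                occ[i], occ[j] = False, True
                if i == tag:
                    tag, x = j, x + d
        total += x * x
    return total / trials

# At half filling the tagged particle's MSD grows much slower than the
# free-particle (linear-in-time) law, the signature of subdiffusion.
print(round(sep_msd(), 2))
```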

  5. Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.

    PubMed

    Trepel, Christopher; Fox, Craig R; Poldrack, Russell A

    2005-04-01

    Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage toward a fuller understanding of the cognitive neuroscience of decision making.
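
    The value and weighting functions described above can be written down directly. The sketch below uses the power-law value function and single-parameter weighting function of cumulative prospect theory, with the median parameter estimates commonly attributed to Tversky and Kahneman (1992); the specific numbers are those published estimates, not values from this article.

```python
def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function: concave for gains (x**alpha), convex and steeper
    for losses (loss-aversion coefficient lam > 1)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def cpt_weight(p, gamma=0.61):
    """Probability weighting function: overweights low probabilities and
    underweights moderate-to-high probabilities."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
```

    For example, a small probability such as 0.01 receives a decision weight several times larger than 0.01, while a probability of 0.9 is weighted below 0.9, reproducing the characteristic inverse-S shape.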

  6. Multiple imputation methods for nonparametric inference on cumulative incidence with missing cause of failure

    PubMed Central

    Lee, Minjung; Dignam, James J.; Han, Junhee

    2014-01-01

    We propose a nonparametric approach for cumulative incidence estimation when causes of failure are unknown or missing for some subjects. Under the missing at random assumption, we estimate the cumulative incidence function using multiple imputation methods. We develop asymptotic theory for the cumulative incidence estimators obtained from multiple imputation methods. We also discuss how to construct confidence intervals for the cumulative incidence function and perform a test for comparing the cumulative incidence functions in two samples with missing cause of failure. Through simulation studies, we show that the proposed methods perform well. The methods are illustrated with data from a randomized clinical trial in early stage breast cancer. PMID:25043107

  7. Cost-Utility of Stepped Care Targeting Psychological Distress in Patients With Head and Neck or Lung Cancer.

    PubMed

    Jansen, Femke; Krebber, Anna M H; Coupé, Veerle M H; Cuijpers, Pim; de Bree, Remco; Becker-Commissaris, Annemarie; Smit, Egbert F; van Straten, Annemieke; Eeckhout, Guus M; Beekman, Aartjan T F; Leemans, C René; Verdonck-de Leeuw, Irma M

    2017-01-20

    Purpose A stepped care (SC) program in which an effective yet least resource-intensive treatment is delivered to patients first and followed, when necessary, by more resource-intensive treatments was found to be effective in improving distress levels of patients with head and neck cancer or lung cancer. Information on the value of this program for its cost is now called for. Therefore, this study aimed to assess the cost-utility of the SC program compared with care-as-usual (CAU) in patients with head and neck cancer or lung cancer who have psychological distress. Patients and Methods In total, 156 patients were randomly assigned to SC or CAU. Intervention costs, direct medical costs, direct nonmedical costs, productivity losses, and health-related quality-of-life data during the intervention or control period and 12 months of follow-up were calculated by using Trimbos and Institute of Medical Technology Assessment Cost Questionnaire for Psychiatry, Productivity and Disease Questionnaire, and EuroQol-5 Dimension measures and data from the hospital information system. The SC program's value for the cost was investigated by comparing mean cumulative costs and quality-adjusted life years (QALYs). Results After imputation of missing data, mean cumulative costs were -€3,950 (95% CI, -€8,158 to -€190) lower, and mean number of QALYs was 0.116 (95% CI, 0.005 to 0.227) higher in the intervention group compared with the control group. The intervention group had a probability of 96% that cumulative QALYs were higher and cumulative costs were lower than in the control group. Four additional analyses were conducted to assess the robustness of this finding, and they found that the intervention group had a probability of 84% to 98% that cumulative QALYs were higher and a probability of 91% to 99% that costs were lower than in the control group. 
Conclusion SC is highly likely to be cost-effective; the number of QALYs was higher and cumulative costs were lower for SC compared with CAU.

  8. Finite-size scaling for discontinuous nonequilibrium phase transitions

    NASA Astrophysics Data System (ADS)

    de Oliveira, Marcelo M.; da Luz, M. G. E.; Fiore, Carlos E.

    2018-06-01

    A finite-size scaling theory, originally developed only for transitions to absorbing states [Phys. Rev. E 92, 062126 (2015), 10.1103/PhysRevE.92.062126], is extended to distinct sorts of discontinuous nonequilibrium phase transitions. Expressions for quantities such as response functions, reduced cumulants, and equal area probability distributions are derived from phenomenological arguments. Irrespective of system details, all these quantities scale with the volume, establishing the dependence on size. The approach generality is illustrated through the analysis of different models. The present results are a relevant step in trying to unify the scaling behavior description of nonequilibrium transition processes.

  9. Setting cumulative emissions targets to reduce the risk of dangerous climate change

    PubMed Central

    Zickfeld, Kirsten; Eby, Michael; Matthews, H. Damon; Weaver, Andrew J.

    2009-01-01

    Avoiding “dangerous anthropogenic interference with the climate system” requires stabilization of atmospheric greenhouse gas concentrations and substantial reductions in anthropogenic emissions. Here, we present an inverse approach to coupled climate-carbon cycle modeling, which allows us to estimate the probability that any given level of carbon dioxide (CO2) emissions will exceed specified long-term global mean temperature targets for “dangerous anthropogenic interference,” taking into consideration uncertainties in climate sensitivity and the carbon cycle response to climate change. We show that to stabilize global mean temperature increase at 2 °C above preindustrial levels with a probability of at least 0.66, cumulative CO2 emissions from 2000 to 2500 must not exceed a median estimate of 590 petagrams of carbon (PgC) (range, 200 to 950 PgC). If the 2 °C temperature stabilization target is to be met with a probability of at least 0.9, median total allowable CO2 emissions are 170 PgC (range, −220 to 700 PgC). Furthermore, these estimates of cumulative CO2 emissions, compatible with a specified temperature stabilization target, are independent of the path taken to stabilization. Our analysis therefore supports an international policy framework aimed at avoiding dangerous anthropogenic interference formulated on the basis of total allowable greenhouse gas emissions. PMID:19706489

  10. Setting cumulative emissions targets to reduce the risk of dangerous climate change.

    PubMed

    Zickfeld, Kirsten; Eby, Michael; Matthews, H Damon; Weaver, Andrew J

    2009-09-22

    Avoiding "dangerous anthropogenic interference with the climate system" requires stabilization of atmospheric greenhouse gas concentrations and substantial reductions in anthropogenic emissions. Here, we present an inverse approach to coupled climate-carbon cycle modeling, which allows us to estimate the probability that any given level of carbon dioxide (CO2) emissions will exceed specified long-term global mean temperature targets for "dangerous anthropogenic interference," taking into consideration uncertainties in climate sensitivity and the carbon cycle response to climate change. We show that to stabilize global mean temperature increase at 2 degrees C above preindustrial levels with a probability of at least 0.66, cumulative CO2 emissions from 2000 to 2500 must not exceed a median estimate of 590 petagrams of carbon (PgC) (range, 200 to 950 PgC). If the 2 degrees C temperature stabilization target is to be met with a probability of at least 0.9, median total allowable CO2 emissions are 170 PgC (range, -220 to 700 PgC). Furthermore, these estimates of cumulative CO2 emissions, compatible with a specified temperature stabilization target, are independent of the path taken to stabilization. Our analysis therefore supports an international policy framework aimed at avoiding dangerous anthropogenic interference formulated on the basis of total allowable greenhouse gas emissions.

  11. Characterization of intermittency in zooplankton behaviour in turbulence.

    PubMed

    Michalec, François-Gaël; Schmitt, François G; Souissi, Sami; Holzner, Markus

    2015-10-01

    We consider Lagrangian velocity differences of zooplankters swimming in still water and in turbulence. Using cumulants, we quantify the intermittency properties of their motion recorded using three-dimensional particle tracking velocimetry. Copepods swimming in still water display an intermittent behaviour characterized by a high probability of small velocity increments, and by stretched exponential tails. Low values arise from their steady cruising behaviour while heavy tails result from frequent relocation jumps. In turbulence, we show that at short time scales, the intermittency signature of active copepods clearly differs from that of the underlying flow, and reflects the frequent relocation jumps displayed by these small animals. Despite these differences, we show that copepods swimming in still and turbulent flow belong to the same intermittency class that can be modelled by a log-stable model with non-analytical cumulant generating function. Intermittency in swimming behaviour and relocation jumps may enable copepods to display oriented, collective motion under strong hydrodynamic conditions and thus, may contribute to the formation of zooplankton patches in energetic environments.

  12. Reoccupation of floodplains by rivers and its relation to the age structure of floodplain vegetation

    USGS Publications Warehouse

    Konrad, Christopher P.

    2012-01-01

    River channel dynamics over many decades provide a physical control on the age structure of floodplain vegetation as a river occupies and abandons locations. Floodplain reoccupation by a river, in particular, determines the interval of time during which vegetation can establish and mature. A general framework for analyzing floodplain reoccupation and a time series model are developed and applied to five alluvial rivers in the United States. Channel dynamics in these rivers demonstrate time-scale dependence with short-term oscillation in active channel area in response to floods and subsequent vegetation growth and progressive lateral movement that accounts for much of the cumulative area occupied by the rivers over decades. Rivers preferentially reoccupy locations recently abandoned causing a decreasing probability of reoccupation with time since abandonment. For a typical case, a river is 10 times more likely to reoccupy an area it abandoned in the past decade than it is to reoccupy an area it abandoned 30 yrs ago. The decreasing probability of reoccupation over time is consistent with observations of persistent stands of late seral stage floodplain forest. A power function provides a robust approach for estimating the cumulative area occupied by a river and the age structure of riparian forests resulting from a specific historical sequence of streamflow in comparison to either linear or exponential alternatives.

  13. High School Employment, School Performance, and College Entry

    ERIC Educational Resources Information Center

    Lee, Chanyoung; Orazem, Peter F.

    2010-01-01

    The proportion of U.S. high school students working during the school year ranges from 23% in the freshman year to 75% in the senior year. This study estimates how cumulative work histories during the high school years affect probability of dropout, high school academic performance, and the probability of attending college. Variations in…

  14. Description of quasiparticle and satellite properties via cumulant expansions of the retarded one-particle Green's function

    DOE PAGES

    Mayers, Matthew Z.; Hybertsen, Mark S.; Reichman, David R.

    2016-08-22

    A cumulant-based GW approximation for the retarded one-particle Green's function is proposed, motivated by an exact relation between the improper Dyson self-energy and the cumulant generating function. We explore qualitative aspects of this method within a simple one-electron independent phonon model, where it is seen that the method preserves the energy moment of the spectral weight while also reproducing the exact Green's function in the weak-coupling limit. For the three-dimensional electron gas, this method predicts multiple satellites at the bottom of the band, albeit with inaccurate peak spacing. However, its quasiparticle properties and correlation energies are more accurate than both previous cumulant methods and standard G0W0. These results point to features that may be exploited within the framework of cumulant-based methods and suggest promising directions for future exploration and improvements of cumulant-based GW approaches.

  15. A statistical study of gyro-averaging effects in a reduced model of drift-wave transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonseca, Julio; Del-Castillo-Negrete, Diego B.; Sokolov, Igor M.

    2016-08-25

    Here, a statistical study of finite Larmor radius (FLR) effects on transport driven by electrostatic drift waves is presented. The study is based on a reduced discrete Hamiltonian dynamical system known as the gyro-averaged standard map (GSM). In this system, FLR effects are incorporated through the gyro-averaging of a simplified weak-turbulence model of electrostatic fluctuations. Formally, the GSM is a modified version of the standard map in which the perturbation amplitude, K0, becomes K0 J0($$\hat{p}$$), where J0 is the zeroth-order Bessel function and $$\hat{p}$$ is the Larmor radius. Assuming a Maxwellian probability density function (pdf) for $$\hat{p}$$, we compute analytically and numerically the pdf and the cumulative distribution function of the effective drift-wave perturbation amplitude K0 J0($$\hat{p}$$). Using these results, we compute the probability of loss of confinement (i.e., global chaos), Pc, which provides an upper bound for the escape rate, and the probability of trapping, Pt, which provides a good estimate of the particle trapping rate. Lastly, the analytical results are compared with direct numerical Monte Carlo simulations of particle transport.
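
    As a rough illustration of the map structure described above (not the authors' code), the sketch below iterates a standard map whose kick strength K0 is reduced by the zeroth-order Bessel factor J0 evaluated at a fixed Larmor radius; J0 is computed from its integral representation so the example stays self-contained.

```python
import math

def bessel_j0(x, n=200):
    """Zeroth-order Bessel function via the integral representation
    J0(x) = (1/pi) * integral_0^pi cos(x sin t) dt (midpoint rule; the
    integrand is periodic, so convergence is very fast)."""
    h = math.pi / n
    return sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(n)) * h / math.pi

def gsm_orbit(theta0, p0, k0, rho, n_steps=100):
    """Iterate the gyro-averaged standard map: the usual standard map with
    the perturbation amplitude K0 replaced by the FLR-reduced K0*J0(rho)."""
    k_eff = k0 * bessel_j0(rho)
    theta, p = theta0 % (2 * math.pi), p0
    orbit = [(theta, p)]
    for _ in range(n_steps):
        p = p + k_eff * math.sin(theta)
        theta = (theta + p) % (2 * math.pi)
        orbit.append((theta, p))
    return orbit
```

    Because J0 oscillates and decays, sampling the Larmor radius from a Maxwellian spreads the effective kick strength K0*J0 over a distribution, which is the quantity whose pdf and CDF the study characterizes.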

  16. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.

  17. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual, appendix 2

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN programs RANDOM3 and RANDOM4 are documented. They are based on fatigue strength reduction, using a probabilistic constitutive model, and predict the random lifetime of an engine component to reach a given fatigue strength. This user manual includes details on the theoretical background of RANDOM3 and RANDOM4. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendices B and C include photocopies of the actual computer printout corresponding to the sample problems. Appendices D and E detail the IMSL, Version 10(1), subroutines and functions called by RANDOM3 and RANDOM4, and the SAS/GRAPH(2) programs that can be used to plot both the probability density functions (p.d.f.) and the cumulative distribution functions (c.d.f.).

  18. Childhood Cumulative Risk Exposure and Adult Amygdala Volume and Function

    PubMed Central

    Evans, Gary W.; Swain, James E.; King, Anthony P.; Wang, Xin; Javanbakht, Arash; Ho, S. Shaun; Angstadt, Michael; Phan, K. Luan; Xie, Hong; Liberzon, Israel

    2015-01-01

    Considerable work indicates that early cumulative risk exposure is aversive to human development, but very little research has examined neurological underpinnings of these robust findings. We investigated amygdala volume and reactivity to facial stimuli among adults (M = 23.7 years, n = 54) as a function of cumulative risk exposure during childhood (ages 9 and 13). In addition, we tested whether expected, cumulative risk elevations in amygdala volume would mediate functional reactivity of the amygdala during socio-emotional processing. Risks included substandard housing quality, noise, crowding, family turmoil, child separation from family, and violence. Total and left hemisphere adult amygdala volumes, respectively, were positively related to cumulative risk exposure during childhood. The links between childhood cumulative risk exposure and elevated amygdala responses to emotionally neutral facial stimuli in adulthood were mediated by the respective amygdala volumes. Cumulative risk exposure in later adolescence (17 years), however, was unrelated to subsequent, adult amygdala volume or function. Physical and socioemotional risk exposures early in life appear to alter amygdala development, rendering adults more reactive to ambiguous stimuli such as neutral faces. These stress-related differences in childhood amygdala development might contribute to well-documented psychological distress as a function of early risk exposure. PMID:26469872

  19. Dynamic Response of an Optomechanical System to a Stationary Random Excitation in the Time Domain

    DOE PAGES

    Palmer, Jeremy A.; Paez, Thomas L.

    2011-01-01

    Modern electro-optical instruments are typically designed with assemblies of optomechanical members that support optics such that alignment is maintained in service environments that include random vibration loads. This paper presents a nonlinear numerical analysis that calculates statistics for the peak lateral response of optics in an optomechanical sub-assembly subject to random excitation of the housing. The work is unique in that the prior art does not address peak response probability distribution for stationary random vibration in the time domain for a common lens-retainer-housing system with Coulomb damping. Analytical results are validated by using displacement response data from random vibration testing of representative prototype sub-assemblies. A comparison of predictions to experimental results yields reasonable agreement. The Type I Asymptotic form provides the cumulative distribution function for peak response probabilities. Probabilities are calculated for actual lens centration tolerances. The probability that peak response will not exceed the centration tolerance is greater than 80% for prototype configurations where the tolerance is high (on the order of 30 micrometers). Conversely, the probability is low for those where the tolerance is less than 20 micrometers. The analysis suggests a design paradigm based on the influence of lateral stiffness on the magnitude of the response.
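
    The Type I Asymptotic form mentioned above is the Gumbel distribution of the largest extreme value; evaluating its CDF at the centration tolerance gives the probability that the peak response stays within tolerance. In this hedged sketch the location and scale parameters (in micrometers) are hypothetical placeholders, not values fitted in the paper.

```python
import math

def gumbel_cdf(x, mu, beta):
    """Type I Asymptotic (Gumbel) CDF for the largest extreme value:
    F(x) = exp(-exp(-(x - mu)/beta))."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Probability that the peak lateral response stays within a 30-micrometer
# centration tolerance, for illustrative (assumed) location/scale parameters.
p_within = gumbel_cdf(30.0, mu=20.0, beta=4.0)
```

    With these placeholder parameters the probability comes out above 0.8, qualitatively matching the high-tolerance case reported above; a tighter tolerance (e.g., below 20 micrometers) drives the probability down sharply.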

  20. Performance of multi-hop parallel free-space optical communication over gamma-gamma fading channel with pointing errors.

    PubMed

    Gao, Zhengguang; Liu, Hongzhan; Ma, Xiaoping; Lu, Wei

    2016-11-10

    Multi-hop parallel relaying is considered in a free-space optical (FSO) communication system deploying binary phase-shift keying (BPSK) modulation under the combined effects of a gamma-gamma (GG) distribution and misalignment fading. Based on the best-path selection criterion, the cumulative distribution function (CDF) of this cooperative random variable is derived, and the performance of the optical mesh network is analyzed in detail. A Monte Carlo simulation is also conducted to validate the results for the average bit error rate (ABER) and outage probability. The numerical results show that the multi-hop parallel network requires a smaller average transmitted optical power to achieve the same ABER and outage probability in FSO links, and that using more hops and cooperative paths further improves communication quality.
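
    Under best-path selection, the CDF of the selected channel gain is the single-path CDF raised to the power of the number of independent paths, since the selected gain falls below x only if every path's gain does. The sketch below checks this identity with a stand-in exponential channel model; the paper's gamma-gamma model with pointing errors is replaced here only to keep the example self-contained.

```python
import math
import random

def best_path_cdf(single_cdf, x, n_paths):
    """CDF of the channel gain after best-path selection among n_paths
    i.i.d. paths: P(max gain <= x) = F(x)**n_paths."""
    return single_cdf(x) ** n_paths

def exp_cdf(x, scale=1.0):
    """Stand-in single-path CDF (unit-mean exponential), an assumption for
    this sketch rather than the paper's gamma-gamma/pointing-error model."""
    return 1.0 - math.exp(-x / scale) if x >= 0 else 0.0

# Monte Carlo sanity check for selection among 3 parallel paths.
rng = random.Random(0)
n = 20000
hits = sum(max(rng.expovariate(1.0) for _ in range(3)) <= 1.0 for _ in range(n))
empirical = hits / n
```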

  1. Prospect theory in the health domain: a quantitative assessment.

    PubMed

    Attema, Arthur E; Brouwer, Werner B F; l'Haridon, Olivier

    2013-12-01

    It is well-known that expected utility (EU) has empirical deficiencies. Cumulative prospect theory (CPT) has developed as an alternative with more descriptive validity. However, CPT's full function had not yet been quantified in the health domain. This paper is therefore the first to simultaneously measure utility of life duration, probability weighting, and loss aversion in this domain. We observe loss aversion and risk aversion for gains and losses, which for gains can be explained by probabilistic pessimism. Utility for gains is almost linear. For losses, we find less weighting of probability 1/2 and concave utility. This contrasts with the common finding of convex utility for monetary losses. However, CPT was proposed to explain choices among lotteries involving monetary outcomes. Life years are arguably very different from monetary outcomes and need not generate convex utility for losses. Moreover, utility of life duration reflects discounting, causing concave utility. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Rapidity dependence of proton cumulants and correlation functions

    DOE PAGES

    Bzdak, Adam; Koch, Volker

    2017-11-13

    The dependence of multiproton correlation functions and cumulants on the acceptance in rapidity and transverse momentum is studied. We find that the preliminary data on various cumulant ratios are consistent, within errors, with rapidity- and transverse-momentum-independent correlation functions, although rapidity correlations that increase moderately with the rapidity separation between protons are slightly favored. We propose to explore the rapidity dependence of multiparticle correlation functions further by measuring the integrated reduced correlation functions as a function of the size of the rapidity window.

  3. Rapidity dependence of proton cumulants and correlation functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bzdak, Adam; Koch, Volker

    The dependence of multiproton correlation functions and cumulants on the acceptance in rapidity and transverse momentum is studied. We find that the preliminary data on various cumulant ratios are consistent, within errors, with rapidity- and transverse-momentum-independent correlation functions, although rapidity correlations that increase moderately with the rapidity separation between protons are slightly favored. We propose to explore the rapidity dependence of multiparticle correlation functions further by measuring the integrated reduced correlation functions as a function of the size of the rapidity window.

  4. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
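
    The core CPA computation is simple enough to sketch without the Excel/R machinery: an empirical CDF plus the conditional probability of observing a degraded response given that the stressor exceeds a threshold. This is a minimal illustration of the idea, not CProb itself.

```python
def conditional_probability(stressor, response, x_cut, y_cut):
    """P(response >= y_cut | stressor >= x_cut): the association between a
    stressor threshold and ecological condition that CPA tabulates across
    candidate thresholds."""
    selected = [r for s, r in zip(stressor, response) if s >= x_cut]
    if not selected:
        return float("nan")  # no observations above the stressor threshold
    return sum(r >= y_cut for r in selected) / len(selected)

def empirical_cdf(sample, x):
    """Empirical cumulative distribution function at x."""
    return sum(v <= x for v in sample) / len(sample)
```

    Sweeping x_cut over the observed stressor range and plotting the resulting conditional probabilities (alongside the scatterplot and empirical CDFs) reproduces the three displays the tool generates.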

  5. Estimating the burden of recurrent events in the presence of competing risks: the method of mean cumulative count.

    PubMed

    Dong, Huiru; Robison, Leslie L; Leisenring, Wendy M; Martin, Leah J; Armstrong, Gregory T; Yasui, Yutaka

    2015-04-01

    Cumulative incidence has been widely used to estimate the cumulative probability of developing an event of interest by a given time, in the presence of competing risks. When it is of interest to measure the total burden of recurrent events in a population, however, the cumulative incidence method is not appropriate because it considers only the first occurrence of the event of interest for each individual in the analysis: Subsequent occurrences are not included. Here, we discuss a straightforward and intuitive method termed "mean cumulative count," which reflects a summarization of all events that occur in the population by a given time, not just the first event for each subject. We explore the mathematical relationship between mean cumulative count and cumulative incidence. Detailed calculation of mean cumulative count is described by using a simple hypothetical example, and the computation code with an illustrative example is provided. Using follow-up data from January 1975 to August 2009 collected in the Childhood Cancer Survivor Study, we show applications of mean cumulative count and cumulative incidence for the outcome of subsequent neoplasms to demonstrate different but complementary information obtained from the 2 approaches and the specific utility of the former. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
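
    The distinction drawn above can be made concrete with a deliberately simplified sketch that assumes complete follow-up; the published method additionally handles censoring and competing risks through risk sets, which this illustration omits.

```python
def mean_cumulative_count(event_times_per_subject, t):
    """Mean cumulative count at time t: average number of events per subject
    by time t, counting recurrences. Simplified: assumes complete follow-up
    (no censoring or competing-risk adjustment)."""
    n = len(event_times_per_subject)
    total = sum(sum(1 for e in times if e <= t) for times in event_times_per_subject)
    return total / n

def cumulative_incidence_naive(event_times_per_subject, t):
    """Naive cumulative incidence: fraction of subjects with at least one
    event by time t (first events only)."""
    n = len(event_times_per_subject)
    return sum(1 for times in event_times_per_subject if times and min(times) <= t) / n
```

    For three subjects with event times [[1, 3], [2], []], the cumulative incidence by t = 5 is 2/3 (two subjects ever had an event), while the mean cumulative count is 3/3 = 1.0 because the first subject's recurrence is also counted; the difference is exactly the recurrent-event burden that cumulative incidence misses.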

  6. Interval Estimation of Seismic Hazard Parameters

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw

    2017-03-01

    The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model of magnitude distribution is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When the nonparametric kernel estimation of magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation based on smoothed bootstrap and second-order bootstrap samples. The changes resulting from the integrated approach to the interval estimation of the seismic hazard functions, relative to the approach that neglects the uncertainty of the mean activity rate estimates, have been studied using Monte Carlo simulations and two real-dataset examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product becomes greater than 5.0, the impact of the uncertainty of the cumulative distribution function of magnitude dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of the uncertainty of estimates that are parameters of a multiparameter function onto this function.
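
    The two hazard functions named above have closed forms under Poisson occurrence: with activity rate lambda and magnitude CDF F, the rate of events of magnitude at least m is lambda*(1 - F(m)), so the exceedance probability over time t is 1 - exp(-lambda*t*(1 - F(m))) and the mean return period is its reciprocal rate. The sketch below uses an unbounded Gutenberg-Richter magnitude CDF; the completeness magnitude m0 and the parameter beta are illustrative placeholders.

```python
import math

def gr_cdf(m, m0=2.0, beta=2.0):
    """Unbounded Gutenberg-Richter magnitude CDF above completeness level m0
    (beta = b * ln 10); the truncated variant adds an upper magnitude bound."""
    return 1.0 - math.exp(-beta * (m - m0)) if m >= m0 else 0.0

def exceedance_probability(m, rate, t, magnitude_cdf=gr_cdf):
    """Probability of at least one earthquake of magnitude >= m within time t,
    assuming Poisson occurrence with mean activity rate `rate`."""
    lam = rate * t * (1.0 - magnitude_cdf(m))
    return 1.0 - math.exp(-lam)

def mean_return_period(m, rate, magnitude_cdf=gr_cdf):
    """Mean return period of magnitude >= m events."""
    return 1.0 / (rate * (1.0 - magnitude_cdf(m)))
```

    Propagating interval estimates of the rate and of the magnitude CDF parameters through these two functions is exactly the integration step the paper formalizes.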

  7. Work Disability among Women: The Role of Divorce in a Retrospective Cohort Study.

    PubMed

    Tamborini, Christopher R; Reznik, Gayle L; Couch, Kenneth A

    2016-03-01

    We assess how divorce through midlife affects the subsequent probability of work-limiting health among U.S. women. Using retrospective marital and work disability histories from the Survey of Income and Program Participation matched to Social Security earnings records, we identify women whose first marriage dissolved between 1975 and 1984 (n = 1,214) and women who remain continuously married (n = 3,394). Probit and propensity score matching models examine the cumulative probability of a work disability over a 20-year follow-up period. We find that divorce is associated with a significantly higher cumulative probability of a work disability, controlling for a range of factors. This association is strongest among divorced women who do not remarry. No consistent relationships are observed among divorced women who remarry and remained married. We find that economic hardship, work history, and selection into divorce influence, but do not substantially alter, the lasting impact of divorce on work-limiting health. © American Sociological Association 2016.

  8. Development of a European Ensemble System for Seasonal Prediction: Application to crop yield

    NASA Astrophysics Data System (ADS)

    Terres, J. M.; Cantelaube, P.

    2003-04-01

    Western European agriculture is highly intensive, and the weather is the main source of uncertainty for crop yield assessment and for crop management. In the current system, at the time when a crop yield forecast is issued, the weather conditions leading up to harvest time are unknown and are therefore a major source of uncertainty. The use of seasonal weather forecasts would bring additional information for the remaining crop season and would have valuable benefits for improving the management of agricultural markets and environmentally sustainable farm practices. An innovative method for supplying seasonal forecast information to crop simulation models has been developed in the framework of the EU-funded research project DEMETER. It consists of running a crop model on each individual member of the seasonal hindcasts to derive a probability distribution of crop yield. Preliminary results for the cumulative probability function of wheat yield provide information on both the yield anomaly and the reliability of the forecast. Based on the spread of the probability distribution, the end-user can directly quantify the benefits and risks of taking weather-sensitive decisions.
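    The ensemble-to-probability step described above can be sketched as follows; the yields stand in for the crop-model output obtained from each seasonal-hindcast ensemble member, and both the values and the reference level are hypothetical.

```python
def empirical_cdf(samples):
    """Step-function CDF built from an ensemble of values."""
    xs = sorted(samples)
    n = len(xs)
    def cdf(x):
        # fraction of ensemble members at or below x
        return sum(1 for v in xs if v <= x) / n
    return cdf

# Hypothetical wheat yields (t/ha), one per seasonal-hindcast ensemble member:
yields = [6.1, 5.8, 6.4, 5.2, 6.0, 5.9, 6.3, 5.5, 6.2, 5.7]
cdf = empirical_cdf(yields)
p_low_yield = cdf(5.8)  # probability of a yield at or below a 5.8 t/ha reference
```

    The spread of the ensemble translates directly into how steeply this CDF rises, which is what lets the end-user read off both the anomaly and the forecast reliability.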

  9. Childhood Cumulative Risk Exposure and Adult Amygdala Volume and Function.

    PubMed

    Evans, Gary W; Swain, James E; King, Anthony P; Wang, Xin; Javanbakht, Arash; Ho, S Shaun; Angstadt, Michael; Phan, K Luan; Xie, Hong; Liberzon, Israel

    2016-06-01

    Considerable work indicates that early cumulative risk exposure is aversive to human development, but very little research has examined the neurological underpinnings of these robust findings. This study investigates amygdala volume and reactivity to facial stimuli among adults (mean 23.7 years of age, n = 54) as a function of cumulative risk exposure during childhood (9 and 13 years of age). In addition, we test whether the expected cumulative risk-related elevations in amygdala volume mediate functional reactivity of the amygdala during socioemotional processing. Risks included substandard housing quality, noise, crowding, family turmoil, child separation from family, and violence. Total and left hemisphere adult amygdala volumes were positively related to cumulative risk exposure during childhood. The links between childhood cumulative risk exposure and elevated amygdala responses to emotionally neutral facial stimuli in adulthood were mediated by the corresponding amygdala volumes. Cumulative risk exposure in later adolescence (17 years of age), however, was unrelated to subsequent adult amygdala volume or function. Physical and socioemotional risk exposures early in life appear to alter amygdala development, rendering adults more reactive to ambiguous stimuli such as neutral faces. These stress-related differences in childhood amygdala development might contribute to the well-documented psychological distress as a function of early risk exposure. © 2015 Wiley Periodicals, Inc.

  10. Variability of daily UV index in Jokioinen, Finland, in 1995-2015

    NASA Astrophysics Data System (ADS)

    Heikkilä, A.; Uusitalo, K.; Kärhä, P.; Vaskuri, A.; Lakkala, K.; Koskela, T.

    2017-02-01

    The UV Index is a measure of UV radiation harmful to human skin, developed and used to promote sun awareness and protection among the public. Monitoring programs conducted around the world have produced a number of long-term time series of UV irradiance. One of the longest time series of solar spectral UV irradiance in Europe has been obtained from the continuous measurements of the Brewer #107 spectrophotometer in Jokioinen (lat. 60°44'N, lon. 23°30'E), Finland, over the years 1995-2015. We have used descriptive statistics and estimates of cumulative distribution functions, quantiles, and probability density functions in the analysis of the time series of daily UV Index maxima. Seasonal differences in the estimated distributions and in the trends of the estimated quantiles are found.
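    Quantile estimates of the kind used in this analysis can be read directly off a sample of daily maxima with the standard library; the values below are hypothetical stand-ins for one month of observations.

```python
import statistics

# Hypothetical daily UV index maxima for one month:
uvi = [4.2, 5.1, 6.0, 3.8, 5.5, 6.3, 4.9, 5.8, 6.1, 4.4, 5.0, 5.6]

# Quartiles of the empirical distribution; the 'inclusive' method treats
# the data as the whole population, keeping estimates inside the observed range.
q1, q2, q3 = statistics.quantiles(uvi, n=4, method='inclusive')
median = statistics.median(uvi)
```

    Tracking such quantiles season by season over 1995-2015 is what reveals the seasonal differences in the trends reported above.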

  11. Emergency Department Youth Patients With Suicidal Ideation or Attempts: Predicting Suicide Attempts Through 18 Months of Follow-Up.

    PubMed

    Rosenbaum Asarnow, Joan; Berk, Michele; Zhang, Lily; Wang, Peter; Tang, Lingqi

    2017-10-01

    This prospective study of suicidal emergency department (ED) patients (ages 10-18) examined the timing, cumulative probability, and predictors of suicide attempts through 18 months of follow-up. The cumulative probability of attempts was as follows: .15 at 6 months, .22 at 1 year, and .24 by 18 months. One attempt was fatal, yielding a death rate of .006. Significant predictors of suicide attempt risk included a suicide attempt at ED presentation (vs. suicidal ideation only), nonsuicidal self-injurious behavior, and low levels of delinquent symptoms. Results underscore the importance of both prior suicide attempts and nonsuicidal self-harm as risk indicators for future and potentially lethal suicide attempts. © 2016 The American Association of Suicidology.

  12. Fatigue crack growth model RANDOM2 user manual, appendix 1

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN program RANDOM2 is documented. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Included in this user manual are details regarding the theoretical background of RANDOM2, input data, instructions and a sample problem illustrating the use of RANDOM2. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendix B includes photocopies of the actual computer printout corresponding to the sample problem. Appendices C and D detail the IMSL, Ver. 10(1), subroutines and functions called by RANDOM2 and a SAS/GRAPH(2) program that can be used to plot both the probability density function (p.d.f.) and the cumulative distribution function (c.d.f.).

  13. Symmetry for the duration of entropy-consuming intervals.

    PubMed

    García-García, Reinaldo; Domínguez, Daniel

    2014-05-01

    We introduce the violation fraction υ as the cumulative fraction of time that a mesoscopic system spends consuming entropy at a single trajectory in phase space. We show that the fluctuations of this quantity are described by a symmetry relation, reminiscent of fluctuation theorems, which involves a function Φ that can be interpreted as an entropy associated with the fluctuations of the violation fraction. The function Φ, when evaluated for arbitrary stochastic realizations of the violation fraction, is odd under the symmetry transformations that are relevant for the associated stochastic entropy production. This fact leads to a detailed fluctuation theorem for the probability density function of Φ. We study the steady-state limit of this symmetry in the paradigmatic case of a colloidal particle dragged by optical tweezers through an aqueous solution. Finally, we briefly discuss possible applications of our results for the estimation of free-energy differences from single-molecule experiments.

  14. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that the weighting of known probabilities in decision making under risk might also be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007

  15. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    PubMed

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-07

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that the weighting of known probabilities in decision making under risk might also be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music (happy, sad, or no music, or sequences of random tones) and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.
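    The role of the elevation parameter can be illustrated with a two-parameter weighting function of the Goldstein-Einhorn form, where delta controls elevation and gamma controls curvature; this is a sketch of the general idea, not necessarily the exact CPT specification estimated in the study, and the parameter values are hypothetical.

```python
def weight(p, delta, gamma):
    """Two-parameter probability weighting function:
    w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma).
    `delta` shifts the curve up or down (elevation); `gamma` bends it (curvature)."""
    num = delta * p ** gamma
    return num / (num + (1.0 - p) ** gamma)

# A higher elevation yields larger decision weights at every interior probability,
# i.e. more weight on the larger payoff and hence riskier choices:
w_sad = weight(0.5, delta=0.7, gamma=0.6)
w_happy = weight(0.5, delta=1.1, gamma=0.6)
```

    Note that the endpoints are preserved (w(0) = 0, w(1) = 1); only the interior of the curve is lifted when elevation increases.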

  16. Probability and predictors of the cannabis gateway effect: A national study

    PubMed Central

    Secades-Villa, Roberto; Garcia-Rodríguez, Olaya; Jin, Chelsea, J.; Wang, Shuai; Blanco, Carlos

    2014-01-01

    Background While several studies have shown a high association between cannabis use and use of other illicit drugs, the predictors of progression from cannabis to other illicit drugs remain largely unknown. This study aims to estimate the cumulative probability of progression to illicit drug use among individuals with a lifetime history of cannabis use, and to identify predictors of progression from cannabis use to other illicit drug use. Methods Analyses were conducted on the sub-sample of participants in Wave 1 of the National Epidemiological Survey on Alcohol and Related Conditions (NESARC) who started cannabis use before using any other drug (n = 6,624). Estimated projections of the cumulative probability of progression from cannabis use to use of any other illegal drug in the general population were obtained by the standard actuarial method. Univariate and multivariable survival analyses with time-varying covariates were implemented to identify predictors of progression to any drug use. Results Lifetime cumulative probability estimates indicated that 44.7% of individuals with lifetime cannabis use progressed to other illicit drug use at some time in their lives. Several sociodemographic characteristics, internalizing and externalizing psychiatric disorders, and indicators of substance use severity predicted progression from cannabis use to other illicit drug use. Conclusion A large proportion of individuals who use cannabis go on to use other illegal drugs. The increased risk of progression from cannabis use to other illicit drug use among individuals with mental disorders underscores the importance of considering the benefits and adverse effects of changes in cannabis regulations and of developing prevention and treatment strategies directed at curtailing cannabis use in these populations. PMID:25168081
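    The standard actuarial (life-table) method cited above cumulates interval-conditional event probabilities, conventionally counting subjects censored within an interval as at risk for half of it; a minimal sketch with hypothetical yearly counts.

```python
def actuarial_cumulative_probability(intervals):
    """Life-table (actuarial) estimate of the cumulative probability of an event.
    `intervals` holds (n_events, n_censored, n_at_risk_at_start) per interval."""
    surviving = 1.0
    for events, censored, at_risk in intervals:
        effective = at_risk - censored / 2.0  # half-interval exposure for the censored
        q = events / effective                # conditional event probability
        surviving *= 1.0 - q
    return 1.0 - surviving

# Hypothetical yearly follow-up of 1000 individuals who used cannabis first:
table = [(120, 40, 1000), (90, 30, 840), (60, 20, 720)]
cum_prob = actuarial_cumulative_probability(table)
```

    Unlike a naive events-over-total ratio, this estimate remains valid when individuals drop out of observation before the event can occur.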

  17. Assessing Stress-Related Treatment Needs among Girls at Risk for Poor Functional Outcomes: The Impact of Cumulative Adversity, Criterion Traumas, and Non-Criterion Events

    PubMed Central

    Lansing, Amy E.; Plante, Wendy Y.; Beck, Audrey N.

    2016-01-01

    Despite growing recognition that cumulative adversity (total stressor exposure), including complex trauma, increases the risk for psychopathology and impacts development, assessment strategies lag behind: Trauma-related mental health needs (symptoms, functional impairment, maladaptive coping) are typically assessed in response to only one qualifying Criterion-A event. This is especially problematic for youth at-risk for health and academic disparities who experience cumulative adversity, including non-qualifying events (parental separations) which may produce more impairing symptomatology. Data from 118 delinquent girls demonstrate: 1) an average of 14 adverse Criterion-A and non-Criterion event exposures; 2) serious maladaptive coping strategies (self-injury) directly in response to cumulative adversity; 3) more cumulative adversity-related than worst-event related symptomatology and functional impairment; and 4) comparable symptomatology, but greater functional impairment, in response to non-Criterion events. These data support the evaluation of mental health needs in response to cumulative adversity for optimal identification and tailoring of services in high-risk populations to reduce disparities. PMID:27745922

  18. Assessing stress-related treatment needs among girls at risk for poor functional outcomes: The impact of cumulative adversity, criterion traumas, and non-criterion events.

    PubMed

    Lansing, Amy E; Plante, Wendy Y; Beck, Audrey N

    2017-05-01

    Despite growing recognition that cumulative adversity (total stressor exposure, including complex trauma), increases the risk for psychopathology and impacts development, assessment strategies lag behind: Adversity-related mental health needs (symptoms, functional impairment, maladaptive coping) are typically assessed in response to only one qualifying Criterion-A traumatic event. This is especially problematic for youth at-risk for health and academic disparities who experience cumulative adversity, including non-qualifying events (separation from caregivers) which may produce more impairing symptomatology. Data from 118 delinquent girls demonstrate: (1) an average of 14 adverse Criterion-A and non-Criterion event exposures; (2) serious maladaptive coping strategies (self-injury) directly in response to cumulative adversity; (3) more cumulative adversity-related than worst-event related symptomatology and functional impairment; and (4) comparable symptomatology, but greater functional impairment, in response to non-Criterion events. These data support the evaluation of mental health needs in response to cumulative adversity for optimal identification and tailoring of services in high-risk populations to reduce disparities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Manifestation of quark clusters in the emission of cumulative protons in the experiment on the fragmentation of carbon ions

    NASA Astrophysics Data System (ADS)

    Abramov, B. M.; Alekseev, P. N.; Borodin, Yu. A.; Bulychjov, S. A.; Dukhovskoy, I. A.; Krutenkova, A. P.; Kulikov, V. V.; Martemyanov, M. A.; Matsyuk, M. A.; Turdakina, E. N.; Khanov, A. I.

    2013-06-01

    The proton yields at an angle of 3.5° have been measured in the FRAGM experiment on the fragmentation of carbon ions with energies T0 = 0.6, 0.95, and 2.0 GeV/nucleon on a beryllium target at the heavy-ion accelerator complex TWAC (terawatt accumulator, Institute for Theoretical and Experimental Physics). The data are represented in the form of the dependences of the invariant cross section for proton yield on the cumulative variable x in the range 0.9 < x < 2.4. This invariant cross section varies over six orders of magnitude. The proton spectra have been analyzed within the theoretical approach of the fragmentation of quark clusters with the fragmentation functions obtained in the quark-gluon string model. The probabilities of the existence of six- and nine-quark clusters in the carbon nuclei are estimated as 8-12% and 0.2-0.6%, respectively. The results are compared to the estimates of quark effects obtained by other methods.

  20. A full-angle Monte-Carlo scattering technique including cumulative and single-event Rutherford scattering in plasmas

    NASA Astrophysics Data System (ADS)

    Higginson, Drew P.

    2017-11-01

    We describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3-0.7; the upper limit corresponds to a Coulomb logarithm of 20-2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.
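    The two-region split can be caricatured as follows: with high probability a timestep contributes a Gaussian step standing in for many accumulated small-angle scatters, and with small probability it contributes a rare single large-angle event drawn from a Rutherford-like theta^-3 tail via inverse-transform sampling. All parameter values are hypothetical, and this toy omits the physics (energy dependence, matching of the two regions at the cutoff) of the actual FAS method.

```python
import math
import random

def sample_deflection(rng, sigma_small, p_single, theta_min, theta_max):
    """One timestep's scattering angle under the two-region split.
    With probability 1 - p_single: a Gaussian step standing in for many
    accumulated small-angle scatters (central limit theorem).
    With probability p_single: a rare single large-angle event from a
    Rutherford-like tail, pdf proportional to theta**-3 on [theta_min, theta_max]."""
    if rng.random() >= p_single:
        return abs(rng.gauss(0.0, sigma_small))
    # Inverse-transform sampling of f(theta) ~ theta**-3:
    u = rng.random()
    inv_sq = theta_min ** -2 - u * (theta_min ** -2 - theta_max ** -2)
    return inv_sq ** -0.5

rng = random.Random(42)
angles = [sample_deflection(rng, 0.01, 0.05, 0.05, math.pi) for _ in range(10_000)]
```

    With these hypothetical parameters, roughly five percent of the sampled angles land in the heavy Rutherford-like tail while the rest stay near the Gaussian core.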

  1. Crude incidence in two-phase designs in the presence of competing risks.

    PubMed

    Rebora, Paola; Antolini, Laura; Glidden, David V; Valsecchi, Maria Grazia

    2016-01-11

    In many studies, some information might not be available for the whole cohort; some covariates, or even the outcome, might be ascertained only in selected subsamples. These studies are part of a broad category termed two-phase studies. Common examples include the nested case-control and the case-cohort designs. For two-phase studies, appropriate weighted survival estimates have been derived; however, no estimator of cumulative incidence accounting for competing events has been proposed. This is relevant in the presence of multiple types of events, where estimation of event-type-specific quantities is needed for evaluating outcome. We develop a nonparametric estimator of the cumulative incidence function of events accounting for possible competing events. It handles a general sampling design by weights derived from the sampling probabilities. The variance is derived from the influence function of the subdistribution hazard. The proposed method shows good performance in simulations. It is applied to estimate the crude incidence of relapse in childhood acute lymphoblastic leukemia in groups defined by a genotype not available for everyone in a cohort of nearly 2000 patients, where death due to toxicity acted as a competing event. In a second example, the aim was to estimate engagement in care of a cohort of HIV patients in a resource-limited setting, where for some patients the outcome itself was missing because they were lost to follow-up. A sampling-based approach was used to identify the outcome in a subsample of lost patients and to obtain a valid estimate of connection to care. A valid estimator for cumulative incidence of events accounting for competing risks under a general sampling design from an infinite target population is derived.
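    The unweighted core of such an estimator is the standard nonparametric cumulative incidence calculation for competing risks; below is a minimal sketch with hypothetical data (the paper's estimator additionally weights subjects by inverse sampling probabilities, which is omitted here).

```python
def cumulative_incidence(times, causes, cause_of_interest, horizon):
    """Nonparametric cumulative incidence of one event type in the presence
    of competing events. `causes[i]` is 0 if subject i was censored at
    `times[i]`, otherwise the code of the event that occurred."""
    data = sorted(zip(times, causes))
    at_risk = len(data)
    overall_surv = 1.0  # Kaplan-Meier survival from *any* event, just before t
    cif = 0.0
    i = 0
    while i < len(data) and data[i][0] <= horizon:
        t = data[i][0]
        d_interest = d_any = removed = 0
        while i < len(data) and data[i][0] == t:
            cause = data[i][1]
            removed += 1
            if cause != 0:
                d_any += 1
                if cause == cause_of_interest:
                    d_interest += 1
            i += 1
        cif += overall_surv * d_interest / at_risk
        overall_surv *= 1.0 - d_any / at_risk
        at_risk -= removed
    return cif

# Hypothetical follow-up: cause 1 = relapse, cause 2 = death in remission, 0 = censored
times = [2, 3, 3, 4, 5, 6, 7, 8]
causes = [1, 2, 0, 1, 0, 1, 2, 0]
relapse_by_6 = cumulative_incidence(times, causes, cause_of_interest=1, horizon=6)
```

    Unlike a naive Kaplan-Meier complement that censors competing events, the cause-specific incidences computed this way sum with the overall survival to exactly one.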

  2. Probability and Conditional Probability of Cumulative Cloud Cover for Selected Stations Worldwide.

    DTIC Science & Technology

    1985-07-01

    INTRODUCTION: The performance of precision-guided munition (PGM) systems may be severely compromised by the presence of clouds in the desired target... Station list (excerpt): Korea, 37.98 N 127.94 E, Mar 67-Dec 79; Kusan, Korea, 37.90 N 126.63 E, Aug 51-Dec 81 (no Jan 71-Dec 72); Taegu & Tonchon, Korea, 35.90 N 128.67 E, Jan...

  3. Evaluation of carotid plaque echogenicity based on the integral of the cumulative probability distribution using gray-scale ultrasound images.

    PubMed

    Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili

    2017-01-01

    Carotid plaque echogenicity is associated with the risk of cardiovascular events. The gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with that of GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for the three types of plaques is 78.4% (kappa value, κ = 0.673) when the AUCPDC is used for classifier training, whereas with GSM it is 64.8% (κ = 0.460). The receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluating plaque echogenicity and predicting cardiovascular events in patients with plaques.
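    A minimal sketch of the AUCPDC statistic, assuming 8-bit gray levels and treating the plaque region as a flat list of pixel values; the normalization by the number of gray levels is an assumption for illustration.

```python
def aucpdc(pixels, max_level=255):
    """Area under the cumulative probability distribution curve of pixel
    gray levels, normalized to [0, 1]. Echolucent (dark) plaques concentrate
    mass at low levels, so their CDF rises early and the area is larger."""
    n = len(pixels)
    counts = [0] * (max_level + 1)
    for p in pixels:
        counts[p] += 1
    area = 0.0
    cum = 0
    for level in range(max_level + 1):
        cum += counts[level]
        area += cum / n  # CDF value at this gray level
    return area / (max_level + 1)

# Hypothetical pixel samples from a dark (echolucent) and a bright (echo-rich) plaque:
dark_plaque = [10, 20, 15, 30, 25, 12]
bright_plaque = [200, 220, 180, 240, 210, 190]
```

    Because it integrates the whole cumulative curve rather than reducing the histogram to its median, this statistic retains distributional information that the GSM discards.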

  4. Does ecosystem variability explain phytoplankton diversity? Solving an ecological puzzle with long-term data sets

    NASA Astrophysics Data System (ADS)

    Sarker, Subrata; Lemke, Peter; Wiltshire, Karen H.

    2018-05-01

    Explaining species diversity as a function of ecosystem variability is a long-standing discussion in community-ecology research. Here, we aimed to establish a causal relationship between ecosystem variability and phytoplankton diversity in a shallow-sea ecosystem. We used long-term data on biotic and abiotic factors from Helgoland Roads, along with climate data, to assess the effect of ecosystem variability on phytoplankton diversity. A point cumulative semi-variogram method was used to estimate the long-term ecosystem variability. A Markov chain model was used to estimate the dynamical processes of species, i.e., the probabilities of occurrence, absence, and being outcompeted. We identified that the 1980s was a period of high ecosystem variability, while the last two decades were comparatively less variable. Ecosystem variability was found to be an important predictor of phytoplankton diversity at Helgoland Roads. High diversity was related to low ecosystem variability, owing to a non-significant relationship between the probability of species occurrence and absence, a significant negative relationship between the probability of species occurrence and the probability of a species being outcompeted by others, and high species occurrence at low ecosystem variability. Using an exceptional marine long-term data set, this study established a causal relationship between ecosystem variability and phytoplankton diversity.

  5. A moment-convergence method for stochastic analysis of biochemical reaction networks.

    PubMed

    Zhang, Jiajun; Nie, Qing; Zhou, Tianshou

    2016-05-21

    Traditional moment-closure methods need to assume that the high-order cumulants of a probability distribution are approximately zero. However, this strong assumption is not satisfied for many biochemical reaction networks. Here, we introduce convergent moments (defined in mathematics as the coefficients in the Taylor expansion of the probability-generating function at some point) to overcome this drawback of the moment-closure methods. As such, we develop a new analysis method for stochastic chemical kinetics. This method provides an accurate approximation for the master probability equation (MPE). In particular, the connection between low-order convergent moments and rate constants can be more easily derived in terms of explicit and analytical forms, allowing insights that would be difficult to obtain through direct simulation or manipulation of the MPE. In addition, it provides an accurate and efficient way to compute steady-state or transient probability distributions, avoiding the algorithmic difficulty associated with stiffness of the MPE due to large differences in the sizes of rate constants. Applications of the method to several systems reveal nontrivial stochastic mechanisms of gene expression dynamics, e.g., intrinsic fluctuations can induce transient bimodality and amplify transient signals, and slow switching between promoter states can increase fluctuations in spatially heterogeneous signals. The overall approach has broad applications in modeling, analysis, and computation of complex biochemical networks with intrinsic noise.

  6. Large deviation principle at work: Computation of the statistical properties of the exact one-point aperture mass

    NASA Astrophysics Data System (ADS)

    Reimberg, Paulo; Bernardeau, Francis

    2018-01-01

    We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (M_ap), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we can extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced-shear approximation (replacing the reduced shear with the shear itself). We were able to quantify precisely how this latter approximation affects the M_ap statistical properties. In particular, we derive the corrective term for the skewness of the M_ap and reconstruct its one-point PDF.

  7. Statistical Optimality in Multipartite Ranking and Ordinal Regression.

    PubMed

    Uematsu, Kazuki; Lee, Yoonkyung

    2015-05-01

    Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk, which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions, including exponential loss, the optimal ranking function can be represented as a ratio of the weighted conditional probability of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods, such as the proportional odds model in statistics, with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with a simulation study and real data analysis.

  8. Cumulants vs correlation functions and the QCD phase diagram at low energies

    DOE PAGES

    Bzdak, A.; Koch, V.; Skokov, V.; ...

    2017-09-25

    We discuss the relation between particle number cumulants and genuine correlation functions. Here, it is argued that measuring multi-particle correlation functions could provide cleaner information on possible non-trivial dynamics in heavy-ion collisions.

  9. Cumulants vs correlation functions and the QCD phase diagram at low energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bzdak, A.; Koch, V.; Skokov, V.

    We discuss the relation between particle number cumulants and genuine correlation functions. Here, it is argued that measuring multi-particle correlation functions could provide cleaner information on possible non-trivial dynamics in heavy-ion collisions.

  10. Model-independent analyses of non-Gaussianity in Planck CMB maps using Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Buchert, Thomas; France, Martin J.; Steiner, Frank

    2017-05-01

    Despite the wealth of Planck results, there are difficulties in disentangling the primordial non-Gaussianity of the Cosmic Microwave Background (CMB) from the secondary and the foreground non-Gaussianity (NG). For each of these forms of NG the lack of complete data introduces model-dependences. Aiming at detecting the NGs of the CMB temperature anisotropy δT, while paying particular attention to a model-independent quantification of NGs, our analysis is based upon statistical and morphological univariate descriptors, respectively: the probability density function P(δT), related to v0, the first Minkowski Functional (MF), and the two other MFs, v1 and v2. From their analytical Gaussian predictions we build the discrepancy functions Δ_k (k = P, 0, 1, 2), which are applied to an ensemble of 10^5 CMB realization maps of the ΛCDM model and to the Planck CMB maps. In our analysis we use general Hermite expansions of the Δ_k up to the 12th order, where the coefficients are explicitly given in terms of cumulants. Assuming hierarchical ordering of the cumulants, we obtain the perturbative expansions generalizing the second-order expansions of Matsubara to arbitrary order in the standard deviation σ_0 for P(δT) and v0, where the perturbative expansion coefficients are explicitly given in terms of complete Bell polynomials. The comparison of the Hermite expansions and the perturbative expansions is performed for the ΛCDM map sample and the Planck data. We confirm the weak level of non-Gaussianity (1-2)σ of the foreground-corrected masked Planck 2015 maps.

  11. Optimization of the transmission of observable expectation values and observable statistics in continuous-variable teleportation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albano Farias, L.; Stephany, J.

    2010-12-15

    We analyze the statistics of observables in continuous-variable (CV) quantum teleportation in the formalism of the characteristic function. We derive expressions for average values of output-state observables, in particular, cumulants, which are additive in terms of the input state and the resource of teleportation. Working with a general class of teleportation resources, the squeezed-Bell-like states, which may be optimized in a free parameter for better teleportation performance, we discuss the relation between resources optimal for fidelity and those optimal for different observable averages. We obtain the values of the free parameter of the squeezed-Bell-like states which optimize the central momenta and cumulants up to fourth order. For the cumulants, the distortion between in and out states due to teleportation depends only on the resource. We obtain optimal parameters Δ_(2)^opt and Δ_(4)^opt for the second- and fourth-order cumulants, which do not depend on the squeezing of the resource. The second-order central momenta, which are equal to the second-order cumulants, and the photon number average are also optimized by the resource with Δ_(2)^opt. We show that the optimal fidelity resource, which has been found previously to depend on the characteristics of the input, approaches for high squeezing the resource that optimizes the second-order momenta. A similar behavior is obtained for the resource that optimizes the photon statistics, which is treated here using the sum of the squared differences in photon probabilities of input versus output states as the distortion measure. This is interpreted naturally to mean that the distortions associated with second-order momenta dominate the behavior of the output state for large squeezing of the resource. Optimal fidelity resources and optimal photon statistics resources are compared, and it is shown that for mixtures of Fock states both resources are equivalent.

  12. Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; McGhee, David S.

    2004-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.
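
The Monte Carlo combination described above amounts to sampling the harmonic load at a uniformly random phase, adding an independent Gaussian random load, and reading the desired percentile off the empirical CDF. A minimal sketch; the amplitude, RMS level, and percentile are illustrative, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
S, sigma = 3.0, 1.0                          # assumed sine amplitude, random RMS

phase = rng.uniform(0.0, 2.0 * np.pi, n)     # uniformly random sine phase
combined = S * np.sin(phase) + rng.normal(0.0, sigma, n)

# combined load at a 3-sigma-equivalent percentile of the empirical CDF
design_load = np.percentile(combined, 99.865)
```

Because the arcsine-distributed sine component is bounded while the Gaussian tail is not, the high-percentile design value sits somewhat above the sine amplitude, exactly the regime where the paper notes the CDF is flat and percentile choice matters.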

  13. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  14. Effects of the financial crisis on the wealth distribution of Korea's companies

    NASA Astrophysics Data System (ADS)

    Lim, Kyuseong; Kim, Soo Yong; Swanson, Todd; Kim, Jooyun

    2017-02-01

    We investigated the distribution functions of Korea's top-rated companies during two financial crises. A power-law scaling was found for both the rank distribution and the cumulative probability distribution, and it appears to be a general pattern; similar distributions are found in other studies of wealth and income. In our study, the Pareto exponents characterizing the distribution differed before and after the crisis. The companies covered in this research are divided into two subgroups spanning the period in which the subprime mortgage crisis occurred. Various industrial sectors of Korea's companies were found to respond differently during the two financial crises, especially the construction sector, financial sectors, and insurance groups.
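
A Pareto exponent of the kind reported here is commonly read off a rank-size (Zipf) plot: on log-log axes, rank versus wealth is a straight line with slope -α. A sketch on synthetic data; the exponent 1.5 and the sample size are arbitrary choices, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
wealth = rng.pareto(1.5, 5000) + 1.0     # synthetic Pareto wealth, alpha = 1.5

ranked = np.sort(wealth)[::-1]           # largest company gets rank 1
ranks = np.arange(1, ranked.size + 1)

# slope of log(rank) vs log(size) estimates -alpha (rank ~ N * CCDF(size))
slope, _ = np.polyfit(np.log(ranked), np.log(ranks), 1)
alpha_hat = -slope
```

Comparing alpha_hat fitted on pre-crisis and post-crisis subgroups is the kind of before/after exponent comparison the abstract describes.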

  15. Longitudinal Pathways from Cumulative Contextual Risk at Birth to School Functioning in Adolescence: Analysis of Mediation Effects and Gender Moderation.

    PubMed

    January, Stacy-Ann A; Mason, W Alex; Savolainen, Jukka; Solomon, Starr; Chmelka, Mary B; Miettunen, Jouko; Veijola, Juha; Moilanen, Irma; Taanila, Anja; Järvelin, Marjo-Riitta

    2017-01-01

    Children and adolescents exposed to multiple contextual risks are more likely to have academic difficulties and externalizing behavior problems than those who experience fewer risks. This study used data from the Northern Finland Birth Cohort 1986 (a population-based study; N = 6961; 51 % female) to investigate (a) the impact of cumulative contextual risk at birth on adolescents' academic performance and misbehavior in school, (b) learning difficulties and/or externalizing behavior problems in childhood as intervening mechanisms in the association of cumulative contextual risk with functioning in adolescence, and (c) potential gender differences in the predictive associations of cumulative contextual risk at birth with functioning in childhood or adolescence. The results of the structural equation modeling analysis suggested that exposure to cumulative contextual risk at birth had negative associations with functioning 16 years later, and academic difficulties and externalizing behavior problems in childhood mediated some of the predictive relations. Gender, however, did not moderate any of the associations. Therefore, the findings of this study have implications for the prevention of learning and conduct problems in youth and future research on the impact of cumulative risk exposure.

  16. Longitudinal Pathways from Cumulative Contextual Risk at Birth to School Functioning in Adolescence: Analysis of Mediation Effects and Gender Moderation

    PubMed Central

    January, Stacy-Ann A.; Mason, W. Alex; Savolainen, Jukka; Solomon, Starr; Chmelka, Mary B.; Miettunen, Jouko; Veijola, Juha; Moilanen, Irma; Taanila, Anja; Järvelin, Marjo-Riitta

    2016-01-01

    Children and adolescents exposed to multiple contextual risks are more likely to have academic difficulties and externalizing behavior problems than those who experience fewer risks. This study used data from the Northern Finland Birth Cohort 1986 (a population-based study; N = 6,961; 51% female) to investigate (a) the impact of cumulative contextual risk at birth on adolescents’ academic performance and misbehavior in school, (b) learning difficulties and/or externalizing behavior problems in childhood as intervening mechanisms in the association of cumulative contextual risk with functioning in adolescence, and (c) potential gender differences in the predictive associations of cumulative contextual risk at birth with functioning in childhood or adolescence. The results of the structural equation modeling analysis suggested that exposure to cumulative contextual risk at birth had negative associations with functioning 16 years later, and academic difficulties and externalizing behavior problems in childhood mediated some of the predictive relations. Gender, however, did not moderate any of the associations. Therefore, the findings of this study have implications for the prevention of learning and conduct problems in youth and future research on the impact of cumulative risk exposure. PMID:27665276

  17. Full counting statistics of conductance for disordered systems

    NASA Astrophysics Data System (ADS)

    Fu, Bin; Zhang, Lei; Wei, Yadong; Wang, Jian

    2017-09-01

    Quantum transport is a stochastic process in nature. As a result, the conductance is fully characterized by its average value and fluctuations, i.e., by full counting statistics (FCS). Since disorder is inevitable in nanoelectronic devices, it is important to understand how FCS behaves in disordered systems. The traditional approach to fluctuations or cumulants of conductance uses a diagrammatic perturbation expansion of the Green's function within the coherent potential approximation (CPA), which is extremely complicated, especially for high-order cumulants. In this paper, we develop a theoretical formalism based on the nonequilibrium Green's function by directly taking the disorder average of the generating function of the FCS of conductance within CPA. This is done by mapping the problem into higher dimensions so that the functional dependence of the generating function on the Green's function becomes linear and the diagrammatic perturbation expansion is no longer needed. Our theory is very simple and allows us to calculate cumulants of conductance at any desired order efficiently. As an application of our theory, we calculate the cumulants of conductance up to fifth order for disordered systems in the presence of Anderson and binary disorder. Our numerical results for the cumulants of conductance show remarkable agreement with those obtained by brute-force calculation.
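
Independently of the Green's-function machinery, "cumulants at any desired order" relate to raw moments by the standard recursion κ_n = m_n - Σ_{i=1}^{n-1} C(n-1, i-1) κ_i m_{n-i}. A sample-based sketch of that recursion (not the paper's CPA formalism), checked on Gaussian data where all cumulants beyond the second vanish:

```python
import numpy as np
from math import comb

def sample_cumulants(x, order=5):
    """First `order` cumulants of a sample, via the moment-cumulant recursion
    kappa_n = m_n - sum_{i=1}^{n-1} C(n-1, i-1) * kappa_i * m_{n-i}."""
    x = np.asarray(x, dtype=float)
    m = [np.mean(x**n) for n in range(order + 1)]   # raw moments m_0 .. m_order
    kap = [0.0]                                     # dummy kappa_0
    for n in range(1, order + 1):
        kap.append(m[n] - sum(comb(n - 1, i - 1) * kap[i] * m[n - i]
                              for i in range(1, n)))
    return kap[1:]

rng = np.random.default_rng(4)
k1, k2, k3, k4, k5 = sample_cumulants(rng.normal(0.0, 1.0, 200_000))
```

For N(0, 1) draws, k1 ≈ 0 and k2 ≈ 1, while k3, k4, k5 fluctuate around zero with sampling noise.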

  18. Evaluation of the HF-Radar network system around Taiwan using normalized cumulative Lagrangian separation.

    NASA Astrophysics Data System (ADS)

    Fredj, Erick; Kohut, Josh; Roarty, Hugh; Lai, Jian-Wu

    2017-04-01

    The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as over continental shelves and the adjacent deep ocean. A skill score described in detail by Liu and Weisberg (2011) was applied to estimate the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths; in contrast, the Lagrangian separation distance alone gives a misleading result. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function may be estimated. The skill score was used to assess the performance of the Taiwan Ocean Radar Observing System (TOROS). TOROS consists of 17 SeaSonde-type radars around the Taiwan Island. The currents off Taiwan are significantly influenced by the nearby Kuroshio current. The main stream of the Kuroshio flows along the east coast of Taiwan to the north throughout the year; sometimes a branch current also bypasses the south end of Taiwan and goes north along the west coast. The Kuroshio is also prone to seasonal change in its speed of flow, current capacity, distribution width, and depth. The evaluation of the Taiwanese national HF-radar network performance using Lagrangian drifter records demonstrated the high quality and robustness of TOROS HF-radar data using a purely trajectory-based, non-dimensional index. Reference: Liu, Y., and R. H. Weisberg (2011), Evaluation of trajectory modeling in different dynamic regions using normalized cumulative Lagrangian separation, Journal of Geophysical Research, 116, C09013, doi:10.1029/2010JC006837.
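
The normalized index has a compact form: the separations d_i between modeled and observed positions are accumulated, divided by the cumulative observed trajectory lengths l_i, and turned into a skill score ss = 1 - s/n (clipped at zero), with n a tolerance threshold. A sketch as we understand the index; the function name and the default n = 1 are ours:

```python
import numpy as np

def nc_skill_score(separations, traj_lengths, n=1.0):
    """Normalized cumulative Lagrangian separation skill score.
    separations: distances between modeled and observed positions at
    successive times; traj_lengths: cumulative observed trajectory lengths
    at the same times; n: tolerance threshold (n = 1 is a common choice)."""
    s = np.sum(separations) / np.sum(traj_lengths)   # dimensionless index
    return max(0.0, 1.0 - s / n)
```

A perfect trajectory model gives ss = 1; separations as large as the trajectory itself drive ss to 0, and the normalization by trajectory length is what keeps weak-current and strong-current regions comparable.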

  19. Modelling the occurrence and severity of enoxaparin-induced bleeding and bruising events

    PubMed Central

    Barras, Michael A; Duffull, Stephen B; Atherton, John J; Green, Bruce

    2009-01-01

    AIMS To develop a population pharmacokinetic–pharmacodynamic model to describe the occurrence and severity of bleeding or bruising as a function of enoxaparin exposure. METHODS Data were obtained from a randomized controlled trial (n = 118) that compared conventional dosing of enoxaparin (product label) with an individualized dosing regimen. Anti-Xa concentrations were sampled using a sparse design and the size, location and type of bruising and bleeding event, during enoxaparin therapy, were collected daily. A population pharmacokinetic–pharmacodynamic analysis was performed using nonlinear mixed effects techniques. The final model was used to explore how the probability of events in patients with obesity and/or renal impairment varied under differing dosing strategies. RESULTS Three hundred and forty-nine anti-Xa concentrations were available for analysis. A two-compartment first-order absorption and elimination model best fit the data, with lean body weight describing between-subject variability in clearance and central volume of distribution. A three-category proportional-odds model described the occurrence and severity of events as a function of both cumulative enoxaparin AUC (cAUC) and subject age. Simulations showed that individualized dosing decreased the probability of a bleeding or major bruising event when compared with conventional dosing, which was most noticeable in subjects with obesity and renal impairment. CONCLUSIONS The occurrence and severity of a bleeding or major bruising event to enoxaparin, administered for the treatment of a thromboembolic disease, can be described as a function of both cAUC and subject age. Individualized dosing of enoxaparin will reduce the probability of an event. PMID:19916994

  20. Coupled Multi-Disciplinary Optimization for Structural Reliability and Affordability

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented for Non-Deterministic Multidisciplinary Optimization of engine composite materials and structures. A hypothetical engine duct made with ceramic matrix composites (CMC) is evaluated probabilistically in the presence of combined thermo-mechanical loading. The structure is tailored by quantifying the uncertainties in all relevant design variables such as fabrication, material, and loading parameters. The probabilistic sensitivities are used to select critical design variables for optimization. In this paper, two approaches for non-deterministic optimization are presented. The non-deterministic minimization of combined failure stress criterion is carried out by: (1) performing probabilistic evaluation first and then optimization and (2) performing optimization first and then probabilistic evaluation. The first approach shows that the optimization feasible region can be bounded by a set of prescribed probability limits and that the optimization follows the cumulative distribution function between those limits. The second approach shows that the optimization feasible region is bounded by 0.50 and 0.999 probabilities.

  1. Infomax Strategies for an Optimal Balance Between Exploration and Exploitation

    NASA Astrophysics Data System (ADS)

    Reddy, Gautam; Celani, Antonio; Vergassola, Massimo

    2016-06-01

    Proper balance between exploitation and exploration is what makes good decisions that achieve high reward, like payoff or evolutionary fitness. The Infomax principle postulates that maximization of information directs the function of diverse systems, from living systems to artificial neural networks. While specific applications turn out to be successful, the validity of information as a proxy for reward remains unclear. Here, we consider the multi-armed bandit decision problem, which features arms (slot-machines) of unknown probabilities of success and a player trying to maximize cumulative payoff by choosing the sequence of arms to play. We show that an Infomax strategy (Info-p) which optimally gathers information on the highest probability of success among the arms, saturates known optimal bounds and compares favorably to existing policies. Conversely, gathering information on the identity of the best arm in the bandit leads to a strategy that is vastly suboptimal in terms of payoff. The nature of the quantity selected for Infomax acquisition is then crucial for effective tradeoffs between exploration and exploitation.
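
For a concrete feel for the cumulative-payoff objective in this bandit setting, here is a minimal Bernoulli-bandit simulation using Beta-Bernoulli Thompson sampling as the policy. This is a standard baseline for the exploration-exploitation trade-off, not the paper's Info-p strategy:

```python
import numpy as np

def thompson_bandit(p_true, horizon, seed=0):
    """Play a Bernoulli bandit for `horizon` rounds with Thompson sampling
    (Beta(1,1) priors) and return the cumulative payoff."""
    rng = np.random.default_rng(seed)
    k = len(p_true)
    wins = np.ones(k)        # Beta posterior alpha per arm
    losses = np.ones(k)      # Beta posterior beta per arm
    payoff = 0
    for _ in range(horizon):
        arm = int(np.argmax(rng.beta(wins, losses)))  # sample beliefs, pick best
        reward = int(rng.random() < p_true[arm])
        payoff += reward
        wins[arm] += reward
        losses[arm] += 1 - reward
    return payoff

total = thompson_bandit([0.2, 0.5, 0.8], 2000)
```

A good policy concentrates play on the 0.8 arm after an exploration phase, so the cumulative payoff approaches 0.8 per round; policies that explore the wrong quantity, as the abstract notes for best-arm identity, leave payoff on the table.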

  2. The Simulation Heuristic.

    DTIC Science & Technology

    1981-05-15

    Crane. is capable of imagining unicorns -- and we expect he is -- why does he find it relatively difficult to imagine himself avoiding a 30 minute...probability that the plan will succeed and to evaluate the risk of various causes of failure . We have suggested that the construction of scenarios is...expect that events will unfold as planned. However, the cumulative probability of at least one fatal failure could be overwhelmingly high even when

  3. Decision-making Process by Users and Providers of Health Care Services During the AH1N1 Epidemic Influenza in Mexico: Lessons Learned and Challenges Ahead.

    PubMed

    Huízar-Hernández, Víctor; Arredondo, Armando; Caballero, Marta; Castro-Ríos, Angélica; Flores-Hernández, Sergio; Pérez-Padilla, Rogelio; Reyes-Morales, Hortensia

    2017-04-01

    The aim of the study was to analyze, using a decision analysis approach, the probability of severity of illness due to delayed utilization of health services and inappropriate hospital medical treatment during the 2009 AH1N1 influenza epidemic in Mexico. Patients with influenza AH1N1 confirmed by the polymerase chain reaction (PCR) test from two hospitals in Mexico City, were included. Path methodology based upon literature and validated by clinical experts was followed. The probability for severe illness originated from delayed utilization of health services, delayed prescription of neuraminidase inhibitors (NAIs) and inappropriate use of antibiotics was assessed. Ninety-nine patients were analyzed, and 16% developed severe illness. Most patients received NAIs and 85.9% received antibiotics. Inappropriate use of antibiotics was observed in 70.7% of cases. Early utilization of services increased the likelihood of non-severe illness (cumulative probability CP = 0.56). The major cumulative probability for severe illness was observed when prescription of NAIs was delayed (CP = 0.19). Delayed prescription of NAIs and irrational use of antibiotics are critical decisions for unfavorable outcomes in patients suffering influenza AH1N1. Copyright © 2017 IMSS. Published by Elsevier Inc. All rights reserved.
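
The cumulative probabilities (CP) quoted above are path probabilities in a decision tree: the product of the conditional probabilities along the branches of one path. A sketch with illustrative branch values, not the study's estimates:

```python
def path_probability(branch_probs):
    """Cumulative probability of one path through a decision tree:
    the product of the conditional branch probabilities along it."""
    cp = 1.0
    for p in branch_probs:
        cp *= p
    return cp

# e.g. a hypothetical path: early care-seeking, then timely NAI prescription
cp = path_probability([0.8, 0.7])   # 0.8 * 0.7 = 0.56
```

Summing such path probabilities over all paths ending in a given outcome (severe vs. non-severe illness) gives the outcome's total probability, which is how the decision-analysis comparison is made.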

  4. Weighted comparison of two cumulative incidence functions with R-CIFsmry package.

    PubMed

    Li, Jianing; Le-Rademacher, Jennifer; Zhang, Mei-Jie

    2014-10-01

    In this paper we propose a class of flexible weight functions for use in comparison of two cumulative incidence functions. The proposed weights allow the users to focus their comparison on an early or a late time period post treatment or to treat all time points with equal emphasis. These weight functions can be used to compare two cumulative incidence functions via their risk difference, their relative risk, or their odds ratio. The proposed method has been implemented in the R-CIFsmry package which is readily available for download and is easy to use as illustrated in the example. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
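
Weight functions of the kind described, emphasizing early, late, or all time points, can be illustrated with a Fleming-Harrington-style family w(t) = S(t)^ρ (1 - S(t))^γ. This is a common choice for such weighted comparisons, not necessarily the exact family implemented in R-CIFsmry:

```python
import numpy as np

def fh_weights(surv, rho=0.0, gamma=0.0):
    """Fleming-Harrington-style weights w(t) = S(t)**rho * (1 - S(t))**gamma.
    rho > 0, gamma = 0 stresses early differences; rho = 0, gamma > 0 late
    ones; rho = gamma = 0 weights all time points equally."""
    surv = np.asarray(surv, dtype=float)
    return surv**rho * (1.0 - surv)**gamma

w_equal = fh_weights([1.0, 0.8, 0.5, 0.2])           # all ones
w_early = fh_weights([1.0, 0.8, 0.5, 0.2], rho=1.0)  # decays as S(t) drops
```

The same weights can multiply the pointwise risk difference, log relative risk, or log odds ratio of two cumulative incidence functions before integrating over time, which is the comparison structure the abstract describes.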

  5. Canopy Spectral Invariants. Part 2; Application to Classification of Forest Types from Hyperspectral Data

    NASA Technical Reports Server (NTRS)

    Schull, M. A.; Knyazikhin, Y.; Xu, L.; Samanta, A.; Carmona, P. L.; Lepine, L.; Jenkins, J. P.; Ganguly, S.; Myneni, R. B.

    2011-01-01

    Many studies have been conducted to demonstrate the ability of hyperspectral data to discriminate dominant plant species. Most of them have employed empirically based techniques, which are site-specific, require initial training based on characteristics of known leaf and/or canopy spectra, and therefore may not be extendable to operational use or adaptable to changing or unknown land cover. In this paper we propose a physically based approach for separation of dominant forest types using hyperspectral data. The radiative transfer theory of canopy spectral invariants underlies the approach, which facilitates parameterization of the canopy reflectance in terms of the leaf spectral scattering and two spectrally invariant and structurally varying variables: the recollision and directional escape probabilities. The methodology is based on the idea of retrieving spectrally invariant parameters from hyperspectral data first, and then relating their values to structural characteristics of the three-dimensional canopy. Theoretical and empirical analyses of ground and airborne data acquired by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) over two sites in New England, USA, suggest that the canopy spectral invariants convey information about canopy structure at both the macro- and micro-scales. The total escape probability (one minus the recollision probability) varies as a power function whose exponent is related to the number of nested hierarchical levels present in the pixel. Its base is a geometric mean of the local total escape probabilities and accounts for the cumulative effect of canopy structure over a wide range of scales. The ratio of the directional to the total escape probability is independent of the number of hierarchical levels and is a function of canopy structure at the macro-scale, such as tree spatial distribution, crown shape and size, within-crown foliage density, and ground cover. These properties allow for a natural separation of dominant forest classes based on the location of points in the log-log plane of the total escape probability versus this ratio.

  6. Probability of pregnancy after sterilization: a comparison of hysteroscopic versus laparoscopic sterilization.

    PubMed

    Gariepy, Aileen M; Creinin, Mitchell D; Smith, Kenneth J; Xu, Xiao

    2014-08-01

    To compare the expected probability of pregnancy after hysteroscopic versus laparoscopic sterilization based on available data using decision analysis. We developed an evidence-based Markov model to estimate the probability of pregnancy over 10 years after three different female sterilization procedures: hysteroscopic, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation. Parameter estimates for procedure success, probability of completing follow-up testing and risk of pregnancy after different sterilization procedures were obtained from published sources. In the base case analysis at all points in time after the sterilization procedure, the initial and cumulative risk of pregnancy after sterilization is higher in women opting for hysteroscopic than either laparoscopic band or bipolar sterilization. The expected pregnancy rates per 1000 women at 1 year are 57, 7 and 3 for hysteroscopic sterilization, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation, respectively. At 10 years, the cumulative pregnancy rates per 1000 women are 96, 24 and 30, respectively. Sensitivity analyses suggest that the three procedures would have an equivalent pregnancy risk of approximately 80 per 1000 women at 10 years if the probability of successful laparoscopic (band or bipolar) sterilization drops below 90% and successful coil placement on first hysteroscopic attempt increases to 98% or if the probability of undergoing a hysterosalpingogram increases to 100%. Based on available data, the expected population risk of pregnancy is higher after hysteroscopic than laparoscopic sterilization. Consistent with existing contraceptive classification, future characterization of hysteroscopic sterilization should distinguish "perfect" and "typical" use failure rates. Pregnancy probability at 1 year and over 10 years is expected to be higher in women having hysteroscopic as compared to laparoscopic sterilization. 
Copyright © 2014 Elsevier Inc. All rights reserved.
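
The 1-year and 10-year figures above are linked by the usual cumulative-risk identity CP = 1 - Π_t (1 - p_t) over per-period conditional probabilities. A sketch with an illustrative constant annual risk, not the Markov model's time-varying estimates:

```python
def cumulative_risk(period_probs):
    """Cumulative event probability from per-period conditional
    probabilities: CP = 1 - prod(1 - p_t)."""
    survival = 1.0
    for p in period_probs:
        survival *= 1.0 - p
    return 1.0 - survival

# a hypothetical constant 0.3% annual pregnancy risk compounded over 10 years
ten_year = cumulative_risk([0.003] * 10)   # about 0.0296, i.e. ~30 per 1000
```

Because risks compound multiplicatively on the survival scale, the 10-year CP is always less than the naive sum of the annual probabilities.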

  7. Diagnostic testing for coagulopathies in patients with ischemic stroke.

    PubMed

    Bushnell, C D; Goldstein, L B

    2000-12-01

    Hypercoagulable states are a recognized, albeit uncommon, etiology of ischemic stroke. It is unclear how often the results of specialized coagulation tests affect management. Using data compiled from a systematic review of available studies, we employed quantitative methodology to assess the diagnostic yield of coagulation tests for identification of coagulopathies in ischemic stroke patients. We performed a MEDLINE search to identify controlled studies published during 1966-1999 that reported the prevalence of deficiencies of protein C, protein S, antithrombin III, plasminogen, activated protein C resistance (APCR)/factor V Leiden mutation (FVL), anticardiolipin antibodies (ACL), or lupus anticoagulant (LA) in patients with ischemic stroke. The cumulative prevalence rates (pretest probabilities) and positive likelihood ratios for all studies and for those including only patients aged

  8. Assessing the effect of a partly unobserved, exogenous, binary time-dependent covariate on survival probabilities using generalised pseudo-values.

    PubMed

    Pötschger, Ulrike; Heinzl, Harald; Valsecchi, Maria Grazia; Mittlböck, Martina

    2018-01-19

    Investigating the impact of a time-dependent intervention on the probability of long-term survival is statistically challenging. A typical example is stem-cell transplantation performed after successful donor identification from registered donors. Here, a simple analysis based on the exogenous donor-availability status according to registered donors would allow the estimation and comparison of survival probabilities. As donor search is usually ceased after a patient's event, donor-availability status is incompletely observed, so this simple comparison is not possible, and the waiting time to donor identification needs to be addressed in the analysis to avoid bias. It is methodologically unclear how to directly address cumulative long-term treatment effects without relying on proportional hazards while avoiding waiting-time bias. The pseudo-value regression technique is able to handle the first two issues; a novel generalisation of this technique also avoids waiting-time bias. Inverse-probability-of-censoring weighting is used to account for the partly unobserved exogenous covariate donor availability. Simulation studies demonstrate unbiasedness and satisfactory coverage probabilities of the new method. A real-data example demonstrates that study results based on generalised pseudo-values have a clear medical interpretation which supports the clinical decision-making process. The proposed generalisation of the pseudo-value regression technique enables the comparison of survival probabilities between two independent groups where group membership becomes known over time and remains partly unknown. Hence, cumulative long-term treatment effects are directly addressed without relying on proportional hazards while avoiding waiting-time bias.

  9. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
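
The quantity CROSSER computes can be sketched directly: the reliability of a k-out-of-n system of identical components is R(p) = Σ_{i=k}^{n} C(n,i) p^i (1-p)^{n-i}, and the crossing point is the p in (0,1) where R(p) = p. The sketch below uses bisection rather than CROSSER's Newton iteration, and assumes 1 < k < n so a unique interior crossing exists:

```python
from math import comb

def system_reliability(p, k, n):
    """Reliability of a k-out-of-n system with component reliability p."""
    return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(k, n, tol=1e-12):
    """Find p in (0,1) with system_reliability(p) == p, for 1 < k < n.
    Below the crossing the system is less reliable than one component,
    above it more reliable, so bisection on R(p) - p converges."""
    lo, hi = 1e-9, 1.0 - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if system_reliability(mid, k, n) < mid:
            lo = mid          # still below the crossing
        else:
            hi = mid
    return 0.5 * (lo + hi)

p_star = crossing_point(2, 3)   # 2-out-of-3 majority system crosses at 0.5
```

Majority systems (k = (n+1)/2 for odd n) cross at exactly p = 0.5 by symmetry, which makes a convenient check on any implementation.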

  10. Statistical properties of single-mode fiber coupling of satellite-to-ground laser links partially corrected by adaptive optics.

    PubMed

    Canuet, Lucien; Védrenne, Nicolas; Conan, Jean-Marc; Petit, Cyril; Artaud, Geraldine; Rissons, Angelique; Lacan, Jerome

    2018-01-01

    In the framework of satellite-to-ground laser downlinks, an analytical model describing the variations of the instantaneous flux coupled into a single-mode fiber after correction of the incoming wavefront by partial adaptive optics (AO) is presented. Expressions for the probability density function and the cumulative distribution function, as well as for the average fading duration and the fading-duration distribution of the corrected coupled flux, are given. These results are of prime interest for the computation of metrics related to coded transmissions over correlated channels, and they are confronted with end-to-end wave-optics simulations for a geosynchronous satellite (GEO)-to-ground and a low-earth-orbit satellite (LEO)-to-ground scenario. Finally, the impact of different AO performances on the aforementioned fading-duration distribution is analytically investigated for both scenarios.

  11. Divergence of perturbation theory in large scale structures

    NASA Astrophysics Data System (ADS)

    Pajer, Enrico; van der Woude, Drian

    2018-05-01

    We make progress towards an analytical understanding of the regime of validity of perturbation theory for large scale structures and the nature of some non-perturbative corrections. We restrict ourselves to 1D gravitational collapse, for which exact solutions before shell crossing are known. We review the convergence of perturbation theory for the power spectrum, recently proven by McQuinn and White [1], and extend it to non-Gaussian initial conditions and the bispectrum. In contrast, we prove that perturbation theory diverges for the real space two-point correlation function and for the probability density function (PDF) of the density averaged in cells and all the cumulants derived from it. We attribute these divergences to the statistical averaging intrinsic to cosmological observables, which, even on very large and "perturbative" scales, gives non-vanishing weight to all extreme fluctuations. Finally, we discuss some general properties of non-perturbative effects in real space and Fourier space.

  12. Survival Analysis of Patients with End Stage Renal Disease

    NASA Astrophysics Data System (ADS)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: with respect to time, the hazard rate increases and the survival rate decreases for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both diseases. The analysis also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, which was estimated using Cox regression.
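
    The relationship R = 1 - e^(-H(t)) can be illustrated with a short sketch (not the paper's code or data): a Nelson-Aalen estimate of the cumulative hazard H(t) from hypothetical right-censored follow-up times, from which the survival function and probability of risk follow directly.

```python
import math

def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard H(t) from right-censored data.

    times  : observed times (event or censoring)
    events : 1 if the event (death) was observed, 0 if censored
    Returns (time, H) pairs just after each observed event.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    H, out = 0.0, []
    for t, d in data:
        if d:
            H += 1.0 / n_at_risk   # increment d_i / n_i at each event time
            out.append((t, H))
        n_at_risk -= 1
    return out

# Illustrative (hypothetical) follow-up data, not the hospital records.
times  = [2, 3, 5, 7, 8, 11, 14, 20]
events = [1, 1, 0, 1, 1, 0, 1, 1]

for t, H in nelson_aalen(times, events):
    S = math.exp(-H)   # survival function S(t) = e^(-H(t))
    R = 1.0 - S        # probability of risk R = 1 - e^(-H(t))
    print(f"t={t:>2}  H={H:.3f}  S={S:.3f}  R={R:.3f}")
```

    In practice H(t) would come from a fitted Cox model rather than this nonparametric estimate, but the final step from H(t) to R is the same.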

  13. Performance analysis for mixed FSO/RF Nakagami-m and Exponentiated Weibull dual-hop airborne systems

    NASA Astrophysics Data System (ADS)

    Jing, Zhao; Shang-hong, Zhao; Wei-hu, Zhao; Ke-fan, Chen

    2017-06-01

    In this paper, the performance of mixed free-space optical (FSO)/radio frequency (RF) systems based on decode-and-forward relaying is presented. The Exponentiated Weibull fading channel with pointing error effects is adopted for the atmospheric fluctuation of the FSO channel, and the RF link undergoes Nakagami-m fading. We derive the analytical expression for the cumulative distribution function (CDF) of the equivalent signal-to-noise ratio (SNR). Novel mathematical expressions for the outage probability and average bit-error rate (BER) are developed based on the Meijer G-function. The analytical results accurately match the Monte-Carlo simulation results. The outage and BER performance of the mixed decode-and-forward relay system is investigated considering atmospheric turbulence and pointing error conditions. The effect of aperture averaging is evaluated in all atmospheric turbulence conditions as well.
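
    The closed-form Meijer G-function expressions are beyond a short snippet, but the Monte-Carlo side of such a comparison can be sketched. The sketch below is illustrative only: for decode-and-forward relaying the end-to-end SNR is min of the two hop SNRs, an ordinary Weibull distribution stands in for the exponentiated Weibull FSO fading, Nakagami-m fading is modeled as Gamma-distributed power, and all parameter values are hypothetical.

```python
import random
random.seed(1)

def outage_probability(n, gamma_th, avg_fso, avg_rf, beta=2.0, m=2.0):
    """Monte-Carlo outage estimate for decode-and-forward relaying:
    outage occurs when min(g1, g2) drops below the threshold gamma_th."""
    out = 0
    for _ in range(n):
        # FSO hop: Weibull-faded power (stand-in for exponentiated Weibull)
        g1 = avg_fso * random.weibullvariate(1.0, beta)
        # RF hop: Nakagami-m fading -> Gamma(m, 1/m)-distributed power
        g2 = avg_rf * random.gammavariate(m, 1.0 / m)
        if min(g1, g2) < gamma_th:
            out += 1
    return out / n

p = outage_probability(200_000, gamma_th=1.0, avg_fso=10.0, avg_rf=10.0)
print(f"estimated outage probability: {p:.4f}")
```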

  14. Clinical Impact of Additional Cytogenetic Aberrations, cKIT and RAS Mutations, and Treatment Elements in Pediatric t(8;21)-AML: Results From an International Retrospective Study by the International Berlin-Frankfurt-Münster Study Group

    PubMed Central

    Klein, Kim; Kaspers, Gertjan; Harrison, Christine J.; Beverloo, H. Berna; Reedijk, Ardine; Bongers, Mathilda; Cloos, Jacqueline; Pession, Andrea; Reinhardt, Dirk; Zimmerman, Martin; Creutzig, Ursula; Dworzak, Michael; Alonzo, Todd; Johnston, Donna; Hirsch, Betsy; Zapotocky, Michal; De Moerloose, Barbara; Fynn, Alcira; Lee, Vincent; Taga, Takashi; Tawa, Akio; Auvrignon, Anne; Zeller, Bernward; Forestier, Erik; Salgado, Carmen; Balwierz, Walentyna; Popa, Alexander; Rubnitz, Jeffrey; Raimondi, Susana; Gibson, Brenda

    2015-01-01

    Purpose This retrospective cohort study aimed to determine the predictive relevance of clinical characteristics, additional cytogenetic aberrations, and cKIT and RAS mutations, as well as to evaluate whether specific treatment elements were associated with outcomes in pediatric t(8;21)-positive patients with acute myeloid leukemia (AML). Patients and Methods Karyotypes of 916 pediatric patients with t(8;21)-AML were reviewed for the presence of additional cytogenetic aberrations, and 228 samples were screened for presence of cKIT and RAS mutations. Multivariable regression models were used to assess the relevance of anthracyclines, cytarabine, and etoposide during induction and overall treatment. End points were the probability of achieving complete remission, cumulative incidence of relapse (CIR), probability of event-free survival, and probability of overall survival. Results Of 838 patients included in final analyses, 92% achieved complete remission. The 5-year overall survival, event-free survival, and CIR were 74%, 58%, and 26%, respectively. cKIT mutations and RAS mutations were not significantly associated with outcome. Patients with deletions of chromosome arm 9q [del(9q); n = 104] had a lower probability of complete remission (P = .01). Gain of chromosome 4 (+4; n = 21) was associated with inferior CIR and survival (P < .01). Anthracycline doses greater than 150 mg/m2 and etoposide doses greater than 500 mg/m2 in the first induction course and high-dose cytarabine 3 g/m2 during induction were associated with better outcomes on various end points. Cumulative doses of cytarabine greater than 30 g/m2 and etoposide greater than 1,500 mg/m2 were associated with lower CIR rates and better probability of event-free survival. Conclusion Pediatric patients with t(8;21)-AML and additional del(9q) or additional +4 might not be considered at good risk. 
Patients with t(8;21)-AML likely benefit from protocols that have high doses of anthracyclines, etoposide, and cytarabine during induction, as well as from protocols comprising cumulative high doses of cytarabine and etoposide. PMID:26573082

  15. Comparison of two fertility-sparing approaches for bilateral borderline ovarian tumours: a randomized controlled study.

    PubMed

    Palomba, S; Zupi, E; Russo, T; Falbo, A; Del Negro, S; Manguso, F; Marconi, D; Tolino, A; Zullo, F

    2007-02-01

    During the childbearing years, the standard fertility-sparing treatment for bilateral borderline ovarian tumours (BOTs) is unilateral oophorectomy plus contralateral cystectomy. The aim of the present study was to compare the effects of two laparoscopic fertility-sparing surgical procedures for the treatment of bilateral BOTs on recurrence and fertility in young women who desire to conceive as soon as possible. Thirty-two women affected by bilateral early-stage BOTs who desired to conceive were randomized to receive bilateral cystectomy (experimental group, n=15) or oophorectomy plus contralateral cystectomy (control group, n=17). At the first recurrence after childbearing completion, each patient was treated with non-conservative standard treatment. Recurrences and reproductive events were recorded. After a median follow-up of 81 months (interquartile range 19; range 60-96), the cumulative pregnancy rate (CPR) (14/15 versus 9/17; P=0.003) and the cumulative probability of first pregnancy (P=0.011) were significantly higher in the experimental group than in the control group. No significant difference (P=0.358) between groups was detected in the cumulative probability of first recurrence. Laparoscopic bilateral cystectomy followed by non-conservative treatment at the first recurrence after childbearing completion is an effective surgical strategy for patients with bilateral early-stage BOTs who desire to conceive as soon as possible.

  16. A moment-convergence method for stochastic analysis of biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiajun; Nie, Qing; Zhou, Tianshou, E-mail: mcszhtsh@mail.sysu.edu.cn

    Traditional moment-closure methods need to assume that high-order cumulants of a probability distribution are approximately zero. However, this strong assumption is not satisfied for many biochemical reaction networks. Here, we introduce convergent moments (defined in mathematics as the coefficients in the Taylor expansion of the probability-generating function at some point) to overcome this drawback of the moment-closure methods. As such, we develop a new analysis method for stochastic chemical kinetics. This method provides an accurate approximation for the master probability equation (MPE). In particular, the connection between low-order convergent moments and rate constants can be more easily derived in terms of explicit and analytical forms, allowing insights that would be difficult to obtain through direct simulation or manipulation of the MPE. In addition, it provides an accurate and efficient way to compute steady-state or transient probability distributions, avoiding the algorithmic difficulty associated with stiffness of the MPE due to large differences in the sizes of rate constants. Applications of the method to several systems reveal nontrivial stochastic mechanisms of gene expression dynamics, e.g., intrinsic fluctuations can induce transient bimodality and amplify transient signals, and slow switching between promoter states can increase fluctuations in spatially heterogeneous signals. The overall approach has broad applications in modeling, analysis, and computation of complex biochemical networks with intrinsic noise.
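
    The definition of convergent moments as Taylor coefficients of the probability-generating function can be made concrete with a small illustration (not the authors' method, just the definition): for a Poisson distribution with G(s) = exp(lam*(s-1)), the k-th Taylor coefficient around s = 1 is lam^k / k!, and the sketch below recovers it directly from the probability mass function.

```python
import math

lam = 2.5
N = 80  # truncation of the Poisson support (the tail is negligible)
pmf = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(N)]

def convergent_moment(k, s0=1.0):
    """k-th Taylor coefficient of the probability-generating function
    G(s) = sum_n p_n s^n expanded around s0:
    G^(k)(s0) / k! = sum_n C(n, k) p_n s0^(n-k)."""
    return sum(math.comb(n, k) * pmf[n] * s0**(n - k) for n in range(k, N))

# For a Poisson, G(s) = exp(lam*(s-1)), so the coefficient at s0=1 is lam^k/k!
for k in range(5):
    print(k, convergent_moment(k), lam**k / math.factorial(k))
```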

  17. Study of cumulative fatigue damage detection for used parts with nonlinear output frequency response functions based on NARMAX modelling

    NASA Astrophysics Data System (ADS)

    Huang, Honglan; Mao, Hanying; Mao, Hanling; Zheng, Weixue; Huang, Zhenfeng; Li, Xinxin; Wang, Xianghong

    2017-12-01

    Cumulative fatigue damage detection for used parts plays a key role in remanufacturing engineering and is related to the service safety of the remanufactured parts. In light of the nonlinear properties of used parts caused by cumulative fatigue damage, a detection approach based on nonlinear output frequency response functions (NOFRFs) offers a breakthrough for solving this key problem. First, a modified PSO-adaptive lasso algorithm is introduced to improve the accuracy of the NARMAX model under impulse hammer excitation; then, an effective new algorithm is derived to estimate the NOFRFs under rectangular pulse excitation, and an NOFRF-based index is introduced to detect the cumulative fatigue damage in used parts. On this basis, a novel damage detection approach that integrates the NARMAX model and the rectangular pulse is proposed for NOFRF identification and cumulative fatigue damage detection of used parts. Finally, experimental studies of fatigued plate specimens and used connecting rod parts are conducted to verify the validity of the novel approach. The results reveal that the new approach can detect cumulative fatigue damage in used parts effectively and efficiently, and that the values of the NOFRF-based index can be used to distinguish different degrees of fatigue damage or working time. Since the proposed approach can extract the nonlinear properties of a system from only a single excitation of the inspected system, it shows great promise for remanufacturing engineering applications.

  18. A cumulant functional for static and dynamic correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollett, Joshua W., E-mail: j.hollett@uwinnipeg.ca; Department of Chemistry, University of Manitoba, Winnipeg, Manitoba R3T 2N2; Hosseini, Hessam

    A functional for the cumulant energy is introduced. The functional is composed of a pair-correction component and static and dynamic correlation energy components. The pair-correction and static correlation energies are functionals of the natural orbitals and the occupancy transferred between near-degenerate orbital pairs, rather than of the orbital occupancies themselves. The dynamic correlation energy is a functional of the statically correlated on-top two-electron density. The on-top density functional used in this study is the well-known Colle-Salvetti functional. Using the cc-pVTZ basis set, the functional effectively models the bond dissociation of H2, LiH, and N2, with equilibrium bond lengths and dissociation energies comparable to those provided by multireference second-order perturbation theory. The performance of the cumulant functional is less impressive for HF and F2, mainly due to an underestimation of the dynamic correlation energy by the Colle-Salvetti functional.

  19. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the life of the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on the engine L(sub 0.1) life prediction. However, at a probability of survival of 95 percent (L(sub 5) life), life decreased with increasing Weibull slope.
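
    The series-system rule stated above (system life at a given reliability is below the lowest-lived component's life at that reliability) can be sketched with hypothetical Weibull parameters; the component names and numbers below are illustrative, not the NASA engine data.

```python
import math

# Hypothetical components: (eta = characteristic life in hours, beta = Weibull slope)
components = {"fan": (9000.0, 2.0), "HPT disk": (7000.0, 3.0), "HPT blade": (6000.0, 1.5)}

def survival(t, eta, beta):
    """Weibull survival probability at time t."""
    return math.exp(-((t / eta) ** beta))

def system_survival(t):
    """Series system: every component must survive."""
    p = 1.0
    for eta, beta in components.values():
        p *= survival(t, eta, beta)
    return p

def life_at_reliability(surv, r, hi=1e6):
    """Time t where the decreasing survival curve surv(t) equals r (bisection)."""
    lo_t, hi_t = 0.0, hi
    for _ in range(200):
        mid = 0.5 * (lo_t + hi_t)
        if surv(mid) > r:
            lo_t = mid
        else:
            hi_t = mid
    return 0.5 * (lo_t + hi_t)

L5_system = life_at_reliability(system_survival, 0.95)
L5_components = {name: life_at_reliability(lambda t, e=eta, b=beta: survival(t, e, b), 0.95)
                 for name, (eta, beta) in components.items()}
print("system L5:", round(L5_system, 1))
print("lowest-lived component L5:", round(min(L5_components.values()), 1))
```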

  20. Head Impact Exposure in Youth Football: High School Ages 14 to 18 Years and Cumulative Impact Analysis

    PubMed Central

    Urban, Jillian E.; Davenport, Elizabeth M.; Golman, Adam J.; Maldjian, Joseph A.; Whitlow, Christopher T.; Powers, Alexander K.; Stitzel, Joel D.

    2015-01-01

    Sports-related concussion is the most common athletic head injury, with football having the highest rate among high school athletes. Traditionally, research on the biomechanics of football-related head impact has been focused at the collegiate level. Less research has been performed at the high school level, despite the incidence of concussion among high school football players. The objective of this study is twofold: to quantify head impact exposure in high school football, and to develop a cumulative impact analysis method. Head impact exposure was measured by instrumenting the helmets of 40 high school football players with helmet-mounted accelerometer arrays to measure linear and rotational acceleration. A total of 16,502 head impacts were collected over the course of the season. Biomechanical data were analyzed by team and by player. The median impact for each player ranged from 15.2 to 27.0 g with an average value of 21.7 (±2.4) g. The 95th percentile impact for each player ranged from 38.8 to 72.9 g with an average value of 56.4 (±10.5) g. Next, an impact exposure metric utilizing concussion injury risk curves was created to quantify cumulative exposure for each participating player over the course of the season. Impacts were weighted according to the associated risk due to linear acceleration and rotational acceleration alone, as well as the combined probability (CP) of injury associated with both. These risks were summed over the course of a season to generate risk-weighted cumulative exposure. The impact frequency was found to be greater during games compared to practices, with an average number of impacts per session of 15.5 and 9.4, respectively. However, the median cumulative risk-weighted exposure based on combined probability was found to be greater for practices than for games.
These data will provide a metric that may be used to better understand the cumulative effects of repetitive head impacts, injury mechanisms, and head impact exposure of athletes in football. PMID:23864337
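
    The risk-weighting idea (weight each impact by its injury probability, then sum over the season) can be sketched as follows. Everything here is hypothetical: the logistic coefficients are illustrative placeholders, not the fitted risk curves from the study, and the simple independent combination of the two risks stands in for the study's combined-probability model.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical injury-risk curves (coefficients are illustrative only)
def risk_linear(a_g):            # a_g: peak linear acceleration in g
    return logistic(-9.8 + 0.08 * a_g)

def risk_rotational(a_rads2):    # peak rotational acceleration in rad/s^2
    return logistic(-9.8 + 0.002 * a_rads2)

def combined_probability(a_g, a_rads2):
    """One simple way to combine the two risk estimates: the probability
    that at least one mechanism causes injury, treating them as independent."""
    return 1.0 - (1.0 - risk_linear(a_g)) * (1.0 - risk_rotational(a_rads2))

# A toy season of impacts: (linear g, rotational rad/s^2)
season = [(21.7, 1200.0), (56.4, 2900.0), (90.0, 5200.0)]
rwe = sum(combined_probability(lin, rot) for lin, rot in season)
print(f"risk-weighted cumulative exposure: {rwe:.4f}")
```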

  1. Effective distances for epidemics spreading on complex networks.

    PubMed

    Iannelli, Flavio; Koher, Andreas; Brockmann, Dirk; Hövel, Philipp; Sokolov, Igor M

    2017-01-01

    We show that the recently introduced logarithmic metrics used to predict disease arrival times on complex networks are approximations of more general network-based measures derived from random-walk theory. Using daily air-traffic transportation data, we perform numerical experiments to compare the infection arrival time with this alternative metric, which is obtained by accounting for multiple walks instead of only the most probable path. The comparison with direct simulations reveals a higher correlation than the shortest-path approach used previously. In addition, our method allows us to connect fundamental observables in epidemic spreading with the cumulant-generating function of the hitting time for a Markov chain. Our results provide a general and computationally efficient approach using only algebraic methods.
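
    The logarithmic shortest-path metric that this work generalizes assigns each edge the length 1 - ln(p), where p is the transition probability, and takes the minimum over paths. A minimal sketch on a toy three-node flux network (the probabilities are hypothetical) is:

```python
import heapq, math

# Toy flux network: fraction of traffic from node m going to node n (hypothetical)
P = {
    "A": {"B": 0.7, "C": 0.3},
    "B": {"A": 0.5, "C": 0.5},
    "C": {"A": 0.2, "B": 0.8},
}

def effective_distance(source):
    """Shortest-path effective distance D_mn = min over paths of the sum of
    (1 - ln p) along the path, computed with Dijkstra's algorithm."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, m = heapq.heappop(heap)
        if d > dist.get(m, math.inf):
            continue
        for n, p in P[m].items():
            nd = d + 1.0 - math.log(p)
            if nd < dist.get(n, math.inf):
                dist[n] = nd
                heapq.heappush(heap, (nd, n))
    return dist

print(effective_distance("A"))
```

    The paper's contribution is to replace this most-probable-path quantity with a multiple-walk measure tied to the hitting-time cumulant-generating function of the corresponding Markov chain.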

  2. Effective distances for epidemics spreading on complex networks

    NASA Astrophysics Data System (ADS)

    Iannelli, Flavio; Koher, Andreas; Brockmann, Dirk; Hövel, Philipp; Sokolov, Igor M.

    2017-01-01

    We show that the recently introduced logarithmic metrics used to predict disease arrival times on complex networks are approximations of more general network-based measures derived from random-walk theory. Using daily air-traffic transportation data, we perform numerical experiments to compare the infection arrival time with this alternative metric, which is obtained by accounting for multiple walks instead of only the most probable path. The comparison with direct simulations reveals a higher correlation than the shortest-path approach used previously. In addition, our method allows us to connect fundamental observables in epidemic spreading with the cumulant-generating function of the hitting time for a Markov chain. Our results provide a general and computationally efficient approach using only algebraic methods.

  3. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
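
    The second element above (evaluating a cumulative distribution function for a structural response from assumed uncertainties in primitive variables, then reading off a failure probability) can be sketched generically. The response model, variables, and limit below are hypothetical stand-ins, not the SSME turbopump analysis.

```python
import random, bisect
random.seed(0)

def blade_stress():
    """Toy response model standing in for the structural analysis."""
    pressure = random.gauss(300.0, 15.0)      # chamber pressure (illustrative)
    speed    = random.gauss(37000.0, 500.0)   # pump speed, rpm (illustrative)
    return 0.4 * pressure + 1e-6 * speed**2

samples = sorted(blade_stress() for _ in range(50_000))

def cdf(x):
    """Empirical cumulative distribution function of the sampled response."""
    return bisect.bisect_right(samples, x) / len(samples)

limit = 1520.0  # hypothetical allowable stress
p_fail = 1.0 - cdf(limit)
print(f"P(stress > {limit}) = {p_fail:.4f}")
```

    The probabilistic structural analysis methods in the program avoid brute-force sampling like this, but the CDF and failure-probability outputs they produce play the same role.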

  4. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCFs) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCFs, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCFs in three different composite laminates. Simulated results match experimental data for probability density and cumulative distribution functions. The sensitivity factors indicate that the SCFs are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  5. Generic finite size scaling for discontinuous nonequilibrium phase transitions into absorbing states

    NASA Astrophysics Data System (ADS)

    de Oliveira, M. M.; da Luz, M. G. E.; Fiore, C. E.

    2015-12-01

    Based on quasistationary distribution ideas, a general finite size scaling theory is proposed for discontinuous nonequilibrium phase transitions into absorbing states. Analogously to the equilibrium case, we show that quantities such as response functions, cumulants, and equal area probability distributions all scale with the volume, thus allowing proper estimates for the thermodynamic limit. To illustrate these results, five very distinct lattice models displaying nonequilibrium transitions—to single and infinitely many absorbing states—are investigated. The innate difficulties in analyzing absorbing phase transitions are circumvented through quasistationary simulation methods. Our findings (allied to numerical studies in the literature) strongly point to a unifying discontinuous phase transition scaling behavior for equilibrium and this important class of nonequilibrium systems.

  6. Exact Large-Deviation Statistics for a Nonequilibrium Quantum Spin Chain

    NASA Astrophysics Data System (ADS)

    Žnidarič, Marko

    2014-01-01

    We consider a one-dimensional XX spin chain in a nonequilibrium setting with Lindblad-type boundary driving. By calculating the large-deviation rate function in the thermodynamic limit, a generalization of the free energy to a nonequilibrium setting, we obtain the complete distribution of the current, including closed expressions for lower-order cumulants. We also identify two phase-transition-like behaviors: in the thermodynamic limit, at which the current probability distribution becomes discontinuous, and at maximal driving, when the range of possible current values changes discontinuously. In the thermodynamic limit the current has finite upper and lower bounds. We also explicitly confirm the nonequilibrium fluctuation relation and show that the current distribution is the same under the mapping of the coupling strength Γ→1/Γ.

  7. Probabilistic Component Mode Synthesis of Nondeterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1996-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. We present a method in which the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented to illustrate the theory.

  8. Cumulants and correlation functions versus the QCD phase diagram

    DOE PAGES

    Bzdak, Adam; Koch, Volker; Strodthoff, Nils

    2017-05-12

    Here, we discuss the relation between particle number cumulants and correlation functions. It is argued that measuring couplings of the genuine multiparticle correlation functions could provide cleaner information on possible nontrivial dynamics in heavy-ion collisions. We also extract integrated multiproton correlation functions from the presently available experimental data on proton cumulants. We find that the STAR data contain significant four-proton correlations, at least at the lower energies, with an indication of changing dynamics in central collisions. We also find that these correlations are rather long-ranged in rapidity. Finally, using the Ising model, we demonstrate how the signs of the multiproton correlation functions may be used to exclude certain regions of the phase diagram close to the critical point.
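
    As background for the cumulant-based measurements discussed here, the standard relations between raw moments and the first four cumulants can be illustrated on a distribution where all cumulants are known: for a Poisson distribution every cumulant equals the mean. This is a generic illustration, not the paper's cumulant-to-correlation extraction.

```python
import math

lam = 3.0
N = 100  # truncate the Poisson support (the tail is negligible)
pmf = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(N)]
m = [sum(p * n**k for n, p in enumerate(pmf)) for k in range(5)]  # raw moments

# Standard moment-to-cumulant relations for C1..C4
C1 = m[1]
C2 = m[2] - m[1]**2
C3 = m[3] - 3*m[1]*m[2] + 2*m[1]**3
C4 = m[4] - 4*m[3]*m[1] - 3*m[2]**2 + 12*m[2]*m[1]**2 - 6*m[1]**4

print([round(c, 6) for c in (C1, C2, C3, C4)])  # all equal lam for a Poisson
```

    Deviations of measured proton-number cumulants from this Poisson baseline are precisely what signal the genuine multiparticle correlations discussed in the abstract.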

  9. Cumulants and correlation functions versus the QCD phase diagram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bzdak, Adam; Koch, Volker; Strodthoff, Nils

    Here, we discuss the relation between particle number cumulants and correlation functions. It is argued that measuring couplings of the genuine multiparticle correlation functions could provide cleaner information on possible nontrivial dynamics in heavy-ion collisions. We also extract integrated multiproton correlation functions from the presently available experimental data on proton cumulants. We find that the STAR data contain significant four-proton correlations, at least at the lower energies, with an indication of changing dynamics in central collisions. We also find that these correlations are rather long-ranged in rapidity. Finally, using the Ising model, we demonstrate how the signs of the multiproton correlation functions may be used to exclude certain regions of the phase diagram close to the critical point.

  10. Cumulative hazard: The case of nuisance flooding

    NASA Astrophysics Data System (ADS)

    Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.

    2017-02-01

    The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.

  11. Survival potential of Phytophthora infestans sporangia in relation to meteorological factors

    USDA-ARS?s Scientific Manuscript database

    Assessment of meteorological factors coupled with sporangia survival curves may enhance effective management of potato late blight, caused by Phytophthora infestans. We utilized a non-parametric density estimation approach to evaluate the cumulative probability of occurrence of temperature and relat...

  12. TESTING FOR DIFFERENCES BETWEEN CUMULATIVE DISTRIBUTION FUNCTIONS FROM COMPLEX ENVIRONMENTAL SAMPLING SURVEYS

    EPA Science Inventory

    The U.S. Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP) employs the cumulative distribution function (cdf) to measure the status of quantitative variables for resources of interest. The ability to compare cdf's for a resource from, say,...

  13. Natural history of major complications in hepatitis C virus-related cirrhosis evaluated by per-rectal portal scintigraphy

    PubMed Central

    Kawamura, Etsushi; Habu, Daiki; Hayashi, Takehiro; Oe, Ai; Kotani, Jin; Ishizu, Hirotaka; Torii, Kenji; Kawabe, Joji; Fukushima, Wakaba; Tanaka, Takashi; Nishiguchi, Shuhei; Shiomi, Susumu

    2005-01-01

    AIM: To examine the correlation between the porto-systemic hypertension evaluated by portal shunt index (PSI) and life-threatening complications, including hepatocellular carcinoma (HCC), liver failure (Child-Pugh stage progression), and esophagogastric varices. METHODS: Two hundred and twelve consecutive subjects with HCV-related cirrhosis (LC-C) underwent per-rectal portal scintigraphy. They were allocated into three groups according to their PSI: group I, PSI ≤ 10%; group II, 10%

  14. Older adults' transportation walking: a cross-sectional study on the cumulative influence of physical environmental factors.

    PubMed

    Van Cauwenberg, Jelle; Clarys, Peter; De Bourdeaudhuij, Ilse; Van Holle, Veerle; Verté, Dominique; De Witte, Nico; De Donder, Liesbeth; Buffel, Tine; Dury, Sarah; Deforche, Benedicte

    2013-08-14

    The physical environment may play a crucial role in promoting older adults' walking for transportation. However, previous studies on relationships between the physical environment and older adults' physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults' walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. The sample comprised 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (the number of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. For short distance to destinations, the probability of daily walking for transportation was significantly higher when seven compared to three, four or five favorable environmental factors were present. For medium distance to destinations, probabilities significantly increased for an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances to facilities.
Future research should focus upon the relationship between older adults' physical activity and multiple environmental factors simultaneously instead of separately.
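
    The study design (predicted probability of daily walking as a logistic function of an environmental index, with distance to destinations moderating the relationship) can be sketched as below. All coefficients are hypothetical placeholders chosen only to reproduce the qualitative pattern reported, not the fitted multilevel model.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical coefficients (for illustration only, not the fitted model)
INTERCEPT = -1.2
SLOPE_BY_DISTANCE = {"short": 0.25, "medium": 0.20, "large": 0.0}

def p_daily_walking(env_index, distance):
    """Predicted probability of daily transportation walking for an
    environmental index of 0-7 favorable factors, moderated by distance."""
    return logistic(INTERCEPT + SLOPE_BY_DISTANCE[distance] * env_index)

for d in ("short", "medium", "large"):
    print(d, [round(p_daily_walking(i, d), 3) for i in (0, 3, 7)])
```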

  15. Cumulative Probability and Time to Reintubation in U.S. ICUs.

    PubMed

    Miltiades, Andrea N; Gershengorn, Hayley B; Hua, May; Kramer, Andrew A; Li, Guohua; Wunsch, Hannah

    2017-05-01

    Reintubation after liberation from mechanical ventilation is viewed as an adverse event in ICUs. We sought to describe the frequency of reintubation across U.S. ICUs and to propose a standard, appropriate time cutoff for the reporting of reintubation events. We conducted a cohort study using data from the Project IMPACT database of 185 diverse ICUs in the United States. We included patients who received mechanical ventilation and excluded patients who received a tracheostomy, had a do-not-resuscitate order placed, or died prior to first extubation. We assessed the percentage of extubated patients who were reintubated; the cumulative probability of reintubation, with death and do-not-resuscitate orders after extubation modeled as competing risks; and the time to reintubation. Among 98,367 patients who received mechanical ventilation without death or tracheostomy prior to extubation, 9,907 (10.1%) were reintubated, with a cumulative probability of 10.0%. Median time to reintubation was 15 hours (interquartile range, 2-45 hr). Of patients who required reintubation in the ICU, 90% did so within the first 96 hours after initial extubation; this was consistent across patient subtypes (from 89.3% for elective surgical patients up to 94.8% for trauma patients) and ICU subtypes (from 88.6% for cardiothoracic ICUs to 93.5% for medical ICUs). The reintubation rate for ICU patients liberated from mechanical ventilation in U.S. ICUs is approximately 10%. We propose a time cutoff of 96 hours for reintubation definitions and benchmarking efforts, as it captures 90% of ICU reintubation events. Reintubation rates can be reported as simple percentages, without regard for deaths or changes in goals of care that might occur.
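
    The competing-risks construction described above can be sketched with an Aalen-Johansen-style estimator on toy data (hypothetical follow-up records, not the Project IMPACT database): reintubation is the event of interest, death or a do-not-resuscitate order is the competing risk, and the 96-hour cutoff is the fraction of reintubations captured by that window.

```python
def cumulative_incidence(records, event_of_interest="reintubation"):
    """Aalen-Johansen cumulative incidence with a competing risk.

    records: list of (time, status); status is 'reintubation',
             'death_or_dnr' (competing risk), or 'censored'.
    Returns (time, CIF) pairs after each event of interest.
    """
    records = sorted(records)
    n_at_risk = len(records)
    surv = 1.0   # overall event-free survival just before each time
    cif = 0.0
    out = []
    for t, status in records:
        if status != "censored":
            h = 1.0 / n_at_risk            # cause-specific hazard increment
            if status == event_of_interest:
                cif += surv * h
                out.append((t, cif))
            surv *= (1.0 - h)
        n_at_risk -= 1
    return out

# Toy follow-up times in hours after extubation (hypothetical data)
data = [(2, "reintubation"), (5, "death_or_dnr"), (15, "reintubation"),
        (40, "censored"), (45, "reintubation"), (120, "reintubation"),
        (150, "censored"), (200, "censored")]

curve = cumulative_incidence(data)
within_96 = sum(1 for t, s in data if s == "reintubation" and t <= 96)
total = sum(1 for _, s in data if s == "reintubation")
print("CIF:", [(t, round(c, 3)) for t, c in curve])
print(f"reintubations within 96 h: {within_96}/{total}")
```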

  16. Older adults’ transportation walking: a cross-sectional study on the cumulative influence of physical environmental factors

    PubMed Central

    2013-01-01

Background The physical environment may play a crucial role in promoting older adults’ walking for transportation. However, previous studies on relationships between the physical environment and older adults’ physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults’ walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. Methods The sample comprised 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (a sum of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. Results For short distance to destinations, the probability of daily walking for transportation was significantly higher when seven favorable environmental factors were present, compared to three, four, or five. For medium distance to destinations, probabilities significantly increased for an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. 
Conclusions Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances to facilities. Future research should focus upon the relationship between older adults’ physical activity and multiple environmental factors simultaneously instead of separately. PMID:23945285

  17. Setting Climate Mitigation Targets in the Face of Uncertainty

    NASA Astrophysics Data System (ADS)

    Raupach, M. R.

    2012-12-01

    Uncertainty in climate science is well known, and at least some of it may be irreducible. However, the presence of uncertainty increases the urgency of action rather than reducing it. For the purpose of setting climate targets, earth system models can be construed as giant transfer functions mapping anthropogenic forcings to climate responses. Recent work [Allen et al. 2009, Nature 458, and related papers] has shown that the broad structure of this mapping is captured by a near-linear relationship T = αQ between cumulative CO2 emissions (Q) and global warming (T), both measured from the start of the industrial era. The slope α is about 1.8 K/EgC (67% probability range 1.2 to 2.7), or 1.8 degrees per trillion tonnes of carbon, both for the recent past and also for future projections. Near-linearity occurs because of compensating interactions between CO2 emissions trajectories, emissions of non-CO2 gases, and nonlinear carbon-climate dynamics. The implication is that an all-time quota of about 1100 PgC of carbon can be emitted before T = 2 K warming is exceeded, with median (50%) probability. Half of this quota (550 PgC) has been emitted already through the industrial era (since 1750). Accounting for the need to turn around the present growth in emissions, the eventual decline in emissions to meet a target T = 2 K has to be at more than 5% per year if mitigation starts immediately [Raupach et al. 2011, Tellus B 63]. With delay, the required rate of decline rises rapidly. A 50% chance of meeting a target T = 2 K is inadequate, because of paleoclimatic evidence for destabilising climate feedbacks in response to small changes in forcing. This evidence calls for either or both of two responses: a tougher target such as T = 1 K, or a higher chance of meeting the target. 
It is shown here that (1) the cumulative probability distribution of T at given Q is approximately log-normal; and (2) consequently, the cumulative emission Q needed to stay below a warming T with probability P is Q = (T/α)[1 - sqrt(2π)(ln r)(P - ½)], where r is a spread parameter such that 2/3 of the probability mass lies within a factor (1/r, r) of the median. Current uncertainty estimates for the equilibrium climate sensitivity (a best estimate of 3 K per CO2 doubling, with a 2/3 probability range of 2 to 4.5 K) are consistent with r = 1.5. With this uncertainty, if P is increased from 50% (median) to 80%, then the all-time Q falls from 1100 to 770 PgC, or 220 PgC from 2012 onward (20 years' worth of emissions at current rates). Implications are: (1) as the uncertainty (r) increases with all else fixed, the quota falls; (2) the combination of a warming target T < 2 K with high chance of success is now unreachable. At least for a minimal climate target like T = 2 K with 50% chance of success, the mitigation challenge is still technically possible. However, climate futures will be shaped not only by technology but also by inner human narratives, mental maps and aspirations, including attitudes to risk. The transformation that is needed to meet the climate challenge depends not only on technologies but also on the evolution of self-sustaining narratives.
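As a numerical check of the quoted quota formula, a short sketch (the function name and defaults are illustrative, with α = 1.8 K/EgC and r = 1.5 as stated in the abstract) reproduces the quoted all-time quotas of ~1100 PgC at P = 50% and ~770 PgC at P = 80%:

```python
from math import sqrt, pi, log

def emission_quota_pgc(T, alpha=1.8, r=1.5, P=0.5):
    """Cumulative CO2 emission quota Q (in PgC) that keeps warming below T (K)
    with probability P, per Q = (T/alpha)[1 - sqrt(2*pi) * ln(r) * (P - 1/2)].
    alpha is in K per EgC (1 EgC = 1000 PgC); r is the spread parameter."""
    q_egc = (T / alpha) * (1.0 - sqrt(2.0 * pi) * log(r) * (P - 0.5))
    return 1000.0 * q_egc  # convert EgC to PgC
```

With the defaults, `emission_quota_pgc(2.0)` gives about 1111 PgC and `emission_quota_pgc(2.0, P=0.8)` about 772 PgC, consistent with the rounded figures in the text.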

  18. Optimal screening interval for men with low baseline prostate-specific antigen levels (≤1.0 ng/mL) in a prostate cancer screening program.

    PubMed

    Urata, Satoko; Kitagawa, Yasuhide; Matsuyama, Satoko; Naito, Renato; Yasuda, Kenji; Mizokami, Atsushi; Namiki, Mikio

    2017-04-01

To optimize the rescreening schedule for men with low baseline prostate-specific antigen (PSA) levels, we evaluated men with baseline PSA levels of ≤1.0 ng/mL in PSA-based population screening. We enrolled 8086 men aged 55-69 years with baseline PSA levels of ≤1.0 ng/mL, who were screened annually. The relationships of baseline PSA and age with the cumulative risks and clinicopathological features of screening-detected cancer were investigated. Among the 8086 participants, 28 (0.35%) and 18 (0.22%) were diagnosed with prostate cancer and cancer with a Gleason score (GS) of ≥7 during the observation period, respectively. The cumulative probabilities of prostate cancer at 12 years were 0.42%, 1.0%, 3.4%, and 4.3% in men with baseline PSA levels of 0.0-0.4, 0.5-0.6, 0.7-0.8, and 0.9-1.0 ng/mL, respectively. Those with GS of ≥7 had cumulative probabilities of 0.42%, 0.73%, 2.8%, and 1.9%, respectively. The cumulative probabilities of prostate cancer were significantly lower when baseline PSA levels were 0.0-0.6 ng/mL compared with 0.7-1.0 ng/mL. Prostate cancer with a GS of ≥7 was not detected during the first 10 years of screening when baseline PSA levels were 0.0-0.6 ng/mL and was not detected during the first 2 years when baseline PSA levels were 0.7-1.0 ng/mL. Our study demonstrated that men with baseline PSA levels of 0.0-0.6 ng/mL might benefit from longer screening intervals than those recommended in the guidelines of the Japanese Urological Association. Further investigation is needed to confirm the optimal screening interval for men with low baseline PSA levels.

  19. flexsurv: A Platform for Parametric Survival Modeling in R

    PubMed Central

    Jackson, Christopher H.

    2018-01-01

flexsurv is an R package for fully-parametric modeling of survival data. Any parametric time-to-event distribution may be fitted if the user supplies a probability density or hazard function, and ideally also their cumulative versions. Standard survival distributions are built in, including the three- and four-parameter generalized gamma and F distributions. Any parameter of any distribution can be modeled as a linear or log-linear function of covariates. The package also includes the spline model of Royston and Parmar (2002), in which both baseline survival and covariate effects can be arbitrarily flexible parametric functions of time. The main model-fitting function, flexsurvreg, uses the familiar syntax of survreg from the standard survival package (Therneau 2016). Censoring and left-truncation are specified in ‘Surv’ objects. The models are fitted by maximizing the full log-likelihood, and estimates and confidence intervals for any function of the model parameters can be printed or plotted. flexsurv also provides functions for fitting and predicting from fully-parametric multi-state models, and connects with the mstate package (de Wreede, Fiocco, and Putter 2011). This article explains the methods and design principles of the package, giving several worked examples of its use. PMID:29593450

  20. Structural variations in prefrontal cortex mediate the relationship between early childhood stress and spatial working memory

    PubMed Central

    Hanson, Jamie L.; Chung, Moo K.; Avants, Brian B.; Rudolph, Karen D.; Shirtcliff, Elizabeth A.; Gee, James C.; Davidson, Richard J.; Pollak, Seth D.

    2012-01-01

    A large corpus of research indicates exposure to stress impairs cognitive abilities, specifically executive functioning dependent on the prefrontal cortex (PFC). We collected structural MRI scans (n=61), well-validated assessments of executive functioning, and detailed interviews assessing stress exposure in humans, to examine whether cumulative life stress affected brain morphometry and one type of executive functioning, spatial working memory, during adolescence—a critical time of brain development and reorganization. Analysis of variations in brain structure revealed that cumulative life stress and spatial working memory were related to smaller volumes in the PFC, specifically prefrontal gray and white matter between the anterior cingulate and the frontal poles. Mediation analyses revealed that individual differences in prefrontal volumes accounted for the association between cumulative life stress and spatial working memory. These results suggest that structural changes in the PFC may serve as a mediating mechanism through which greater cumulative life stress engenders decrements in cognitive functioning. PMID:22674267

  1. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    PubMed Central

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines remain in service for prolonged periods, they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed, which enables estimation of the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed. PMID:26167524
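The Monte Carlo procedure described (triangular input distributions, then an empirical cumulative distribution curve) can be sketched as follows. All numeric parameters here are illustrative placeholders, not the turbine's measured values:

```python
import random

def simulate_daily_energy(n=10000, seed=42):
    """Monte Carlo sketch: sample daily energy output (kWh) from a triangular
    distribution with hypothetical (min, max, mode) parameters, then build
    the empirical cumulative distribution (ogive).
    Returns (sorted samples, list of (x, P(energy <= x)) pairs)."""
    rng = random.Random(seed)
    # random.triangular takes (low, high, mode); the values are placeholders
    samples = sorted(rng.triangular(5.0, 60.0, 25.0) for _ in range(n))
    cdf = [(x, (i + 1) / n) for i, x in enumerate(samples)]
    return samples, cdf
```

Reading the ogive at a given energy value then gives the simulated probability of obtaining at most that daily output.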

  3. A novel method for correcting scanline-observational bias of discontinuity orientation

    PubMed Central

    Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong

    2016-01-01

    Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (and thus for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, a suitable observed sample size is suggested, based on effectiveness tests for different distribution types, dispersions, and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249
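The second procedure above ends in a one-sample Kolmogorov-Smirnov test. A minimal stdlib sketch of the K-S statistic (the test statistic only, not the paper's orientation-correction solutions) compares an empirical CDF against a candidate theoretical CDF:

```python
def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest absolute gap
    between the empirical CDF of `samples` and a theoretical CDF given as a
    callable. Checks the empirical CDF both just before and at each point."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        d = max(d, abs((i + 1) / n - fx), abs(i / n - fx))
    return d
```

A small statistic indicates the corrected orientations are consistent with the candidate distribution; in practice one would compare it against the K-S critical value for the sample size.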

  4. An efficient distribution method for nonlinear transport problems in stochastic porous media

    NASA Astrophysics Data System (ADS)

    Ibrahima, F.; Tchelepi, H.; Meyer, D. W.

    2015-12-01

    Because geophysical data are inexorably sparse and incomplete, stochastic treatments of simulated responses are convenient to explore possible scenarios and assess risks in subsurface problems. In particular, understanding how uncertainties propagate in porous media with nonlinear two-phase flow is essential, yet challenging, in reservoir simulation and hydrology. We give a computationally efficient and numerically accurate method to estimate the one-point probability density (PDF) and cumulative distribution functions (CDF) of the water saturation for the stochastic Buckley-Leverett problem when the probability distributions of the permeability and porosity fields are available. The method draws inspiration from the streamline approach and expresses the distributions of interest essentially in terms of an analytically derived mapping and the distribution of the time of flight. In a large class of applications the latter can be estimated at low computational costs (even via conventional Monte Carlo). Once the water saturation distribution is determined, any one-point statistics thereof can be obtained, especially its average and standard deviation. Moreover, rarely available in other approaches, yet crucial information such as the probability of rare events and saturation quantiles (e.g. P10, P50 and P90) can be derived from the method. We provide various examples and comparisons with Monte Carlo simulations to illustrate the performance of the method.

  5. Ozone-surface interactions: Investigations of mechanisms, kinetics, mass transport, and implications for indoor air quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison, Glenn Charles

    1999-12-01

    In this dissertation, results are presented of laboratory investigations and mathematical modeling efforts designed to better understand the interactions of ozone with surfaces. In the laboratory, carpet and duct materials were exposed to ozone, and ozone uptake kinetics and ozone-induced emissions of volatile organic compounds were measured. To understand the results of the experiments, mathematical methods were developed to describe dynamic indoor aldehyde concentrations, mass transport of reactive species to smooth surfaces, the equivalent reaction probability of whole carpet due to the surface reactivity of fibers and carpet backing, and ozone aging of surfaces. Carpets, separated carpet fibers, and separated carpet backing all tended to release aldehydes when exposed to ozone. Secondary emissions were mostly n-nonanal and several other smaller aldehydes. The pattern of emissions suggested that vegetable oils may be precursors for these oxidized emissions. Several possible precursors are discussed, along with experiments in which linseed and tung oils were tested for their secondary-emission potential. Dynamic emission rates of 2-nonenal from a residential carpet may indicate that intermediate species in the oxidation of conjugated olefins can significantly delay aldehyde emissions and act as a reservoir for these compounds. The ozone-induced emission rate of 2-nonenal, a very odorous compound, can result in odorous indoor concentrations for several years. Surface ozone reactivity, a key parameter in determining the flux of ozone to a surface, is parameterized by the reaction probability, which is simply the probability that an ozone molecule will be irreversibly consumed when it strikes a surface. In laboratory studies of two residential and two commercial carpets, the ozone reaction probability for carpet fibers, the reaction probability for carpet backing, and the equivalent reaction probability for whole carpet were determined. 
Typical reaction probability values for these materials were 10⁻⁷, 10⁻⁵, and 10⁻⁵, respectively. To understand how internal surface area influences the equivalent reaction probability of whole carpet, a model of ozone diffusion into and reaction with internal carpet components was developed. This was used to predict apparent reaction probabilities for carpet and was combined with a modified model of turbulent mass transfer, developed by Liu et al., to predict deposition rates and indoor ozone concentrations. The model predicts that carpet should have an equivalent reaction probability of about 10⁻⁵, matching laboratory measurements of the reaction probability. For both carpet and duct materials, surfaces become progressively quenched (aging), losing the ability to react or otherwise take up ozone. The functional form of aging was evaluated, and the reaction probability was found to follow a power function with respect to the cumulative uptake of ozone. To understand ozone aging of surfaces, several mathematical descriptions of aging were developed, based on two different mechanisms. The observed functional form of aging is mimicked by a model that describes ozone diffusion with internal reaction in a solid. The fleecy nature of carpet materials, in combination with the model of ozone diffusion below a fiber surface and internal reaction, may explain the functional form and the magnitude of the power-function parameters observed in ozone interactions with carpet. The ozone-induced aldehyde emissions measured from duct materials were combined with an indoor air quality model to show that concentrations of aldehydes indoors may approach odorous levels. Ducts, however, are unlikely to be a significant sink for ozone, due to their low reaction probability in combination with the short residence time of air in ducts.

  6. Retrogressive hydration of calc-silicate xenoliths in the eastern Bushveld complex: evidence for late magmatic fluid movement

    NASA Astrophysics Data System (ADS)

    Wallmach, T.; Hatton, C. J.; De Waal, S. A.; Gibson, R. L.

    1995-11-01

    Two calc-silicate xenoliths in the Upper Zone of the Bushveld complex contain mineral assemblages which permit delineation of the metamorphic path followed after incorporation of the xenoliths into the magma. Peak metamorphism in these xenoliths occurred at T = 1100-1200 °C and P < 1.5 kbar. Retrograde metamorphism, probably coinciding with the late magmatic stage, is characterized by the breakdown of akermanite to monticellite and wollastonite at 700 °C and the growth of vesuvianite from melilite. The latter implies that water-rich fluids (XCO₂ < 0.2) were present and probably circulating through the cooling magmatic pile. In contrast, calc-silicate xenoliths within the lower zones of the Bushveld complex, namely in the Marginal and Critical Zones, also contain melilite, monticellite and additional periclase with only rare development of vesuvianite. This suggests that the Upper Zone cumulate pile was much 'wetter' in the late-magmatic stage than the earlier-formed Critical and Marginal Zone cumulate piles.

  7. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates calculates the statistical value that corresponds to a given cumulative probability, given the sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points extends and generates a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication, or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA curve-fits data to a linear equation y = f(x) and performs an ANOVA to check its significance.
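The two normal-distribution utilities described can be reproduced with the Python standard library. The function names and interfaces below are illustrative sketches, not the spreadsheet's actual implementation:

```python
from statistics import NormalDist

def value_at_cumulative_prob(p, mean, stdev):
    """'Normal Distribution Estimates' analogue: the x value whose cumulative
    probability is p under Normal(mean, stdev), i.e. the inverse CDF."""
    return NormalDist(mean, stdev).inv_cdf(p)

def normal_from_two_points(x1, p1, x2, p2):
    """'Normal Distribution from Two Data Points' analogue: recover
    (mean, stdev) from two values and their cumulative probabilities,
    using x = mean + stdev * z(p) at each point."""
    z1 = NormalDist().inv_cdf(p1)
    z2 = NormalDist().inv_cdf(p2)
    stdev = (x2 - x1) / (z2 - z1)
    return x1 - stdev * z1, stdev
```

With the fitted (mean, stdev), the full cumulative distribution can then be generated via `NormalDist(mean, stdev).cdf(x)`.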

  8. Revealing modified gravity signals in matter and halo hierarchical clustering

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Koyama, Kazuya; Bose, Benjamin; Zhao, Gong-Bo

    2017-07-01

    We use a set of N-body simulations employing a modified gravity (MG) model with Vainshtein screening to study matter and halo hierarchical clustering. As test-case scenarios we consider two normal branch Dvali-Gabadadze-Porrati (nDGP) gravity models with mild and strong growth rate enhancement. We study higher-order correlation functions ξn(R ) up to n =9 and associated reduced cumulants Sn(R )≡ξn(R )/σ (R )2n -2. We find that the matter probability distribution functions are strongly affected by the fifth force on scales up to 50 h-1 Mpc , and the deviations from general relativity (GR) are maximized at z =0 . For reduced cumulants Sn, we find that at small scales R ≤6 h-1 Mpc the MG is characterized by lower values, with the deviation growing from 7% in the reduced skewness up to even 40% in S5. To study the halo clustering we use a simple abundance matching and divide haloes into three fixed number density samples. The halo two-point functions are weakly affected, with a relative boost of the order of a few percent appearing only at the smallest pair separations (r ≤5 h-1 Mpc ). In contrast, we find a strong MG signal in Sn(R )'s, which are enhanced compared to GR. The strong model exhibits a >3 σ level signal at various scales for all halo samples and in all cumulants. In this context, we find the reduced kurtosis to be an especially promising cosmological probe of MG. Even the mild nDGP model leaves a 3 σ imprint at small scales R ≤3 h-1 Mpc , while the stronger model deviates from a GR signature at nearly all scales with a significance of >5 σ . Since the signal is persistent in all halo samples and over a range of scales, we advocate that the reduced kurtosis estimated from galaxy catalogs can potentially constitute a strong MG-model discriminator as well as a GR self-consistency test.
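The reduced cumulants Sn ≡ ξn/σ^(2n-2) amount to connected moments of the density contrast normalized by powers of the variance. A stdlib-only toy estimator for S3 (reduced skewness) and S4 (reduced kurtosis) on a 1-D sample; real analyses instead use counts-in-cells over smoothing scales R:

```python
import random

def reduced_cumulants(values):
    """Toy estimator of S_n = kappa_n / sigma^(2n-2) for n = 3, 4, where
    kappa_n are the connected (cumulant) parts of the central moments:
    kappa_3 = mu_3, kappa_4 = mu_4 - 3 * mu_2**2."""
    n = len(values)
    mean = sum(values) / n
    d = [v - mean for v in values]
    m = {k: sum(x**k for x in d) / n for k in (2, 3, 4)}
    k3 = m[3]
    k4 = m[4] - 3.0 * m[2] ** 2
    return {3: k3 / m[2] ** 2, 4: k4 / m[2] ** 3}

# demo on an exponential sample, for which the exact values are S3 = 2, S4 = 6
_rng = random.Random(1)
S = reduced_cumulants([_rng.expovariate(1.0) for _ in range(200000)])
```

For a Gaussian field both S3 and S4 vanish, which is why these ratios are sensitive probes of non-Gaussian (and MG-induced) clustering.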

  9. A Time-Dependent Quantum Dynamics Study of the H2 + CH3 yields H + CH4 Reaction

    NASA Technical Reports Server (NTRS)

    Wang, Dunyou; Kwak, Dochan (Technical Monitor)

    2002-01-01

    We present a time-dependent wave-packet propagation calculation for the H2 + CH3 yields H + CH4 reaction in six degrees of freedom and for zero total angular momentum. Initial-state-selected reaction probabilities for different initial rotational-vibrational states are presented in this study. The cumulative reaction probability (CRP) is obtained by summing over the initial-state-selected reaction probabilities. The energy-shift approximation, which accounts for the contribution of degrees of freedom missing from the 6D calculation, is employed to obtain an approximate full-dimensional CRP. The thermal rate constant is compared with various experimental results.

  10. Risk factors for corneal infiltrative events during continuous wear of silicone hydrogel contact lenses.

    PubMed

    Szczotka-Flynn, Loretta; Lass, Jonathan H; Sethi, Ajay; Debanne, Sara; Benetz, Beth Ann; Albright, Matthew; Gillespie, Beth; Kuo, Jana; Jacobs, Michael R; Rimm, Alfred

    2010-11-01

    This study determined which microbiologic, clinical, demographic, and behavioral factors are associated with corneal infiltrative events (CIEs) during continuous wear of silicone hydrogel (SH) contact lenses. Subjects (n = 205) were fitted with lotrafilcon A lenses for continuous wear and observed for 1 year. The main exposures of interest were corneal staining and bacterial lens contamination. Kaplan-Meier (KM) plots were used to estimate the cumulative unadjusted probability of remaining CIE free, and Cox proportional hazards regression was used to model the hazard of having a CIE, as a function of key predictor variables. The KM-unadjusted cumulative probability of remaining CIE free was 73.3%. Approximately 53% of subjects had repeated episodes of corneal staining (mild or greater), and 11.3% had repeated episodes of moderate or greater corneal staining. Corneal staining was not associated with the development of a CIE. The frequency of substantial bacterial bioburden on worn lenses at the time of a CIE was 64.7%, compared with only 12.2% during uncomplicated wear. The presence of substantial lens bacterial bioburden was associated with the development of a CIE (adjusted hazards ratio [HR], 8.66; 95% confidence interval [CI], 2.88-26.01). Smoking was also associated with a CIE (adjusted HR, 4.13; 95% CI, 1.27-13.45). Corneal staining is common during continuous wear of SH lenses, but it is not associated with the development of a CIE. Smoking and substantial lens bacterial bioburden pose prominent risks of a CIE. In this study, more than 70% of the total risk of CIE in those with substantial lens bioburden is attributable to this exposure. (ClinicalTrials.gov number, NCT00727402).
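The Kaplan-Meier estimate of remaining event-free used above is a product-limit calculation that can be sketched in a few lines. This is a generic estimator, not the study's analysis code, and the toy data in the usage note are hypothetical:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the probability of remaining event-free.
    times: follow-up times; events: 1 if the event (e.g. a CIE) occurred,
    0 if censored. Returns [(t, S(t))] at each distinct event time, where
    S(t) is the running product of (1 - deaths/at_risk)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        n_t, deaths = at_risk, 0
        # group all subjects sharing this follow-up time
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_t
            curve.append((t, surv))
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1])` drops the survival curve to 0.8 at t = 1, then to 0.8 × 2/3 at t = 3 (the censored subject at t = 2 only shrinks the risk set).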

  11. Long-Term Evaluation of Biotronik Linox and Linox(smart) Implantable Cardioverter Defibrillator Leads.

    PubMed

    Good, Eric D; Cakulev, Ivan; Orlov, Michael V; Hirsh, David; Simeles, John; Mohr, Kelly; Moll, Phil; Bloom, Heather

    2016-06-01

    Expert consensus holds that post-market, systematic surveillance of ICD leads is essential to ensure confirmation of adequate lead performance. GALAXY (NCT00836589) and CELESTIAL (NCT00810264) are ongoing multicenter, prospective, non-randomized registries conducted to confirm the long-term safety and reliability of Biotronik leads. ICD and CRT-D patients are followed for Linox and Linox(smart) ICD lead performance and safety for 5 years post-implant. All procedural and system-related adverse events (AEs) were assessed at each follow-up, along with lead electrical parameters. An independent CEC of EPs adjudicated AEs to determine AE category and lead relatedness. The analysis used categories of lead observations per ISO 5841-2 (Third edition). A total of 3,933 leads were implanted in 3,840 patients (73.0% male, mean age 67.0 ± 12.2 years) at 146 US centers. The estimated cumulative survival probability was 96.3% at 5 years after implant for Linox leads and 96.6% at 4 years after implant for Linox(smart) leads. A comparison of the Linox and Linox(smart) survival functions did not find evidence of a difference (P = 0.2155). The most common AEs were oversensing (23, 0.58%), conductor fracture (14, 0.36%), failure to capture (13, 0.33%), lead dislodgement (12, 0.31%), insulation breach (10, 0.25%), and abnormal pacing impedance (8, 0.20%). Linox and Linox(smart) ICD leads are safe, reliable and infrequently associated with lead-related AEs. Additionally, estimated cumulative survival probability is clinically acceptable and well within industry standards. Ongoing data collection will confirm the longer-term safety and performance of the Linox family of ICD leads. © 2016 Wiley Periodicals, Inc.

  12. Risk Factors for Corneal Infiltrative Events during Continuous Wear of Silicone Hydrogel Contact Lenses

    PubMed Central

    Lass, Jonathan H.; Sethi, Ajay; Debanne, Sara; Benetz, Beth Ann; Albright, Matthew; Gillespie, Beth; Kuo, Jana; Jacobs, Michael R.; Rimm, Alfred

    2010-01-01

    Purpose. This study determined which microbiologic, clinical, demographic, and behavioral factors are associated with corneal infiltrative events (CIEs) during continuous wear of silicone hydrogel (SH) contact lenses. Methods. Subjects (n = 205) were fitted with lotrafilcon A lenses for continuous wear and observed for 1 year. The main exposures of interest were corneal staining and bacterial lens contamination. Kaplan-Meier (KM) plots were used to estimate the cumulative unadjusted probability of remaining CIE free, and Cox proportional hazards regression was used to model the hazard of having a CIE, as a function of key predictor variables. Results. The KM-unadjusted cumulative probability of remaining CIE free was 73.3%. Approximately 53% of subjects had repeated episodes of corneal staining (mild or greater), and 11.3% had repeated episodes of moderate or greater corneal staining. Corneal staining was not associated with the development of a CIE. The frequency of substantial bacterial bioburden on worn lenses at the time of a CIE was 64.7%, compared with only 12.2% during uncomplicated wear. The presence of substantial lens bacterial bioburden was associated with the development of a CIE (adjusted hazards ratio [HR], 8.66; 95% confidence interval [CI], 2.88–26.01). Smoking was also associated with a CIE (adjusted HR, 4.13; 95% CI, 1.27–13.45). Conclusions. Corneal staining is common during continuous wear of SH lenses, but it is not associated with the development of a CIE. Smoking and substantial lens bacterial bioburden pose prominent risks of a CIE. In this study, more than 70% of the total risk of CIE in those with substantial lens bioburden is attributable to this exposure. (ClinicalTrials.gov number, NCT00727402). PMID:20538985

  13. Development of visual field defect after first-detected optic disc hemorrhage in preperimetric open-angle glaucoma.

    PubMed

    Kim, Hae Jin; Song, Yong Ju; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-07-01

    To evaluate functional progression in preperimetric glaucoma (PPG) with disc hemorrhage (DH) and to determine the time interval between the first-detected DH and development of glaucomatous visual field (VF) defect. A total of 87 patients who had been first diagnosed with PPG were enrolled. The medical records of PPG patients without DH (Group 1) and with DH (Group 2) were reviewed. When glaucomatous VF defect appeared, the time interval from the diagnosis of PPG to the development of VF defect was calculated and compared between the two groups. In Group 2, the time intervals from the first-detected DH to VF defect were compared between single-DH and recurrent-DH patients. Of the enrolled patients, 45 had DH in the preperimetric stage. The median time interval from the diagnosis of PPG to the development of VF defect was 73.3 months in Group 1, versus 45.4 months in Group 2 (P = 0.042). The cumulative probability of development of VF defect after diagnosis of PPG was significantly greater in Group 2 than in Group 1. The median time interval from first-detected DH to the development of VF defect was 37.8 months. The median time interval from DH to VF defect and the cumulative probability of VF defect after DH did not differ significantly between single-DH and recurrent-DH patients. The median time interval between the diagnosis of PPG and the development of VF defect was significantly shorter in PPG with DH. The VF defect appeared 37.8 months after the first-detected DH in PPG.

  14. Surface slip during large Owens Valley earthquakes

    NASA Astrophysics Data System (ADS)

    Haddon, E. K.; Amos, C. B.; Zielke, O.; Jayko, A. S.; Bürgmann, R.

    2016-06-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ˜1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ˜0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ˜6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7-11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ˜7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ˜0.6 and 1.6 mm/yr (1σ) over the late Quaternary.
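    Stacking per-measurement PDFs into a COPD can be sketched as follows, assuming a Gaussian PDF for each offset for simplicity (the authors' tool produces uniquely shaped PDFs per measurement); all offsets and uncertainties here are illustrative:

```python
import numpy as np

def copd(offsets, sigmas, grid):
    """Stack per-measurement Gaussian PDFs into a cumulative
    offset probability distribution (COPD) on a common grid."""
    pdf_sum = np.zeros_like(grid)
    for mu, s in zip(offsets, sigmas):
        pdf_sum += np.exp(-0.5 * ((grid - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return pdf_sum

# Hypothetical lateral offsets (m) clustering near single- and two-event slip
offsets = [3.1, 3.4, 3.3, 7.0, 7.2]
sigmas  = [0.3, 0.3, 0.3, 0.5, 0.5]
grid = np.linspace(0.0, 10.0, 1001)
stack = copd(offsets, sigmas, grid)
peak_offset = grid[np.argmax(stack)]   # dominant (single-event) COPD peak
```

    Peaks in the stacked curve mark offset values shared by many landforms, which is how single- and multiple-event displacements are identified along strike.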

  15. Ipsilateral femoral autograft reconstruction after resection of a pelvic tumor.

    PubMed

    Biau, David J; Thévenin, Fabrice; Dumaine, Valérie; Babinet, Antoine; Tomeno, Bernard; Anract, Philippe

    2009-01-01

    Reconstruction of bone after the resection of a pelvic tumor is challenging. The purpose of the present study was to evaluate the use of the ipsilateral femur as the graft material for reconstruction. We performed a retrospective review of thirteen patients with a malignant pelvic lesion who underwent resection followed by reconstruction with an ipsilateral femoral autograft and insertion of a total hip replacement. The study group included nine men and four women with a median age of fifty-one years at the time of the reconstruction. The diagnosis was chondrosarcoma in eight patients, metastasis in three, and myeloma and radiation-induced malignant disease in one each. The surviving patients were assessed functionally and radiographically; the cumulative probability of revision was estimated while taking into account competing risks. The median duration of follow-up was forty-nine months. At the time of the latest follow-up, seven patients were alive and disease-free and six had died from metastatic disease. Four patients had had revision of the reconstruction, two for the treatment of mechanical complications and two for the treatment of infection. Three other patients had mechanical complications but had not had a revision. The cumulative probability of revision of the reconstruction for mechanical failure was 8% (95% confidence interval, 0% to 23%), 8% (95% confidence interval, 0% to 23%), and 16% (95% confidence interval, 0% to 39%) at one, two, and four years, respectively. Although it has attendant complications consistent with pelvic tumor surgery, an ipsilateral femoral autograft reconstruction may be an option for reconstruction of pelvic discontinuity in a subgroup of patients following tumor resection. This innovative procedure requires longer-term follow-up studies.

  16. Evaluation of tranche in securitization and long-range Ising model

    NASA Astrophysics Data System (ADS)

    Kitsukawa, K.; Mori, S.; Hisakado, M.

    2006-08-01

    This econophysics work studies the long-range Ising model of a finite system with N spins, exchange interaction J/N, and external field H as a model for a homogeneous credit portfolio of assets with default probability P_d and default correlation ρ_d. Based on the discussion of the (J,H) phase diagram, we develop a perturbative calculation method for the model and obtain explicit expressions for P_d, ρ_d and the normalization factor Z in terms of the model parameters N and J,H. The effect of the default correlation ρ_d on the probabilities P(N_d, ρ_d) for N_d defaults and on the cumulative distribution function D(i, ρ_d) is discussed. The latter represents the average loss rate of a "tranche" (layered structure) of securities (e.g. CDOs), which are synthesized from a pool of many assets. We show that the expected loss rate of the subordinated tranche decreases with ρ_d while that of the senior tranche increases linearly, which is important in their pricing and rating.
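    The qualitative behaviour described here, subordinated-tranche loss falling and senior-tranche loss rising with default correlation, can be illustrated with a one-factor Gaussian-copula simulation; this is a standard stand-in, not the paper's Ising formulation, and all parameters below are illustrative:

```python
import random
import statistics

def tranche_loss_rate(pd, rho, attach, detach, n_assets=100, n_sims=4000, seed=7):
    """Expected normalized loss of the [attach, detach) tranche under a
    one-factor Gaussian copula (a stand-in for the Ising formulation)."""
    nd = statistics.NormalDist()
    k = nd.inv_cdf(pd)                      # default threshold
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)             # common (systemic) factor
        defaults = sum(
            1 for _ in range(n_assets)
            if (rho ** 0.5) * z + ((1 - rho) ** 0.5) * rng.gauss(0.0, 1.0) < k
        )
        loss = defaults / n_assets          # portfolio loss rate
        # loss absorbed by this tranche, normalized by its width
        total += min(max(loss - attach, 0.0), detach - attach) / (detach - attach)
    return total / n_sims

# Higher default correlation shifts loss from the equity to the senior tranche
eq_lo = tranche_loss_rate(0.05, 0.05, attach=0.0, detach=0.1)
eq_hi = tranche_loss_rate(0.05, 0.5,  attach=0.0, detach=0.1)
sr_lo = tranche_loss_rate(0.05, 0.05, attach=0.2, detach=1.0)
sr_hi = tranche_loss_rate(0.05, 0.5,  attach=0.2, detach=1.0)
```

    With correlation high, many scenarios have no defaults at all (sparing the equity tranche) while a few have mass defaults that reach the senior tranche, reproducing the direction of the effect discussed in the abstract.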

  17. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the past 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and walks through the stepwise process the interface uses by means of an example.

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the past 15 years at NASA Glenn Research Center, can estimate the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or similar codes, together with a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and walks through the stepwise process the interface uses by means of an example.

  19. White-nose syndrome pathology grading in Nearctic and Palearctic bats

    PubMed Central

    Pikula, Jiri; Amelon, Sybill K.; Bandouchova, Hana; Bartonička, Tomáš; Berkova, Hana; Brichta, Jiri; Hooper, Sarah; Kokurewicz, Tomasz; Kolarik, Miroslav; Köllner, Bernd; Kovacova, Veronika; Linhart, Petr; Piacek, Vladimir; Turner, Gregory G.; Zukal, Jan; Martínková, Natália

    2017-01-01

    While white-nose syndrome (WNS) has decimated hibernating bat populations in the Nearctic, species from the Palearctic appear to cope better with the fungal skin infection causing WNS. This has encouraged multiple hypotheses on the mechanisms leading to differential survival of species exposed to the same pathogen. To facilitate intercontinental comparisons, we proposed a novel pathogenesis-based grading scheme consistent with WNS diagnosis histopathology criteria. UV light-guided collection was used to obtain single biopsies from Nearctic and Palearctic bat wing membranes non-lethally. The proposed scheme scores eleven grades associated with WNS on histopathology. Given weights reflective of grade severity, the sum of findings from an individual results in a weighted cumulative WNS pathology score. The probability of finding fungal skin colonisation and single, multiple or confluent cupping erosions increased with increasing Pseudogymnoascus destructans load. Increasing fungal load mimicked progression of skin infection from epidermal surface colonisation to deep dermal invasion. Similarly, the number of UV-fluorescent lesions increased with increasing weighted cumulative WNS pathology score, demonstrating congruence between WNS-associated tissue damage and extent of UV fluorescence. In a case report, we demonstrated that UV-fluorescence disappears within two weeks of euthermy. Change in fluorescence was coupled with a reduction in weighted cumulative WNS pathology score, whereby both methods lost diagnostic utility. While weighted cumulative WNS pathology scores were greater in the Nearctic than the Palearctic, values for Nearctic bats were within the range of those for Palearctic species. Accumulation of wing damage probably influences mortality in affected bats, as demonstrated by a fatal case of Myotis daubentonii with natural WNS infection and healing in Myotis myotis. 
The proposed semi-quantitative pathology score provided good agreement between experienced raters, showing it to be a powerful and widely applicable tool for defining WNS severity. PMID:28767673

  20. White-nose syndrome pathology grading in Nearctic and Palearctic bats.

    PubMed

    Pikula, Jiri; Amelon, Sybill K; Bandouchova, Hana; Bartonička, Tomáš; Berkova, Hana; Brichta, Jiri; Hooper, Sarah; Kokurewicz, Tomasz; Kolarik, Miroslav; Köllner, Bernd; Kovacova, Veronika; Linhart, Petr; Piacek, Vladimir; Turner, Gregory G; Zukal, Jan; Martínková, Natália

    2017-01-01

    While white-nose syndrome (WNS) has decimated hibernating bat populations in the Nearctic, species from the Palearctic appear to cope better with the fungal skin infection causing WNS. This has encouraged multiple hypotheses on the mechanisms leading to differential survival of species exposed to the same pathogen. To facilitate intercontinental comparisons, we proposed a novel pathogenesis-based grading scheme consistent with WNS diagnosis histopathology criteria. UV light-guided collection was used to obtain single biopsies from Nearctic and Palearctic bat wing membranes non-lethally. The proposed scheme scores eleven grades associated with WNS on histopathology. Given weights reflective of grade severity, the sum of findings from an individual results in a weighted cumulative WNS pathology score. The probability of finding fungal skin colonisation and single, multiple or confluent cupping erosions increased with increasing Pseudogymnoascus destructans load. Increasing fungal load mimicked progression of skin infection from epidermal surface colonisation to deep dermal invasion. Similarly, the number of UV-fluorescent lesions increased with increasing weighted cumulative WNS pathology score, demonstrating congruence between WNS-associated tissue damage and extent of UV fluorescence. In a case report, we demonstrated that UV-fluorescence disappears within two weeks of euthermy. Change in fluorescence was coupled with a reduction in weighted cumulative WNS pathology score, whereby both methods lost diagnostic utility. While weighted cumulative WNS pathology scores were greater in the Nearctic than the Palearctic, values for Nearctic bats were within the range of those for Palearctic species. Accumulation of wing damage probably influences mortality in affected bats, as demonstrated by a fatal case of Myotis daubentonii with natural WNS infection and healing in Myotis myotis. 
The proposed semi-quantitative pathology score provided good agreement between experienced raters, showing it to be a powerful and widely applicable tool for defining WNS severity.

  1. The distribution of genome shared identical by descent for a pair of full sibs by means of the continuous time Markov chain

    NASA Astrophysics Data System (ADS)

    Julie, Hongki; Pasaribu, Udjianna S.; Pancoro, Adi

    2015-12-01

    This paper applies continuous-time Markov chains to the genome shared identical by descent (IBD) by a pair of full sibs. The full-sibs model is a continuous-time Markov chain with three states. Within this model, we derive the cumulative distribution function of the number of subsegments carrying 2 IBD haplotypes within a chromosome segment of length t Morgans, and the cumulative distribution function of the number of subsegments carrying at least 1 IBD haplotype within a segment of length t Morgans. These cumulative distribution functions are developed by means of the moment generating function.

  2. Blind channel estimation and deconvolution in colored noise using higher-order cumulants

    NASA Astrophysics Data System (ADS)

    Tugnait, Jitendra K.; Gummadavelli, Uma

    1994-10-01

    Existing approaches to blind channel estimation and deconvolution (equalization) focus exclusively on channel or inverse-channel impulse response estimation. It is well-known that the quality of the deconvolved output depends crucially upon the noise statistics also. Typically it is assumed that the noise is white and the signal-to-noise ratio is known. In this paper we remove these restrictions. Both the channel impulse response and the noise model are estimated from the higher-order (fourth, e.g.) cumulant function and the (second-order) correlation function of the received data via a least-squares cumulant/correlation matching criterion. It is assumed that the noise higher-order cumulant function vanishes (e.g., Gaussian noise, as is the case for digital communications). Consistency of the proposed approach is established under certain mild sufficient conditions. The approach is illustrated via simulation examples involving blind equalization of digital communications signals.

  3. Cumulative distribution functions of attenuation due to rain on a 9.5 km path at 17.8 GHz

    NASA Technical Reports Server (NTRS)

    Fedi, F.; Migliorini, P.

    1981-01-01

    Measurement results of attenuation due to rain are reported. Cumulative distribution functions of the attenuation measured on three links are described. Differences between the distribution functions for different polarizations and frequencies are demonstrated. The possibility of establishing a relationship between the statistics of annual attenuation and worst-month attenuation is explored.

  4. Prospect theory reflects selective allocation of attention.

    PubMed

    Pachur, Thorsten; Schulte-Mecklenbeck, Michael; Murphy, Ryan O; Hertwig, Ralph

    2018-02-01

    There is a disconnect in the literature between analyses of risky choice based on cumulative prospect theory (CPT) and work on predecisional information processing. One likely reason is that for expectation models (e.g., CPT), it is often assumed that people behaved only as if they conducted the computations leading to the predicted choice and that the models are thus mute regarding information processing. We suggest that key psychological constructs in CPT, such as loss aversion and outcome and probability sensitivity, can be interpreted in terms of attention allocation. In two experiments, we tested hypotheses about specific links between CPT parameters and attentional regularities. Experiment 1 used process tracing to monitor participants' predecisional attention allocation to outcome and probability information. As hypothesized, individual differences in CPT's loss-aversion, outcome-sensitivity, and probability-sensitivity parameters (estimated from participants' choices) were systematically associated with individual differences in attention allocation to outcome and probability information. For instance, loss aversion was associated with the relative attention allocated to loss and gain outcomes, and a more strongly curved weighting function was associated with less attention allocated to probabilities. Experiment 2 manipulated participants' attention to losses or gains, causing systematic differences in CPT's loss-aversion parameter. This result indicates that attention allocation can to some extent cause choice regularities that are captured by CPT. Our findings demonstrate an as-if model's capacity to reflect characteristics of information processing. We suggest that the observed CPT-attention links can be harnessed to inform the development of process models of risky choice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
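    The CPT constructs mentioned here, outcome sensitivity, loss aversion, and probability weighting, are commonly parameterized in the Tversky-Kahneman (1992) form; a minimal sketch (the study may estimate a variant):

```python
def weight(p, gamma):
    """Tversky-Kahneman probability-weighting function w(p).
    gamma < 1 gives the inverse-S curve: small probabilities are
    overweighted, moderate-to-large ones underweighted."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def value(x, alpha=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and steeper
    (by the loss-aversion factor lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# A more strongly curved weighting function (smaller gamma) inflates
# a small probability relative to linear weighting:
w_curved = weight(0.05, gamma=0.5)
w_linear = weight(0.05, gamma=1.0)
```

    In the experiments above, a more strongly curved weighting function (smaller gamma) was associated with less attention allocated to probabilities, and a larger lam with more attention to losses.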

  5. Cumulative Stress and Cortisol Disruption among Black and Hispanic Pregnant Women in an Urban Cohort.

    PubMed

    Suglia, Shakira Franco; Staudenmayer, John; Cohen, Sheldon; Enlow, Michelle Bosquet; Rich-Edwards, Janet W; Wright, Rosalind J

    2010-12-01

    While adult hypothalamic-pituitary-adrenocortical (HPA) axis functioning is thought to be altered by traumatic experiences, little data exist on the effects of cumulative stress on HPA functioning among pregnant women or among specific racial and ethnic groups. Individuals may be increasingly vulnerable to physiological alterations when experiencing cumulative effects of multiple stressors. These effects may be particularly relevant in urban poor communities where exposure to multiple stressors is more prevalent. The goal of this study was to explore the effects of multiple social stressors on HPA axis functioning in a sample of urban Black (n = 68) and Hispanic (n = 132) pregnant women enrolled in the Asthma Coalition on Community, Environment, and Social Stress (ACCESS). Pregnant women were administered the Revised Conflict Tactics Scale (R-CTS) survey to assess interpersonal violence, the Experiences of Discrimination (EOD) survey, the Crisis in Family Systems-Revised (CRISYS-R) negative life events survey, and the My Exposure to Violence (ETV) survey, which ascertains exposure to community violence. A cumulative stress measure was derived from these instruments. Salivary cortisol samples were collected five times per day over three days to assess area under the curve (AUC), morning change, and basal awakening response in order to characterize diurnal salivary cortisol patterns. Repeated measures mixed models, stratified by race/ethnicity, were performed adjusting for education level, age, smoking status, body mass index and weeks pregnant at time of cortisol sampling. The majority of Hispanic participants (57%) had low cumulative stress exposure, while the majority of Black participants had intermediate (35%) or high (41%) cumulative stress exposure. Results showed that among Black but not Hispanic women, cumulative stress was associated with lower morning cortisol levels, including a flatter waking to bedtime rhythm. 
These analyses suggest that the combined effects of cumulative stressful experiences are associated with disrupted HPA functioning among pregnant women. While the etiology of racial/ethnic differences in stress-induced HPA alterations is not clear, this warrants further research.

  6. A rapid local singularity analysis algorithm with applications

    NASA Astrophysics Data System (ADS)

    Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits

    2015-04-01

    The local singularity model developed by Cheng is fast gaining popularity for characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, the conventional algorithm, which involves computing moving-average values at multiple scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast data structure used within the Viola-Jones object-detection framework in computer vision. Historically, the principle of the SAT is well-known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variant, the rotated summed area table (RSAT), into isotropic, anisotropic and directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only 4 array accesses, independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, Matlab and C++ are implemented for different applications, especially big-data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear spacing or log spacing) for the non-iterative and iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
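    A minimal sketch of the SAT construction and the constant-time rectangular sum it enables (the 4-array-access formula mentioned in the abstract):

```python
def summed_area_table(img):
    """Build the SAT: sat[i][j] = sum of img[0..i][0..j]."""
    h, w = len(img), len(img[0])
    sat = [[0] * w for _ in range(h)]
    for i in range(h):
        row = 0
        for j in range(w):
            row += img[i][j]
            sat[i][j] = row + (sat[i - 1][j] if i > 0 else 0)
    return sat

def rect_sum(sat, r0, c0, r1, c1):
    """Sum over img[r0..r1][c0..c1] with at most 4 array accesses, O(1)."""
    total = sat[r1][c1]
    if r0 > 0:
        total -= sat[r0 - 1][c1]
    if c0 > 0:
        total -= sat[r1][c0 - 1]
    if r0 > 0 and c0 > 0:
        total += sat[r0 - 1][c0 - 1]
    return total

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
sat = summed_area_table(img)
s = rect_sum(sat, 1, 1, 2, 2)  # sum of the lower-right 2x2 block
```

    Because each window sum costs O(1) regardless of window size, moving averages at every scale come at the same price, which is the speedup exploited for singularity mapping.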

  7. EVALUATING CUMULATIVE EFFECTS OF DISTURBANCE ON THE HYDROLOGIC FUNCTION OF BOGS, FENS, AND MIRES

    EPA Science Inventory

    Few quantitative studies have been done on the hydrology of fens, bogs and mires, and consequently any prediction of the cumulative impacts of disturbances on their hydrologic functions is extremely difficult. For example, few data are available on the role of bogs and fens with ...

  8. Intraseasonal variation in survival and probable causes of mortality in greater sage-grouse Centrocercus urophasianus

    USGS Publications Warehouse

    Blomberg, Erik J.; Gibson, Daniel; Sedinger, James S.; Casazza, Michael L.; Coates, Peter S.

    2013-01-01

    The mortality process is a key component of avian population dynamics, and understanding factors that affect mortality is central to grouse conservation. Populations of greater sage-grouse Centrocercus urophasianus have declined across their range in western North America. We studied cause-specific mortality of radio-marked sage-grouse in Eureka County, Nevada, USA, during two seasons, nesting (2008-2012) and fall (2008-2010), when survival was known to be lower compared to other times of the year. We used known-fate and cumulative incidence function models to estimate weekly survival rates and cumulative risk of cause-specific mortalities, respectively. These methods allowed us to account for temporal variation in sample size and staggered entry of marked individuals into the sample to obtain robust estimates of survival and cause-specific mortality. We monitored 376 individual sage-grouse during the course of our study, and investigated 87 deaths. Predation was the major source of mortality, and accounted for 90% of all mortalities during our study. During the nesting season (1 April - 31 May), the cumulative risk of predation by raptors (0.10; 95% CI: 0.05-0.16) and mammals (0.08; 95% CI: 0.03-0.13) was relatively equal. In the fall (15 August - 31 October), the cumulative risk of mammal predation was greater (M(mam) = 0.12; 95% CI: 0.04-0.19) than either predation by raptors (M(rap) = 0.05; 95% CI: 0.00-0.10) or hunting harvest (M(hunt) = 0.02; 95% CI: 0.00-0.06). During both seasons, we observed relatively few additional sources of mortality (e.g. collision) and observed no evidence of disease-related mortality (e.g. West Nile Virus). In general, we found little evidence for intraseasonal temporal variation in survival, suggesting that the nesting and fall seasons represent biologically meaningful time intervals with respect to sage-grouse survival.
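    A cumulative incidence function for one cause among competing risks can be sketched with a simple nonparametric (Aalen-Johansen-type) estimator; the data below are hypothetical, not the study's:

```python
def cumulative_incidence(times, causes, cause):
    """Nonparametric cumulative incidence for one cause among
    competing risks.  causes: 0 = censored, k > 0 = failure cause."""
    data = sorted(zip(times, causes))
    n = len(data)
    surv = 1.0          # overall event-free probability just before t
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        at_risk = n - i
        d_cause = sum(1 for tt, c in data[i:] if tt == t and c == cause)
        d_all   = sum(1 for tt, c in data[i:] if tt == t and c > 0)
        m       = sum(1 for tt, c in data[i:] if tt == t)
        cif += surv * d_cause / at_risk       # mass failing from this cause at t
        surv *= 1.0 - d_all / at_risk         # all causes deplete the risk set
        out.append((t, cif))
        i += m
    return out

# Hypothetical weeks to death by raptor (1), mammal (2), or censoring (0)
times  = [1, 2, 2, 3, 4, 5, 6, 6, 7, 8]
causes = [1, 2, 0, 1, 0, 2, 1, 0, 0, 0]
cif_raptor = cumulative_incidence(times, causes, cause=1)
```

    Unlike one-minus-Kaplan-Meier applied per cause, this estimator lets the competing causes deplete the risk set, so the cause-specific cumulative risks sum correctly.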

  9. Fitting observed and theoretical choices - women's choices about prenatal diagnosis of Down syndrome.

    PubMed

    Seror, Valerie

    2008-05-01

    Choices regarding prenatal diagnosis of Down syndrome - the most frequent chromosomal defect - are particularly relevant to decision analysis, since women's decisions are based on the assessment of their risk of carrying a child with Down syndrome, and involve tradeoffs (giving birth to an affected child vs procedure-related miscarriage). The aim of this study, based on face-to-face interviews with 78 women aged 25-35 with prior experience of pregnancy, was to compare the women's expressed choices towards prenatal diagnosis with those derived from theoretical models of choice (expected utility theory, rank-dependent theory, and cumulative prospect theory). The main finding obtained in this study was that the cumulative prospect model fitted the observed choices best: both subjective transformation of probabilities and loss aversion, which are basic features of the cumulative prospect model, have to be taken into account to make the observed choices consistent with the theoretical ones.

  10. Probability and surprisal in auditory comprehension of morphologically complex words.

    PubMed

    Balling, Laura Winther; Baayen, R Harald

    2012-10-01

    Two auditory lexical decision experiments document for morphologically complex words two points at which the probability of a target word given the evidence shifts dramatically. The first point is reached when morphologically unrelated competitors are no longer compatible with the evidence. Adapting terminology from Marslen-Wilson (1984), we refer to this as the word's initial uniqueness point (UP1). The second point is the complex uniqueness point (CUP) introduced by Balling and Baayen (2008), at which morphologically related competitors become incompatible with the input. Later initial as well as complex uniqueness points predict longer response latencies. We argue that the effects of these uniqueness points arise due to the large surprisal (Levy, 2008) carried by the phonemes at these uniqueness points, and provide independent evidence that how cumulative surprisal builds up in the course of the word co-determines response latencies. The presence of effects of surprisal, both at the initial uniqueness point of complex words, and cumulatively throughout the word, challenges the Shortlist B model of Norris and McQueen (2008), and suggests that a Bayesian approach to auditory comprehension requires complementation from information theory in order to do justice to the cognitive cost of updating probability distributions over lexical candidates. Copyright © 2012 Elsevier B.V. All rights reserved.
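    Surprisal is the negative log of a conditional probability, and cumulative surprisal is its running sum over the word; a minimal sketch with hypothetical phoneme probabilities:

```python
import math

def cumulative_surprisal(probs):
    """Surprisal of each phoneme is -log2 P(phoneme | context);
    cumulative surprisal is the running sum (in bits)."""
    total, out = 0.0, []
    for p in probs:
        total += -math.log2(p)
        out.append(total)
    return out

# Hypothetical conditional phoneme probabilities for one word:
# probabilities jump (surprisal drops) once competitors are eliminated,
# e.g. after the initial and complex uniqueness points.
probs = [0.2, 0.3, 0.9, 0.95, 0.99]
cum = cumulative_surprisal(probs)
```

    High-surprisal phonemes, those at the uniqueness points where many competitors are suddenly ruled out, contribute most to the running total that co-determines response latencies.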

  11. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  12. Elastic Backbone Defines a New Transition in the Percolation Model

    NASA Astrophysics Data System (ADS)

    Sampaio Filho, Cesar I. N.; Andrade, José S.; Herrmann, Hans J.; Moreira, André A.

    2018-04-01

    The elastic backbone is the set of all shortest paths. We found a new phase transition at p_eb, above the classical percolation threshold, at which the elastic backbone becomes dense. At this transition in 2D, its fractal dimension is 1.750 ± 0.003, and one obtains a novel set of critical exponents β_eb = 0.50 ± 0.02, γ_eb = 1.97 ± 0.05, and ν_eb = 2.00 ± 0.02, fulfilling consistent critical scaling laws. Interestingly, however, the hyperscaling relation is violated. Using Binder's cumulant, we determine, with high precision, the critical probabilities p_eb for the triangular and tilted square lattice for site and bond percolation. This transition describes a sudden rigidification as a function of density when stretching a damaged tissue.
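    Binder's cumulant, used here to locate the critical probabilities, is U = 1 - <m^4> / (3<m^2>^2) for an order parameter m; a minimal sketch of its two limiting values (synthetic samples, not percolation data):

```python
import random

def binder_cumulant(samples):
    """U = 1 - <m^4> / (3 <m^2>^2) for order-parameter samples m."""
    n = len(samples)
    m2 = sum(m ** 2 for m in samples) / n
    m4 = sum(m ** 4 for m in samples) / n
    return 1.0 - m4 / (3.0 * m2 * m2)

random.seed(0)
# Deep in the ordered phase m is sharply peaked at +/- m0, so
# <m^4> = <m^2>^2 and U -> 2/3; for Gaussian-distributed m
# (disordered phase), <m^4> = 3 <m^2>^2 and U -> 0.
ordered  = [random.choice((-1.0, 1.0)) for _ in range(10000)]
gaussian = [random.gauss(0.0, 1.0) for _ in range(10000)]
u_ord = binder_cumulant(ordered)
u_dis = binder_cumulant(gaussian)
```

    Because the limiting values are size-independent, curves of U versus occupation probability for different lattice sizes cross at the critical point, which is what makes the cumulant a high-precision locator of p_eb.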

  13. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  14. Slant path rain attenuation and path diversity statistics obtained through radar modeling of rain structure

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1984-01-01

    Single and joint terminal slant path attenuation statistics at frequencies of 28.56 and 19.04 GHz have been derived, employing a radar data base obtained over a three-year period at Wallops Island, VA. Statistics were independently obtained for path elevation angles of 20, 45, and 90 deg for purposes of examining how elevation angle influences both single-terminal and joint probability distributions. Both diversity gains and the dependence of the autocorrelation function on site spacing and elevation angle were determined employing the radar modeling results. Comparisons with other investigators are presented. An independent path elevation angle prediction technique was developed and demonstrated to fit well with the radar-derived single- and joint-terminal cumulative fade distributions at various elevation angles.

  15. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    NASA Astrophysics Data System (ADS)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce a finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane and generates the peak ground motion. The appropriate ground motion prediction equations (GMPE) can then be applied for PSHA. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid fixed at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. There is currently a trend toward using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.
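    The abstract mentions an algorithm that estimates the area of a circle within a rectangle but does not spell it out; a simple Monte Carlo stand-in for that geometric step (not the authors' algorithm) looks like this:

```python
import numpy as np

def circle_in_rect_area(cx, cy, r, x0, x1, y0, y1, n=200_000, seed=0):
    """Monte Carlo estimate of the area of the circle of radius r centred
    at (cx, cy) that falls inside the rectangle [x0, x1] x [y0, y1].
    Points are sampled uniformly in the disk (radius r*sqrt(u) is the
    standard trick for uniformity); the hit fraction then scales pi*r^2."""
    rng = np.random.default_rng(seed)
    rad = r * np.sqrt(rng.uniform(size=n))
    ang = rng.uniform(0.0, 2.0 * np.pi, size=n)
    px, py = cx + rad * np.cos(ang), cy + rad * np.sin(ang)
    inside = (px >= x0) & (px <= x1) & (py >= y0) & (py <= y1)
    return np.pi * r * r * inside.mean()

a_full = circle_in_rect_area(0, 0, 1, -2, 2, -2, 2)  # circle fully contained
a_half = circle_in_rect_area(0, 0, 1, 0, 2, -2, 2)   # circle bisected by an edge
```

    Repeating this for growing radii r yields the cumulative distribution of distance over the effective area, which is the role this step plays in the FFDD computation.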

  16. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  17. Microdose Induced Drain Leakage Effects in Power Trench MOSFETs: Experiment and Modeling

    NASA Astrophysics Data System (ADS)

    Zebrev, Gennady I.; Vatuev, Alexander S.; Useinov, Rustem G.; Emeliyanov, Vladimir V.; Anashin, Vasily S.; Gorbunov, Maxim S.; Turin, Valentin O.; Yesenkov, Kirill A.

    2014-08-01

    We study experimentally and theoretically the microdose-induced drain-source leakage current in trench power MOSFETs under irradiation with high-LET heavy ions. We found experimentally that the cumulative increase of leakage current occurs by means of stochastic spikes, each corresponding to the strike of a single heavy ion into the MOSFET gate oxide. We simulate this effect with a proposed analytic model that describes (including by Monte Carlo methods) both the deterministic (cumulative dose) and stochastic (single event) aspects of the problem. Based on this model, an assessment of survival probability in a space heavy-ion environment with high LETs is proposed.

  18. Effects of biomass smoke on pulmonary functions: a case control study.

    PubMed

    Balcan, Baran; Akan, Selcuk; Ugurlu, Aylin Ozsancak; Handemir, Bahar Ozcelik; Ceyhan, Berrin Bagcı; Ozkaya, Sevket

    2016-01-01

    Biomass smoke is the leading cause of COPD in developing countries such as Turkey. In rural areas of Turkey, females are more exposed to biomass smoke because of traditional lifestyles. The aim of this study was to determine the adverse effects of biomass smoke on pulmonary functions and define the relationship between duration in years and an index (cumulative exposure index) with altered pulmonary function test results. A total of 115 females who lived in the village of Kağizman (a borough of Kars located in the eastern part of Turkey) and were exposed to biomass smoke were included in the study. The control group was generated with 73 individuals living in the same area who were never exposed to biomass smoke. Twenty-seven (23.8%) females in the study group and four (5.5%) in the control group had small airway disease (P=0.038). Twenty-two (19.1%) females in the study group and ten (13.7%) in the control group had obstruction (P=0.223). Twenty (17.3%) females in the study group who were exposed to biomass smoke had restriction compared with ten (13%) in the control group (P=0.189). The duration needed for the existence of small airway disease was 16 years, for obstructive airway disease was 17 years, and for restrictive airway disease was 17 years. The intensity of biomass smoke was defined in terms of cumulative exposure index; it was calculated by multiplying hours per day, weeks per month, and total years of smoke exposure and dividing the result by three. Exposure to biomass smoke is a serious public health problem, especially in rural areas of developing countries, because of its negative effects on pulmonary functions. As the duration and the intensity of exposure increase, the probability of having altered pulmonary function test results is higher.

  19. Cumulants of heat transfer across nonlinear quantum systems

    NASA Astrophysics Data System (ADS)

    Li, Huanan; Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2013-12-01

    We consider thermal conduction across a general nonlinear phononic junction. Based on a two-time observation protocol and the nonequilibrium Green's function method, heat transfer in steady-state regimes is studied, and practical formulas for the calculation of the cumulant generating function are obtained. As an application, the general formalism is used to study anharmonic effects on the fluctuation of steady-state heat transfer across a single-site junction with a quartic nonlinear on-site pinning potential. An explicit nonlinear modification to the cumulant generating function, exact up to first order, is given, in which the Gallavotti-Cohen fluctuation symmetry is found to remain valid. Numerically, a self-consistent procedure is introduced, which works well for strong nonlinearity.
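    For readers unfamiliar with full counting statistics, the cumulants referred to here follow the standard two-time-measurement definitions (textbook conventions, not formulas taken from this paper): with counting field ξ,

```latex
% \mathcal{Z}(\xi) is the characteristic function of the heat Q transferred
% in time t; the n-th cumulant of Q is the n-th derivative of its logarithm.
\[
  \ln \mathcal{Z}(\xi) = \ln \left\langle e^{i \xi Q} \right\rangle ,
  \qquad
  \langle\langle Q^{n} \rangle\rangle
    = \left. \frac{\partial^{n} \ln \mathcal{Z}(\xi)}{\partial (i\xi)^{n}} \right|_{\xi = 0} .
\]
```

    In this language, the Gallavotti-Cohen fluctuation symmetry is the statement that \(\mathcal{Z}(\xi) = \mathcal{Z}(-\xi + i(\beta_R - \beta_L))\) for bath inverse temperatures β_L and β_R, which constrains the full distribution of transferred heat, not just its mean.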

  20. Predicting Recovery from Episodes of Major Depression

    PubMed Central

    Solomon, David A.; Leon, Andrew C.; Coryell, William; Mueller, Timothy I.; Posternak, Michael; Endicott, Jean; Keller, Martin B.

    2008-01-01

    Background This study examined psychosocial functioning as a predictor of recovery from episodes of unipolar major depression. Methods 231 subjects diagnosed with major depressive disorder according to Research Diagnostic Criteria were prospectively followed for up to 20 years as part of the NIMH Collaborative Depression Study. The association between psychosocial functioning and recovery from episodes of unipolar major depression was analyzed with a mixed-effects logistic regression model which controlled for cumulative morbidity, defined as the amount of time ill with major depression during prospective follow-up. Recovery was defined as at least eight consecutive weeks with either no symptoms of major depression, or only one or two symptoms at a mild level of severity. Results In the mixed-effects model, a one standard deviation increase in psychosocial impairment was significantly associated with a 22% decrease in the likelihood of subsequent recovery from an episode of major depression (OR = 0.78, 95% CI: 0.74–0.82, Z = −3.17, p < 0.002). Also, a one standard deviation increase in cumulative morbidity was significantly associated with a 61% decrease in the probability of recovery (OR = 0.3899, 95% CI: 0.3894–0.3903, Z = −7.21, p < 0.001). Limitations The generalizability of the study is limited in so far as subjects were recruited as they sought treatment at academic medical centers. The analyses examined the relationship between psychosocial functioning and recovery from major depression, and did not include episodes of minor depression. Furthermore, this was an observational study and the investigators did not control treatment. Conclusions Assessment of psychosocial impairment may help identify patients less likely to recover from an episode of major depression. PMID:17920692

  1. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  2. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  3. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  4. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  5. 33 CFR 325.3 - Public notice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) The comment period based on § 325.2(d)(2); (15) A statement that any person may request, in writing... various evaluation factors on which decisions are based shall be included in every public notice. (1... whether to issue a permit will be based on an evaluation of the probable impact including cumulative...

  6. The Priority Heuristic: Making Choices Without Trade-Offs

    PubMed Central

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2010-01-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, we generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (i) Allais' paradox, (ii) risk aversion for gains if probabilities are high, (iii) risk seeking for gains if probabilities are low (lottery tickets), (iv) risk aversion for losses if probabilities are low (buying insurance), (v) risk seeking for losses if probabilities are high, (vi) certainty effect, (vii) possibility effect, and (viii) intransitivities. We test how accurately the heuristic predicts people's choices, compared to previously proposed heuristics and three modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. PMID:16637767

  7. Calculation of the Poisson cumulative distribution function

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Nolty, Robert G.; Scheuer, Ernest M.

    1990-01-01

    A method for calculating the Poisson cdf (cumulative distribution function) is presented. The method avoids computer underflow and overflow during the process. The computer program uses this technique to calculate the Poisson cdf for arbitrary inputs. An algorithm that determines the Poisson parameter required to yield a specified value of the cdf is presented.
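    The underflow/overflow problem is easy to reproduce: for large λ, exp(−λ) underflows and λ^k overflows in double precision, even though each term of the CDF sum is moderate. A minimal log-space sketch of the idea (not the paper's actual algorithm):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), with each term kept in log space.

    log p(i) = i*log(lam) - lam - lgamma(i+1) stays finite even when
    exp(-lam) would underflow or lam**i would overflow, which is the
    failure mode the log-space formulation avoids.
    """
    if k < 0:
        return 0.0
    log_terms = [i * math.log(lam) - lam - math.lgamma(i + 1)
                 for i in range(int(k) + 1)]
    m = max(log_terms)  # log-sum-exp trick for a stable sum
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)
```

    The inverse problem the abstract mentions (finding the Poisson parameter that yields a given CDF value) can then be solved by bisection on λ, since the CDF is monotone decreasing in λ for fixed k.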

  8. Initial high anti-emetic efficacy of granisetron with dexamethasone is not maintained over repeated cycles.

    PubMed Central

    de Wit, R.; van den Berg, H.; Burghouts, J.; Nortier, J.; Slee, P.; Rodenburg, C.; Keizer, J.; Fonteyn, M.; Verweij, J.; Wils, J.

    1998-01-01

    We have reported previously that the anti-emetic efficacy of single agent 5HT3 antagonists is not maintained when analysed with the measurement of cumulative probabilities. Presently, the most effective anti-emetic regimen is a combination of a 5HT3 antagonist plus dexamethasone. We, therefore, assessed the sustainment of efficacy of such a combination in 125 patients, scheduled to receive cisplatin > or = 70 mg m(-2) either alone or in combination with other cytotoxic drugs. Anti-emetic therapy was initiated with 10 mg of dexamethasone and 3 mg of granisetron intravenously, before cisplatin. On days 1-6, patients received 8 mg of dexamethasone and 1 mg of granisetron twice daily by oral administration. Protection was assessed during all cycles and calculated based on cumulative probability analyses using the method of Kaplan-Meier and a model for transitional probabilities. Irrespective of the type of analysis used, the anti-emetic efficacy of granisetron/dexamethasone decreased over cycles. The initial complete acute emesis protection rate of 66% decreased to 30% according to the method of Kaplan-Meier and to 39% using the model for transitional probabilities. For delayed emesis, the initial complete protection rate of 52% decreased to 21% (Kaplan-Meier) and to 43% (transitional probabilities). In addition, we observed that protection failure in the delayed emesis period adversely influenced the acute emesis protection in the next cycle. We conclude that the anti-emetic efficacy of a 5HT3 antagonist plus dexamethasone is not maintained over multiple cycles of highly emetogenic chemotherapy, and that the acute emesis protection is adversely influenced by protection failure in the delayed emesis phase. PMID:9652766
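    The Kaplan-Meier calculation of cumulative protection probability over repeated cycles reduces to a product of per-cycle survival fractions; the patient counts below are invented for illustration, not the trial's data:

```python
def km_protection(cycles):
    """Kaplan-Meier estimate of the probability of remaining protected
    through each successive chemotherapy cycle.

    cycles: list of (n_at_risk, n_failed) pairs, one per cycle, where a
    "failure" is loss of complete emesis protection in that cycle.
    """
    surv, curve = 1.0, []
    for n_at_risk, n_failed in cycles:
        surv *= (n_at_risk - n_failed) / n_at_risk  # P(protected this cycle)
        curve.append(surv)
    return curve

# Illustrative numbers only: 100 patients with attrition over three cycles.
curve = km_protection([(100, 34), (60, 15), (40, 10)])
```

    Because each factor is at most 1, the cumulative curve can only decline, which is why per-cycle protection rates that look stable can still translate into a steadily falling cumulative probability of sustained protection.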

  9. Prediction of the 10-year probability of gastric cancer occurrence in the Japanese population: the JPHC study cohort II.

    PubMed

    Charvat, Hadrien; Sasazuki, Shizuka; Inoue, Manami; Iwasaki, Motoki; Sawada, Norie; Shimazu, Taichi; Yamaji, Taiki; Tsugane, Shoichiro

    2016-01-15

    Gastric cancer is a particularly important issue in Japan, where incidence rates are among the highest observed. In this work, we provide a risk prediction model allowing the estimation of the 10-year cumulative probability of gastric cancer occurrence. The study population consisted of 19,028 individuals from the Japanese Public Health Center cohort II who were followed up from 1993 to 2009. A parametric survival model was used to assess the impact on the probability of gastric cancer of clinical and lifestyle-related risk factors in combination with serum anti-Helicobacter pylori antibody titres and pepsinogen I and pepsinogen II levels. Based on the resulting model, cumulative probability estimates were calculated and a simple risk scoring system was developed. A total of 412 cases of gastric cancer occurred during 270,854 person-years of follow-up. The final model included (besides the biological markers) age, gender, smoking status, family history of gastric cancer and consumption of highly salted food. The developed prediction model showed good predictive performance in terms of discrimination (optimism-corrected c-index: 0.768) and calibration (Nam and d'Agostino's χ(2) test: 14.78; p value = 0.06). Estimates of the 10-year probability of gastric cancer occurrence ranged from 0.04% (0.02, 0.1) to 14.87% (8.96, 24.14) for men and from 0.03% (0.02, 0.07) to 4.91% (2.71, 8.81) for women. In conclusion, we developed a risk prediction model for gastric cancer that combines clinical and biological markers. It might prompt individuals to modify their lifestyle habits, attend regular check-up visits or participate in screening programmes. © 2015 UICC.
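    A parametric survival model turns a risk profile into a 10-year cumulative probability via 1 − S(10). A hedged sketch with a Weibull baseline hazard and invented parameters (the study's fitted coefficients are not reproduced here):

```python
import math

def ten_year_risk(linear_predictor, shape=1.2, base_scale=0.002):
    """10-year cumulative probability under a hypothetical Weibull
    proportional-hazards model: H(t) = base_scale * t**shape * exp(lp),
    so P(event by t=10) = 1 - exp(-H(10)).

    shape and base_scale are illustrative values, not the fitted ones;
    linear_predictor would be the sum of coefficient * risk-factor terms
    (age, smoking, H. pylori serology, pepsinogen levels, ...).
    """
    cum_hazard = base_scale * 10.0 ** shape * math.exp(linear_predictor)
    return 1.0 - math.exp(-cum_hazard)

low = ten_year_risk(0.0)   # low-risk profile
high = ten_year_risk(3.0)  # high-risk profile (larger linear predictor)
```

    A simple risk scoring system of the kind the abstract describes is obtained by binning the linear predictor into integer points and tabulating 1 − S(10) per total score.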

  10. Do patients with intake of drugs labelled as sleep disturbing really sleep worse? A population based assessment from the Heinz Nixdorf Recall Study

    PubMed Central

    Kowall, Bernd; Kuß, Oliver; Schmidt‐Pokrzywniak, Andrea; Weinreich, Gerhard; Dragano, Nico; Moebus, Susanne; Erbel, Raimund; Jöckel, Karl‐Heinz; Stang, Andreas

    2016-01-01

    Aim The sleep disturbing effect of many drugs is derived from clinical trials with highly selected patient collectives. However, the generalizability of such findings to the general population is questionable. Our aim was to assess the association between intake of drugs labelled as sleep disturbing and self‐reported nocturnal sleep disturbances in a population‐based study. Methods We used data of 4221 participants (50.0% male) aged 45 to 75 years from the baseline examination of the Heinz Nixdorf Recall Study in Germany. The interview provided information on difficulties falling asleep, difficulties maintaining sleep and early morning arousal. We used the summary of product characteristics (SPC) for each drug taken and assigned the probability of sleep disturbances. Thereafter, we calculated cumulative probabilities of sleep disturbances per subject to account for polypharmacy. We estimated prevalence ratios (PR) using log Poisson regression models with robust variance. Results The adjusted PRs of any regular nocturnal sleep disorder per additional sleep disturbing drug were 1.01 (95% confidence interval (CI) 0.97, 1.06) and 1.03 (95% CI 1.00, 1.07) for men and women, respectively. Estimates for each regular nocturnal sleep disturbance were similarly close to 1. PRs for regular nocturnal sleep disturbances did not increase with rising cumulative probability for drug‐related sleep disturbances. Conclusions SPC‐based probabilities of drug‐related sleep disturbances showed barely any association with self‐reported regular nocturnal sleep disturbances. We conclude that SPC‐based probability information may lack generalizability to the general population or may be of limited data quality. PMID:27279554
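    The paper does not spell out how per-drug probabilities were combined into a per-subject cumulative probability; one common way, under an independence assumption (an assumption of this sketch, not necessarily the authors' method), is:

```python
def cumulative_disturbance_prob(per_drug_probs):
    """P(at least one drug disturbs sleep) = 1 - prod(1 - p_i),
    assuming the drugs' sleep-disturbing effects are independent.

    per_drug_probs: SPC-derived probability of sleep disturbance for
    each drug the subject takes (accounts for polypharmacy).
    """
    p_no_disturbance = 1.0
    for p in per_drug_probs:
        p_no_disturbance *= 1.0 - p
    return 1.0 - p_no_disturbance

p = cumulative_disturbance_prob([0.10, 0.50])  # two-drug example
```

    Whatever the exact combination rule, the resulting per-subject probability is the exposure variable whose association with self-reported sleep disturbances the Poisson regression models then estimate.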

  11. Transformation to equivalent dimensions—a new methodology to study earthquake clustering

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw

    2014-05-01

    A seismic event is represented by a point in a parameter space, quantified by the vector of parameter values. Studies of earthquake clustering involve considering distances between such points in multidimensional spaces. However, the metrics of earthquake parameters are different, hence the metric in a multidimensional parameter space cannot be readily defined. The present paper proposes a solution of this metric problem based on a concept of probabilistic equivalence of earthquake parameters. Under this concept the lengths of parameter intervals are equivalent if the probability for earthquakes to take values from either interval is the same. Earthquake clustering is studied in an equivalent rather than the original dimensions space, where the equivalent dimension (ED) of a parameter is its cumulative distribution function. All transformed parameters are of linear scale in [0, 1] interval and the distance between earthquakes represented by vectors in any ED space is Euclidean. The unknown, in general, cumulative distributions of earthquake parameters are estimated from earthquake catalogues by means of the model-free non-parametric kernel estimation method. Potential of the transformation to EDs is illustrated by two examples of use: to find hierarchically closest neighbours in time-space and to assess temporal variations of earthquake clustering in a specific 4-D phase space.
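    The transformation to equivalent dimensions is the probability-integral transform applied parameter by parameter. A minimal sketch using a plain empirical CDF (the paper uses a non-parametric kernel CDF estimate instead, and the toy catalogue values here are invented):

```python
import numpy as np

def to_equivalent_dimension(values):
    """Map each parameter value to its empirical CDF value in (0, 1).

    Double argsort gives each value's rank; (rank + 1) / (n + 1) is the
    empirical CDF evaluated at that value, i.e. its equivalent dimension.
    """
    v = np.asarray(values, dtype=float)
    ranks = np.argsort(np.argsort(v))
    return (ranks + 1) / (len(v) + 1)

# Transform magnitude and origin-time columns of a toy catalogue, then
# measure Euclidean distances between events in the resulting ED space.
mags = np.array([2.1, 3.4, 2.8, 4.0])
times = np.array([10.0, 55.0, 30.0, 80.0])
ed = np.column_stack([to_equivalent_dimension(mags),
                      to_equivalent_dimension(times)])
d01 = np.linalg.norm(ed[0] - ed[1])  # distance between events 0 and 1
```

    Because every transformed coordinate lives on a linear [0, 1] scale, the Euclidean distance is meaningful regardless of the original units (magnitude, seconds, kilometres), which is exactly the metric problem the transformation solves.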

  12. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  13. Flow Equation Approach to the Statistics of Nonlinear Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Marston, J. B.; Hastings, M. B.

    2005-03-01

    The probability distribution function of non-linear dynamical systems is governed by a linear framework that resembles quantum many-body theory, in which stochastic forcing and/or averaging over initial conditions play the role of a non-zero ℏ. Besides the well-known Fokker-Planck approach, there is a related Hopf functional method [Uriel Frisch, Turbulence: The Legacy of A. N. Kolmogorov (Cambridge University Press, 1995), ch. 9.5]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we investigate the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)] (also known as the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)]), suitably generalized to the diagonalization of non-Hermitian matrices. Comparison to the more traditional cumulant expansion method is illustrated with low-dimensional attractors. The treatment of high-dimensional dynamical systems is also discussed.

  14. The Lambert Way to Gaussianize Heavy-Tailed Data with the Inverse of Tukey's h Transformation as a Special Case

    PubMed Central

    Goerg, Georg M.

    2015-01-01

    I present a parametric, bijective transformation to generate heavy-tailed versions of arbitrary random variables. The tail behavior of this heavy-tailed Lambert W × F_X random variable depends on a tail parameter δ ≥ 0: for δ = 0, Y ≡ X; for δ > 0, Y has heavier tails than X. For X Gaussian, it reduces to Tukey's h distribution. The Lambert W function provides an explicit inverse transformation, which can thus remove heavy tails from observed data. It also provides closed-form expressions for the cumulative distribution function (cdf) and probability density function (pdf). As a special case, these yield analytic expressions for Tukey's h pdf and cdf. Parameters can be estimated by maximum likelihood, and applications to S&P 500 log-returns demonstrate the usefulness of the presented methodology. The R package LambertW implements most of the introduced methodology and is publicly available on CRAN. PMID:26380372
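    For zero-mean, unit-variance X, the heavy-tail construction is y = x·exp(δx²/2), and the Lambert W function inverts it in closed form. A sketch of the round trip (using a small Newton solver for W rather than the R package; the forward map and inverse follow the standard Lambert W × F construction):

```python
import numpy as np

def lambert_w0(z, iters=50):
    """Principal branch of the Lambert W function for z >= 0, via Newton
    iteration on w * exp(w) = z (written out to avoid a SciPy dependency)."""
    z = np.asarray(z, dtype=float)
    w = np.log1p(z)  # starting guess, >= W(z) on [0, inf)
    for _ in range(iters):
        ew = np.exp(w)
        w = w - (w * ew - z) / (ew * (w + 1.0))
    return w

def gaussianize(y, delta):
    """Invert y = x * exp(delta/2 * x**2) for delta > 0, i.e. remove the
    heavy tail: x = sign(y) * sqrt(W(delta * y**2) / delta)."""
    y = np.asarray(y, dtype=float)
    return np.sign(y) * np.sqrt(lambert_w0(delta * y**2) / delta)

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
delta = 0.2
y = x * np.exp(0.5 * delta * x**2)  # forward: make the tails heavier
x_back = gaussianize(y, delta)      # inverse recovers the Gaussian input
```

    In practice δ is unknown and is estimated by maximum likelihood from the observed heavy-tailed data before applying the inverse map.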

  15. Analysis of the cochlear microphonic to a low-frequency tone embedded in filtered noise

    PubMed Central

    Chertoff, Mark E.; Earl, Brian R.; Diaz, Francisco J.; Sorensen, Janna L.

    2012-01-01

    The cochlear microphonic was recorded in response to a 733 Hz tone embedded in noise that was high-pass filtered at 25 different frequencies. The amplitude of the cochlear microphonic increased as the high-pass cutoff frequency of the noise increased. The amplitude growth for a 60 dB SPL tone was steeper and saturated sooner than that of an 80 dB SPL tone. The growth for both signal levels, however, was not entirely cumulative with plateaus occurring at about 4 and 7 mm from the apex. A phenomenological model of the electrical potential in the cochlea that included a hair cell probability function and spiral geometry of the cochlea could account for both the slope of the growth functions and the plateau regions. This suggests that with high-pass-filtered noise, the cochlear microphonic recorded at the round window comes from the electric field generated at the source directed towards the electrode and not down the longitudinal axis of the cochlea. PMID:23145616

  16. Probabilistic Modeling of High-Temperature Material Properties of a 5-Harness 0/90 Sylramic Fiber/ CVI-SiC/ MI-SiC Woven Composite

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh

    1998-01-01

    An integrated probabilistic approach has been developed to assess composites for high temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. This approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive variable sensitivities of the response. Results indicate that the scatters in response variables were reduced by 30-50% when the uncertainties in the primitive variables that showed the most influence were reduced by 50%.

  17. Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems

    PubMed Central

    Venturi, D.; Karniadakis, G. E.

    2014-01-01

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519

  18. Moment Analysis Characterizing Water Flow in Repellent Soils from On- and Sub-Surface Point Sources

    NASA Astrophysics Data System (ADS)

    Xiong, Yunwu; Furman, Alex; Wallach, Rony

    2010-05-01

    Water repellency has a significant impact on water flow patterns in the soil profile. Flow tends to become unstable in such soils, which affects the water availability to plants and subsurface hydrology. In this paper, water flow in repellent soils was experimentally studied using the light reflection method. The transient 2D moisture profiles were monitored by CCD camera for tested soils packed in a transparent flow chamber. Water infiltration experiments and subsequent redistribution from on-surface and subsurface point sources with different flow rates were conducted for two soils of different repellency degrees as well as for wettable soil. We used spatio-statistical analysis (moments) to characterize the flow patterns. The zeroth moment is related to the total volume of water inside the moisture plume, and the first and second moments correspond to the center of mass and spatial variances of the moisture plume, respectively. The experimental results demonstrate that both the general shape and size of the wetting plume and the moisture distribution within the plume for the repellent soils are significantly different from those for the wettable soil. The wetting plume of the repellent soils is smaller, narrower, and longer (finger-like) than that of the wettable soil, which tended to roundness. Compared to the wettable soil, where the soil water content decreases radially from the source, moisture content for the water-repellent soils is higher, relatively uniform horizontally, and gradually increases with depth (saturation overshoot), indicating that flow tends to become unstable. Ellipses, defined around the mass center and whose semi-axes represent a particular number of spatial variances, were successfully used to simulate the spatial and temporal variation of the moisture distribution in the soil profiles. Cumulative probability functions were defined for the water enclosed in these ellipses. Practically identical cumulative probability functions (beta distribution) were obtained for all soils, all source types, and flow rates. Further, the same distributions were obtained for the infiltration and redistribution processes. This attractive result demonstrates the competence and advantage of the moment analysis method.
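    On gridded moisture data, the moment analysis described above reduces to weighted sums. A minimal sketch with a synthetic Gaussian plume standing in for the CCD-derived field:

```python
import numpy as np

def plume_moments(theta, x, y):
    """Spatial moments of a 2D moisture field theta[i, j] at (x[i], y[j]):
    zeroth moment = total water, first = centre of mass, second = spatial
    variances about the centre (first and second normalised by the zeroth)."""
    X, Y = np.meshgrid(x, y, indexing="ij")
    m0 = theta.sum()
    xc, yc = (theta * X).sum() / m0, (theta * Y).sum() / m0
    var_x = (theta * (X - xc) ** 2).sum() / m0
    var_y = (theta * (Y - yc) ** 2).sum() / m0
    return m0, (xc, yc), (var_x, var_y)

# Synthetic round plume: Gaussian blob centred at (0.2, 0.0), variance 0.01.
x = y = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(x, y, indexing="ij")
theta = np.exp(-((X - 0.2) ** 2 + Y**2) / 0.02)
m0, (xc, yc), (var_x, var_y) = plume_moments(theta, x, y)
```

    A finger-like plume of the kind seen in the repellent soils would show up here as var_y (depth) growing much faster than var_x (width), which is how the moment description separates unstable from stable flow patterns.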

  19. Statistics of Advective Stretching in Three-dimensional Incompressible Flows

    NASA Astrophysics Data System (ADS)

    Subramanian, Natarajan; Kellogg, Louise H.; Turcotte, Donald L.

    2009-09-01

    We present a method to quantify kinematic stretching in incompressible, unsteady, isoviscous, three-dimensional flows. We extend the method of Kellogg and Turcotte (J. Geophys. Res. 95:421-432, 1990) to compute the axial stretching/thinning experienced by infinitesimal ellipsoidal strain markers in arbitrary three-dimensional incompressible flows, and discuss the differences between our method and the computation of the finite-time Lyapunov exponent (FTLE). We use the cellular flow model developed in Solomon and Mezic (Nature 425:376-380, 2003) to study the statistics of stretching in a three-dimensional unsteady cellular flow. We find that the probability density function of the logarithm of normalised cumulative stretching (log S) for a globally chaotic flow, with spatially heterogeneous stretching behavior, is not Gaussian and that the coefficient of variation of the distribution does not decrease with time as t^{-1/2}. However, stretching becomes exponential (log S ~ t) and the probability density function of log S becomes Gaussian when the time dependence of the flow and its three-dimensionality are increased to make the stretching behaviour of the flow more spatially uniform. We term these behaviors weak and strong chaotic mixing, respectively. We find that for strong chaotic mixing, the coefficient of variation of the Gaussian distribution decreases with time as t^{-1/2}. This behavior is consistent with a random multiplicative stretching process.
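The t^{-1/2} decay for strongly chaotic mixing can be checked with a toy random multiplicative stretching process. This is a hedged illustration, not the paper's cellular-flow computation; the increment distribution below is an arbitrary choice:

```python
# Toy model: log S is a sum of i.i.d. positive log-stretch increments, so the
# coefficient of variation of log S over an ensemble decays like t**-0.5.
import random
import statistics

def cv_of_log_stretch(n_steps, n_markers, rng):
    """Coefficient of variation of log S over an ensemble of strain markers."""
    log_s = [sum(rng.lognormvariate(0.0, 0.5) for _ in range(n_steps))
             for _ in range(n_markers)]
    return statistics.stdev(log_s) / statistics.mean(log_s)

rng = random.Random(0)
cv_100 = cv_of_log_stretch(100, 500, rng)
cv_400 = cv_of_log_stretch(400, 500, rng)
# quadrupling the number of steps should roughly halve the coefficient of
# variation, consistent with the t**-0.5 decay for strong chaotic mixing
```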

  20. Evaluating Oilseed Biofuel Production Feasibility in California’s San Joaquin Valley Using Geophysical and Remote Sensing Techniques

    PubMed Central

    Corwin, Dennis L.; Yemoto, Kevin; Clary, Wes; Banuelos, Gary; Skaggs, Todd H.; Lesch, Scott M.

    2017-01-01

    Though more costly than petroleum-based fuels and a minor component of overall military fuel sources, biofuels are nonetheless strategically valuable to the military because of an intentional reliance on multiple, reliable, secure fuel sources. A significant reduction in oilseed biofuel cost occurs when the crop is grown on the marginally productive saline-sodic soils plentiful in California’s San Joaquin Valley (SJV). The objective is to evaluate the feasibility of oilseed production on marginal soils in the SJV to support a 115 ML yr−1 biofuel conversion facility. The feasibility evaluation involves: (1) development of an Ida Gold mustard oilseed yield model for marginal soils; (2) identification of marginally productive soils; (3) development of a spatial database of edaphic factors influencing oilseed yield; and (4) performance of Monte Carlo simulations showing potential biofuel production on marginally productive SJV soils. The model indicates oilseed yield is related to boron, salinity, leaching fraction, and water content at field capacity. Monte Carlo simulations for the entire SJV fit a shifted gamma probability density function: Q = 68.986 + gamma(6.134, 5.285), where Q is biofuel production in ML yr−1. The shifted gamma cumulative distribution function indicates a 0.15–0.17 probability of meeting the target biofuel-production level of 115 ML yr−1, making adequate biofuel production unlikely. PMID:29036925
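The reported exceedance probability can be roughly reproduced from the quoted fit alone. This is a sketch assuming the abstract's shifted-gamma parameters; the incomplete-gamma series is a generic textbook expansion, not the authors' code:

```python
import math

def reg_lower_gamma(k, x, tol=1e-12):
    """Regularized lower incomplete gamma P(k, x) via its power series."""
    term = 1.0 / k
    s = term
    n = 1
    while term > tol * s:
        term *= x / (k + n)
        s += term
        n += 1
    return s * math.exp(k * math.log(x) - x - math.lgamma(k))

# Q = 68.986 + Gamma(shape=6.134, scale=5.285), target 115 ML/yr
shift, shape, scale, target = 68.986, 6.134, 5.285, 115.0
p_meet = 1.0 - reg_lower_gamma(shape, (target - shift) / scale)
# p_meet should land near the reported 0.15-0.17 range
```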

  1. Serial and Parallel Attentive Visual Searches: Evidence from Cumulative Distribution Functions of Response Times

    ERIC Educational Resources Information Center

    Sung, Kyongje

    2008-01-01

    Participants searched a visual display for a target among distractors. Each of 3 experiments tested a condition proposed to require attention and for which certain models propose a serial search. Serial versus parallel processing was tested by examining effects on response time means and cumulative distribution functions. In 2 conditions, the…

  2. Model-Free CUSUM Methods for Person Fit

    ERIC Educational Resources Information Center

    Armstrong, Ronald D.; Shi, Min

    2009-01-01

    This article demonstrates the use of a new class of model-free cumulative sum (CUSUM) statistics to detect person fit given the responses to a linear test. The fundamental statistic being accumulated is the likelihood ratio of two probabilities. The detection performance of this CUSUM scheme is compared to other model-free person-fit statistics…
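The accumulation idea is the standard one-sided CUSUM; a generic sketch follows (the article's model-free person-fit variant differs in how the two probabilities are obtained, so this is only the shared skeleton):

```python
def cusum_first_signal(log_lrs, threshold):
    """One-sided CUSUM over log-likelihood ratios: accumulate evidence of
    misfit, reset at zero, and report the first index crossing the threshold
    (None if it is never crossed)."""
    c = 0.0
    for step, llr in enumerate(log_lrs):
        c = max(0.0, c + llr)
        if c >= threshold:
            return step
    return None
```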

  3. An Attachment Theory Approach to Narrating the Faith Journey of Children of Parental Divorce

    ERIC Educational Resources Information Center

    Kiesling, Chris

    2011-01-01

    This study explores the effects of parental divorce on a child's faith. Drawing from attachment theory, Granqvist and Kirkpatrick proposed two probable developmental pathways to religion. For those with secure attachment, whose cumulative experiences of sensitive, religious caregivers enhance the development of a God image as loving; belief…

  4. Early physiological markers of cardiovascular risk in community based adolescents with a depressive disorder.

    PubMed

    Waloszek, Joanna M; Byrne, Michelle L; Woods, Michael J; Nicholas, Christian L; Bei, Bei; Murray, Greg; Raniti, Monika; Allen, Nicholas B; Trinder, John

    2015-04-01

    Depression is recognised as an independent cardiovascular risk factor in adults. Identifying this relationship early on in life is potentially important for the prevention of cardiovascular disease (CVD). This study investigated whether clinical depression is associated with multiple physiological markers of CVD risk in adolescents from the general community. Participants aged 12-18 years were recruited from the general community and screened for depressive symptoms. Individuals with high and low depressive symptoms were administered a diagnostic interview. Fifty participants, 25 with a current depressive episode and 25 matched healthy controls, subsequently completed cardiovascular assessments. Variables assessed were automatic brachial and continuous beat-to-beat finger arterial blood pressure, heart rate, vascular functioning by pulse amplitude tonometry following reactive hyperaemia and pulse transit time (PTT) at rest. Blood samples were collected to measure cholesterol, glucose and glycohaemoglobin levels and an index of cumulative risk of traditional cardiovascular risk factors was calculated. Depressed adolescents had a significantly lower reactive hyperaemia index and shorter PTT, suggesting deterioration in vascular integrity and structure. Higher fasting glucose and triglyceride levels were also observed in the depressed group, who also had higher cumulative risk scores indicative of increased engagement in unhealthy behaviours and higher probability of advanced atherosclerotic lesions. The sample size and number of males who completed all cardiovascular measures was small. Clinically depressed adolescents had poorer vascular functioning and increased CVD risk compared to controls, highlighting the need for early identification and intervention for the prevention of CVD in depressed youth. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Plotting equation for gaussian percentiles and a spreadsheet program for generating probability plots

    USGS Publications Warehouse

    Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.

    2002-01-01

    Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits, and have been widely used in the analysis and interpretation of sedimentologic data. We consider, in this work, the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots because of the apparently intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying equations for determining the plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis, including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
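The core of such a spreadsheet is the inverse Gaussian CDF applied to plotting positions, which linearizes the percentile axis. A minimal sketch (hypothetical helper using the Hazen plotting position, one of several conventions; not the published EXCEL template):

```python
from statistics import NormalDist

def app_positions(sorted_values):
    """Pair each sorted datum with the Gaussian score of its plotting
    position; plotting value against score reproduces an arithmetic
    probability paper (APP) plot with a linearized percentile axis."""
    n = len(sorted_values)
    inv = NormalDist().inv_cdf
    # Hazen plotting position (i - 0.5) / n
    return [(inv((i - 0.5) / n), v)
            for i, v in enumerate(sorted_values, start=1)]
```

A Gaussian sample then falls on an approximately straight line, which is what makes APP plots useful for granulometric data.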

  6. The Italian national trends in smoking initiation and cessation according to gender and education.

    PubMed

    Sardu, C; Mereu, A; Minerba, L; Contu, P

    2009-09-01

    OBJECTIVES. This study aims to assess trends in the initiation and cessation of smoking across successive birth cohorts, according to gender and education, in order to provide useful suggestions for tobacco control policy. STUDY DESIGN. The study is based on data from the "Health conditions and resort to sanitary services" survey carried out in Italy from October 2004 to September 2005 by the National Institute of Statistics. Through a multistage sampling procedure, a sample representative of the entire national territory was selected. In order to calculate trends in smoking initiation and cessation, data were stratified by birth cohort, gender and education level, and analyzed using the life table method. The cumulative probability of smoking initiation, across subsequent generations, shows a downward trend followed by a plateau. This result provides no support for the hypothesis that smoking initiation is occurring at ever earlier ages. The cumulative probability of quitting, across subsequent generations, follows an upward trend, highlighting the growing tendency of smokers to become "early quitters", who give up before 30 years of age. The results suggest that the Italian antismoking approach, for the most part targeted at preventing the initiation of smoking by emphasising its negative consequences, has an effect on early smoking cessation. Health policies should reinforce the existing trend of early quitting through specific actions. In addition, our results show that men with low education exhibit the highest probability of smoking initiation and the lowest probability of early quitting, and should therefore be targeted with special attention.
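The life-table calculation behind such cumulative initiation curves is straightforward; the hazards below are illustrative numbers only, not the survey's:

```python
def cumulative_initiation(hazards):
    """Combine per-age-interval probabilities of starting to smoke (among
    those who have not yet started) into a cumulative initiation curve."""
    not_started = 1.0
    curve = []
    for q in hazards:
        not_started *= (1.0 - q)
        curve.append(1.0 - not_started)
    return curve

# e.g. a constant 5% annual hazard over five ages:
curve = cumulative_initiation([0.05] * 5)
```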

  7. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    NASA Astrophysics Data System (ADS)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here: the first would not require modification of existing "deterministic" trigger or guideline values, whereas the second assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the existing deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases.
We here illustrate how both the prediction uncertainty and management rules can be expressed in a probabilistic framework, using groundwater metrics derived for a highly stressed groundwater system.
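The first interpretation reduces to reading an exceedance probability off the ensemble of model outputs; a minimal sketch (the 2 m value is the NSW trigger mentioned above; the drawdown values are hypothetical):

```python
def exceedance_probability(drawdowns, threshold=2.0):
    """Fraction of ensemble members exceeding the deterministic trigger."""
    return sum(1 for d in drawdowns if d > threshold) / len(drawdowns)

# four hypothetical ensemble members (drawdown in metres):
p_exceed = exceedance_probability([1.0, 2.5, 3.0, 0.5])
```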

  8. Survival of inlays and partial crowns made of IPS empress after a 10-year observation period and in relation to various treatment parameters.

    PubMed

    Stoll, Richard; Cappel, I; Jablonski-Momeni, Anahita; Pieper, K; Stachniss, V

    2007-01-01

    This study evaluated the long-term survival of inlays and partial crowns made of IPS Empress. For this purpose, the patient data of a prospective study were examined retrospectively and statistically evaluated. All of the inlays and partial crowns fabricated from IPS Empress within the Department of Operative Dentistry at the School of Dental Medicine of Philipps University, Marburg, Germany were systematically recorded in a database between 1991 and 2001. The corresponding patient files were reviewed at the end of 2001. The information gathered in this way was used to evaluate the survival of the restorations using the method described by Kaplan and Meier. A total of n = 1624 restorations were fabricated from IPS Empress within the observation period. During this time, n = 53 failures were recorded. The remaining restorations were observed for a mean period of 18.77 months. The failures were mainly attributed to fractures, endodontic problems and cementation errors. The last failure occurred after 82 months. At this stage, a cumulative survival probability of p = 0.81 was registered, with a standard error of 0.04. At this time, n = 30 restorations were still being observed. Restorations on vital teeth (n = 1588) showed 46 failures, with a cumulative survival probability of p = 0.82. Restorations performed on non-vital teeth (n = 36) showed seven failures, with a cumulative survival probability of p = 0.53. Highly significant differences were found between the two groups (p < 0.0001) in a log-rank test. No significant difference (p = 0.41) was found between the patients treated by students (n = 909) and those treated by qualified dentists (n = 715). Likewise, no difference (p = 0.13) was established between the restorations seated with a high-viscosity cement (n = 295) and those placed with a low-viscosity cement (n = 1329).
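The survival probabilities quoted above come from the Kaplan-Meier product-limit construction; a minimal sketch follows (hypothetical failure times, not the study's records):

```python
def kaplan_meier(records):
    """Cumulative survival curve from (time, failed) pairs, where
    failed=False marks a censored observation (e.g. end of follow-up)."""
    records = sorted(records)
    at_risk = len(records)
    survival, curve, i = 1.0, [], 0
    while i < len(records):
        t = records[i][0]
        deaths = censored = 0
        while i < len(records) and records[i][0] == t:
            if records[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= deaths + censored
    return curve

# failures at 2 and 4 months among five restorations, one censored at 3:
curve = kaplan_meier([(2, True), (3, False), (4, True), (5, False), (6, False)])
```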

  9. Ventilator-associated pneumonia in ARDS patients: the impact of prone positioning. A secondary analysis of the PROSEVA trial.

    PubMed

    Ayzac, L; Girard, R; Baboi, L; Beuret, P; Rabilloud, M; Richard, J C; Guérin, C

    2016-05-01

    The goal of this study was to assess the impact of prone positioning on the incidence of ventilator-associated pneumonia (VAP) and the role of VAP in mortality in a recent multicenter trial performed on patients with severe ARDS. This was an ancillary study of a prospective multicenter randomized controlled trial of early prone positioning in patients with severe ARDS. In suspected cases of VAP, the diagnosis was based on positive quantitative cultures of bronchoalveolar lavage fluid or tracheal aspirate at the 10^4 and 10^7 CFU/ml thresholds, respectively. The VAP cases were then subject to central, independent adjudication. The cumulative probabilities of VAP were estimated in each position group using the Aalen-Johansen estimator and compared using Gray's test. Univariate and multivariate Cox models were performed to assess the impact of VAP, used as a time-dependent covariate, on the mortality hazard during the ICU stay. In the supine and prone position groups, the incidence rate of VAP was 1.18 (0.86-1.60) and 1.54 (1.15-2.02) per 100 days of invasive mechanical ventilation (p = 0.10), respectively. The cumulative probability of VAP at 90 days was estimated at 46.5% (27-66) in the prone group and at 33.5% (23-44) in the supine group. The difference between the two cumulative probability curves was not statistically significant (p = 0.11). In the univariate Cox model, VAP was associated with an increase in the mortality rate during the ICU stay [HR 1.65 (1.05-2.61), p = 0.03]. The HR increased to 2.2 (1.39-3.52) (p < 0.001) after adjustment for position group, age, SOFA score, McCabe score, and immunodeficiency. In severe ARDS patients, prone positioning did not reduce the incidence of VAP, and VAP was associated with higher mortality.

  10. Cumulative Risk Disparities in Children's Neurocognitive Functioning: A Developmental Cascade Model

    ERIC Educational Resources Information Center

    Wade, Mark; Browne, Dillon T.; Plamondon, Andre; Daniel, Ella; Jenkins, Jennifer M.

    2016-01-01

    The current longitudinal study examined the role of cumulative social risk on children's theory of mind (ToM) and executive functioning (EF) across early development. Further, we also tested a cascade model of development in which children's social cognition at 18 months was hypothesized to predict ToM and EF at age 4.5 through intermediary…

  11. Cumulant generating function formula of heat transfer in ballistic systems with lead-lead coupling and general nonlinear systems

    NASA Astrophysics Data System (ADS)

    Li, Huanan

    2013-03-01

    Based on a two-time observation protocol, we consider heat transfer in a given time interval t_M in a lead-junction-lead system, taking coupling between the leads into account. In view of the two-time observation, consistency conditions are carefully verified in our specific family of quantum histories. Furthermore, its implication is briefly explored. Then using the nonequilibrium Green's function method, we obtain an exact formula for the cumulant generating function for heat transfer between the two leads, valid in both transient and steady-state regimes. Also, a compact formula for the cumulant generating function in the long-time limit is derived, for which the Gallavotti-Cohen fluctuation symmetry is explicitly verified. In addition, we briefly discuss Di Ventra's repartitioning trick regarding whether the repartitioning procedure of the total Hamiltonian affects the nonequilibrium steady-state current fluctuation. All kinds of properties of nonequilibrium current fluctuations, such as the fluctuation theorem in different time regimes, could be readily given according to these exact formulas. Finally, a practical formalism dealing with cumulants of heat transfer across general nonlinear quantum systems is established based on a field-theoretical/algebraic method.
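For reference, the generic structure behind such results (standard full-counting-statistics definitions, assumed here; not the paper's specific lead-lead expressions): the cumulant generating function for the heat Q transferred over the interval is

```latex
% generic full-counting-statistics definitions, not the paper's exact formulas
\mathcal{Z}(\xi) \;=\; \ln \big\langle e^{\,i\xi Q}\big\rangle ,
\qquad
\langle\!\langle Q^{n}\rangle\!\rangle \;=\;
\left.\frac{\partial^{n}\mathcal{Z}(\xi)}{\partial (i\xi)^{n}}\right|_{\xi=0},
```

and in the long-time limit the Gallavotti-Cohen symmetry takes the generic form $\mathcal{Z}(\xi) = \mathcal{Z}(-\xi + i(\beta_R - \beta_L))$, with $\beta_{L,R}$ the inverse temperatures of the two leads.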

  12. Reliability analysis of degradable networks with modified BPR

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Qing; Zhou, Chao-Fan; Jia, Bin; Zhu, Hua-Bing

    2017-12-01

    In this paper, the effect of a speed limit on degradable networks with capacity restrictions and forced flow is investigated. A link performance function considering the road capacity is proposed. Additionally, the probability density distribution and the cumulative distribution of link travel time are introduced for the degradable network. By distinguishing the value of the speed limit, four cases are discussed. Means and variances of the link and route travel times of the degradable road network are calculated. Besides, by performing numerical simulation experiments on a specific network, it is found that the speed limit strategy can reduce the travel time budget and the mean travel time of links and routes. Moreover, it reveals that the speed limit strategy can cut down the variances of network travel times to some extent.
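As a point of reference, the classical (unmodified) BPR link performance function takes the form below; α = 0.15 and β = 4 are the conventional defaults, assumed here, and the paper's modified form additionally accounts for capacity restrictions and forced flow:

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4):
    """Classical BPR form: free-flow travel time t0 inflated by congestion
    via the volume/capacity ratio."""
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)
```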

  13. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.

  14. Impact of Air Pollutants on Oxidative Stress in Common Autophagy-Mediated Aging Diseases

    PubMed Central

    Numan, Mohamed Saber; Brown, Jacques P.; Michou, Laëtitia

    2015-01-01

    Atmospheric pollution-induced cellular oxidative stress is probably one of the pathogenic mechanisms involved in most of the common autophagy-mediated aging diseases, including neurodegenerative diseases such as amyotrophic lateral sclerosis (ALS) and Alzheimer’s disease, as well as Paget’s disease of bone with or without frontotemporal dementia and inclusion body myopathy. Oxidative stress has serious damaging effects on cellular contents: DNA, RNA, cellular proteins, and cellular organelles. Autophagy has a pivotal role in recycling damaged non-functional organelles and misfolded or unfolded proteins. In this paper, we highlight, through a narrative review of the literature, that when autophagy processes are impaired during aging, in the presence of cumulative air pollution-induced cellular oxidative stress and due to a direct effect of air pollutants, autophagy-mediated aging diseases may occur. PMID:25690002

  15. [Genetic polymorphisms of 21 non-CODIS STR loci].

    PubMed

    Shao, Wei-bo; Zhang, Su-hua; Li, Li

    2011-02-01

    To investigate the genetic polymorphisms of 21 non-CODIS STR loci in the Han population from eastern China and to explore their forensic application value. Twenty-one non-CODIS STR loci were amplified with the AGCU 21+1 STR kit, using DNA samples obtained from 225 unrelated Han individuals from eastern China. The PCR products were analyzed with a 3130 Genetic Analyzer and genotyped with GeneMapper ID v3.2 software. The genetic data were statistically analyzed with PowerStats v12.xls and Cervus 2.0 software. The distributions of the 21 non-CODIS STR loci satisfied Hardy-Weinberg equilibrium. The heterozygosity (H) values were 0.596-0.804, the discrimination power (DP) values were 0.764-0.948, the probability of exclusion for duo testing (PEduo) values were 0.176-0.492, the probability of exclusion for trio testing (PEtrio) values were 0.334-0.663, and the polymorphic information content (PIC) values were 0.522-0.807. The cumulative probability of exclusion (CPE) for duo testing was 0.999707, the CPE for trio testing was 0.9999994, and the cumulative discrimination power (CDP) was 0.99999999999999999994. The 21 non-CODIS STR loci are highly polymorphic. They can be effectively used for personal identification and paternity testing in trio cases, and as a supplement in difficult duo paternity testing cases.
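The cumulative figures follow from the per-locus values by the standard independence product; an illustrative sketch (the three PE values below are placeholders, not the paper's 21 loci):

```python
def cumulative_probability_of_exclusion(pe_per_locus):
    """CPE = 1 - prod(1 - PE_i), assuming independent loci: a wrong
    candidate escapes exclusion only by escaping at every locus."""
    p_not_excluded = 1.0
    for pe in pe_per_locus:
        p_not_excluded *= (1.0 - pe)
    return 1.0 - p_not_excluded

# three hypothetical loci:
cpe = cumulative_probability_of_exclusion([0.3, 0.4, 0.5])
```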

  16. A long term study of pulmonary function among US refractory ceramic fibre workers

    PubMed Central

    LeMasters, Grace K; Hilbert, Timothy J; Levin, Linda S; Rice, Carol H; Borton, Eric K; Lockey, James E

    2010-01-01

    Background Cross-sectional studies have shown declines in lung function among refractory ceramic fibre (RCF) workers with increasing fibre exposure. This study followed current and former workers (n=1396) for up to 17 years and collected 5243 pulmonary function tests. Methods Cumulative fibre exposure and production years were categorised into exposure levels at five manufacturing locations. Conventional longitudinal models did not adequately partition age-related changes from other time-dependent variables. Therefore, a restricted cubic spline model was developed to account for the non-linear decline with age. Results Cumulative fibre exposure >60 fibre-months/cc showed a significant loss in lung function at the first test. When results were examined longitudinally, cumulative exposure was confounded with age, as workers with the highest cumulative exposure were generally older. A longitudinal model adjusted by age groups was implemented to control for this confounding. No consistent longitudinal loss in lung function was observed with RCF exposure. Smoking, initial weight and weight increase were significant factors. Conclusion No consistent decline was observed longitudinally with exposure to RCF, although cross-sectional and longitudinal findings were discordant. Confounding and accelerated lung function declines with ageing and the correlation of multiple time-dependent variables should be considered in order to minimise error and maximise precision. An innovative statistical methodology for these types of data is described. PMID:20798015

  17. Relationship between virological response and FIB-4 index in chronic hepatitis B patients with entecavir therapy.

    PubMed

    Li, Ni; Xu, Jing-Hang; Yu, Min; Wang, Sa; Si, Chong-Wen; Yu, Yan-Yan

    2015-11-21

    To investigate whether long-term low-level hepatitis B virus (HBV) DNA influences dynamic changes of the FIB-4 index in chronic hepatitis B (CHB) patients receiving entecavir (ETV) therapy with partial virological responses. We retrospectively analyzed 231 nucleos(t)ide (NA) naïve CHB patients from our previous study (NCT01926288) who received continuous ETV or ETV maleate therapy for three years. The patients were divided into partial virological response (PVR) and complete virological response (CVR) groups according to serum HBV DNA levels at week 48. Seventy-six patients underwent biopsies at baseline and at 48 wk. The performance of the FIB-4 index and area under the receiver operating characteristic (AUROC) curve for predicting fibrosis were determined for the patients undergoing biopsy. The primary objective of the study was to compare the cumulative probabilities of virological responses between the two groups during the treatment period. The secondary outcome was to observe dynamic changes of the FIB-4 index between CVR patients and PVR patients. For hepatitis B e antigen (HBeAg)-positive patients (n = 178), the cumulative probability of achieving undetectable levels at week 144 was 95% and 69% for CVR and PVR patients, respectively (P < 0.001). In the Cox proportional hazards model, a lower pretreatment serum HBV DNA level was an independent factor predicting maintained viral suppression. The cumulative probability of achieving undetectable levels of HBV DNA for HBeAg-negative patients (n = 53) did not differ between the two groups. The FIB-4 index efficiently identified fibrosis, with an AUROC of 0.80 (95%CI: 0.69-0.89). For HBeAg-positive patients, the FIB-4 index was higher in CVR patients than in PVR patients at baseline (1.89 ± 1.43 vs 1.18 ± 0.69, P < 0.001). There was no significant difference in the reduction of the FIB-4 index between the CVR and PVR groups from weeks 48 to 144 (-0.11 ± 0.47 vs -0.13 ± 0.49, P = 0.71). 
At week 144, the FIB-4 index levels were similar between the two groups (1.24 ± 0.87 vs 1.02 ± 0.73, P = 0.06). After multivariate logistic regression analysis, a lower baseline serum HBV DNA level was associated with improvement of liver fibrosis. In HBeAg-negative patients, the FIB-4 index did not differ between the two groups. The cumulative probabilities of HBV DNA responses showed significant differences between CVR and PVR HBeAg-positive CHB patients undergoing entecavir treatment for 144 wk. However, long-term low-level HBV DNA did not deteriorate the FIB-4 index, which was used to evaluate liver fibrosis, at the end of three years.

  18. Pulmonary function of U.S. coal miners related to dust exposure estimates.

    PubMed

    Attfield, M D; Hodous, T K

    1992-03-01

    This study of 7,139 U.S. coal miners used linear regression analysis to relate estimates of cumulative dust exposure to several pulmonary function variables measured during medical examinations undertaken between 1969 and 1971. The exposure data included newly derived cumulative dust exposure estimates for the period up to time of examination based on large data bases of underground airborne dust sampling measurements. Negative associations were found between measures of cumulative exposure and FEV1, FVC, and the FEV1/FVC ratio (p less than 0.001). In general, the relationships were similar to those reported for British coal miners. Overall, the results demonstrate an adverse effect of coal mine dust exposure on pulmonary function that occurs even in the absence of radiographically detected pneumoconiosis.

  19. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses).
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given severities) and vulnerability (the probability of a limit state performance being reached, given a certain severity). Then, for each landslide, all the exposed goods (structures and infrastructures) within the landslide area and within a buffer (representative of the maximum extension of the landslide in case of reactivation) are counted. The risk is the product of the damage probability and the ratio of the exposed goods of each landslide to the whole assets exposed to the same type of landslides. Since the risk is computed numerically and by the same procedure for all landslides, it is free from the subjective assessments implied in qualitative methods.
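In discrete form, the hazard-fragility convolution described above is a single weighted sum; the severity classes and probabilities below are hypothetical:

```python
def damage_probability(hazard_pmf, fragility):
    """P(damage) = sum over severities s of P(s) * P(limit state | s).
    hazard_pmf: {severity: probability of occurrence};
    fragility:  {severity: probability the limit state is reached}."""
    return sum(p * fragility[s] for s, p in hazard_pmf.items())

hazard = {"low": 0.70, "medium": 0.25, "high": 0.05}
fragility_functional = {"low": 0.05, "medium": 0.40, "high": 0.90}
p_damage = damage_probability(hazard, fragility_functional)
```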

  20. H-index of Collective Health professors in Brazil.

    PubMed

    Pereira, Julio Cesar Rodrigues; Bronhara, Bruna

    2011-06-01

    To estimate reference values and the hierarchy function of professors engaged in Collective Health in Brazil by analyzing the distribution of the h-index. From the Portal da Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (Portal of Coordination for the Improvement of Higher Education Personnel), 934 authors were identified in 2008, of whom 819 were analyzed. The h-index of each professor was obtained through the Web of Science using search algorithms controlling for namesakes and alternative spellings of their names. For each Brazilian region and for the country as a whole, we fitted an exponential probability density function to provide the population parameters and rate of decline by region. Ranking measures were identified using the complement of the cumulative probability function and the hierarchy function among authors according to the h-index by region. Among the professors analyzed, 29.8% had no citation record in Web of Science (h=0). The mean h for the country was 3.1, and the region with the greatest mean was the southern region (h=4.7). The median h for the country was 3.1, and the greatest median was for the southern region (3.2). Standardizing populations to one hundred, the first rank in the country was h=16, but stratification by region shows that, within the northeastern, southeastern and southern regions, a greater value is necessary for achieving the first rank. In the southern region, the index needed to achieve the first rank was h=24. Most of the Brazilian Collective Health authors, if assessed on the basis of the Web of Science h-index, did not exceed h=5. Regional differences exist, with the southeastern and northeastern regions being similar and the southern region being outstanding.
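
The ranking construction can be sketched as follows. The h-index sample below is invented for illustration; only the exponential-fit-plus-survival-function logic follows the abstract:

```python
import math

# Illustrative h-index sample (not the study's data); the method fits an
# exponential density (rate 1/mean) and ranks authors by the complement of
# the cumulative probability function, S(h) = exp(-h / mean).
h_sample = [0, 0, 1, 2, 3, 3, 4, 5, 6, 8, 12]
mean_h = sum(h_sample) / len(h_sample)

def survival(h):
    """Complement of the cumulative probability function under the fit."""
    return math.exp(-h / mean_h)

# Standardizing the population to 100 authors, the first rank is reached at
# the smallest h whose expected count of authors above it is at most one.
first_rank_h = next(h for h in range(100) if 100 * survival(h) <= 1.0)
```

With this toy sample (mean h = 4.0) the first-rank threshold is h = 19; regions with larger means push the threshold higher, matching the pattern reported for the southern region.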

  1. Cumulative psychosocial risk, parental socialization, and child cognitive functioning: A longitudinal cascade model.

    PubMed

    Wade, Mark; Madigan, Sheri; Plamondon, Andre; Rodrigues, Michelle; Browne, Dillon; Jenkins, Jennifer M

    2018-06-01

    Previous studies have demonstrated that various psychosocial risks are associated with poor cognitive functioning in children, and these risks frequently cluster together. In the current longitudinal study, we tested a model in which it was hypothesized that cumulative psychosocial adversity of mothers would have deleterious effects on children's cognitive functioning by compromising socialization processes within families (i.e., parental competence). A prospective community birth cohort of 501 families was recruited when children were newborns. At this time, mothers reported on their current psychosocial circumstances (socioeconomic status, teen parenthood, depression, etc.), which were summed into a cumulative risk score. Families were followed up at 18 months and 3 years, at which point maternal reflective capacity and cognitive sensitivity were measured, respectively. Child cognition (executive functioning, theory of mind, and language ability) was assessed at age 4.5 using age-appropriate observational and standardized tasks. Analyses controlled for child age, gender, number of children in the home, number of years married, and mothers' history of adversity. The results revealed significant declines in child cognition as well as maternal reflective capacity and cognitive sensitivity as the number of psychosocial risks increased. Moreover, longitudinal path analysis showed significant indirect effects from cumulative risk to all three cognitive outcomes via reflective capacity and cognitive sensitivity. Findings suggest that cumulative risk of mothers may partially account for child cognitive difficulties in various domains by disrupting key parental socialization competencies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. A Model for the Thermal and Chemical Evolution of the Moon's Interior: Implications for the Onset of Mare Volcanism

    NASA Technical Reports Server (NTRS)

    Hess, Paul C.; Parmentier, E. M.

    1995-01-01

    Crystallization of the lunar magma ocean creates a chemically stratified Moon consisting of an anorthositic crust and magma ocean cumulates overlying the primitive lunar interior. Within the magma ocean cumulates the last liquids to crystallize form dense, ilmenite-rich cumulates that contain high concentrations of incompatible radioactive elements. The underlying olivine-orthopyroxene cumulates are also stratified with later crystallized, denser, more Fe-rich compositions at the top. This paper explores the chemical and thermal consequences of an internal evolution model accounting for the possible role of these sources of chemical buoyancy. Rayleigh-Taylor instability causes the dense ilmenite-rich cumulate layer and underlying Fe-rich cumulates to sink toward the center of the Moon, forming a dense lunar core. After this overturn, radioactive heating within the ilmenite-rich cumulate core heats the overlying mantle, causing it to melt. In this model, the source region for high-TiO2 mare basalts is a convectively mixed layer above the core-mantle boundary which would contain small and variable amounts of admixed ilmenite and KREEP. This deep high-pressure melting, as required for mare basalts, occurs after a reasonable time interval to explain the onset of mare basalt volcanism if the content of radioactive elements in the core and the chemical density gradients above the core are sufficiently high but within a range of values that might have been present in the Moon. Regardless of details implied by particular model parameters, gravitational overturn driven by the high density of magma ocean Fe-rich cumulates should concentrate high-TiO2 mare basalt sources, and probably a significant fraction of radioactive heating, toward the center of the Moon. This will have important implications for both the thermal evolution of the Moon and for mare basalt genesis.

  3. Late-life factors associated with healthy aging in older men.

    PubMed

    Bell, Christina L; Chen, Randi; Masaki, Kamal; Yee, Priscilla; He, Qimei; Grove, John; Donlon, Timothy; Curb, J David; Willcox, D Craig; Poon, Leonard W; Willcox, Bradley J

    2014-05-01

    To identify potentially modifiable late-life biological, lifestyle, and sociodemographic factors associated with overall and healthy survival to age 85. Prospective longitudinal cohort study with 21 years of follow-up (1991-2012). Hawaii Lifespan Study. American men of Japanese ancestry (mean age 75.7, range 71-82) without baseline major clinical morbidity and functional impairments (N = 1,292). Overall survival and healthy survival (free from six major chronic diseases and without physical or cognitive impairment) to age 85. Factors were measured at late-life baseline examinations (1991-1993). Of 1,292 participants, 1,000 (77%) survived to 85 (34% healthy) and 309 (24%) to 95 (<1% healthy). Late-life factors associated with survival and healthy survival included biological (body mass index, ankle-brachial index, cognitive score, blood pressure, inflammatory markers), lifestyle (smoking, alcohol use, physical activity), and sociodemographic factors (education, marital status). Cumulative late-life baseline risk factor models demonstrated that age-standardized (at 70) probability of survival to 95 ranged from 27% (no factors) to 7% (≥ 5 factors); probability of survival to 100 ranged from 4% (no factors) to 0.1% (≥ 5 factors). Age-standardized (at 70) probability of healthy survival to 90 ranged from 4% (no factors) to 0.01% (≥ 5 factors). There were nine healthy survivors at 95 and one healthy survivor at 100. Several potentially modifiable risk factors in men in late life (mean age 75.7) were associated with markedly greater probability of subsequent healthy survival and longevity. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.

  4. Distribution of arsenic and copper in sediment pore water: an ecological risk assessment case study for offshore drilling waste discharges.

    PubMed

    Sadiq, Rehan; Husain, Tahir; Veitch, Brian; Bose, Neil

    2003-12-01

    Due to the hydrophobic nature of synthetic based fluids (SBFs), drilling cuttings are not very dispersive in the water column and settle down close to the disposal site. Arsenic and copper are two important toxic heavy metals, among others, found in the drilling waste. In this article, the concentrations of heavy metals are determined using a steady state "aquivalence-based" fate model in a probabilistic mode. Monte Carlo simulations are employed to determine pore water concentrations. A hypothetical case study is used to determine the water quality impacts for two discharge options: 4% and 10% attached SBFs, which correspond to the best available technology option and the current discharge practice in the U.S. offshore. The exposure concentration (CE) is a predicted environmental concentration, which is adjusted for exposure probability and bioavailable fraction of heavy metals. The response of the ecosystem (RE) is defined by developing an empirical distribution function of predicted no-effect concentration. The pollutants' pore water concentrations within the radius of 750 m are estimated and cumulative distributions of risk quotient (RQ=CE/RE) are developed to determine the probability of RQ greater than 1.

  5. Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads

    NASA Technical Reports Server (NTRS)

    Brown, A. M.; McGhee, D. S.

    2003-01-01

    Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented, along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
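
The Monte Carlo combination can be sketched in a few lines. The amplitudes and the 99.7th percentile below are illustrative assumptions standing in for a mission-specific design percentile:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo combination of a harmonic load (fixed amplitude, uniformly
# random phase) and a zero-mean Gaussian random load; amplitudes are
# illustrative, and the 99.7th percentile stands in for a design percentile.
A_sine, sigma_rand, n = 3.0, 1.0, 200_000
phase = rng.uniform(0.0, 2.0 * np.pi, n)
combined = A_sine * np.sin(phase) + rng.normal(0.0, sigma_rand, n)

design_value = np.percentile(combined, 99.7)   # consistent-percentile load
naive_sum = A_sine + 3.0 * sigma_rand          # traditional combination
```

Because the sine peak and a 3-sigma random excursion rarely coincide, the percentile-based design value falls below the traditional peak-plus-3-sigma sum, which is the load reduction the abstract refers to.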

  6. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    PubMed

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). To that end, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods for their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that these probabilistic models can be useful to describe the road accident blackspot datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
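
The genesis of such discrete models can be sketched by discretizing a continuous survival function; the (xi, sigma) values below are illustrative, not estimates from the DGT data:

```python
# Discretizing the continuous generalized Pareto survival function
# S(x) = (1 + xi * x / sigma) ** (-1 / xi) yields a discrete pmf
# P(X = k) = S(k) - S(k + 1); (xi, sigma) values are illustrative.
def gpd_survival(x, xi=0.5, sigma=2.0):
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

def dgpd_pmf(k, xi=0.5, sigma=2.0):
    """P(X = k) for k = 0, 1, 2, ..."""
    return gpd_survival(k, xi, sigma) - gpd_survival(k + 1, xi, sigma)

# The pmf telescopes: its sum over k < K equals 1 - S(K), close to 1
total = sum(dgpd_pmf(k) for k in range(10_000))
```

The heavy (power-law) tail of this construction is what lets it capture extreme blackspots better than the negative binomial benchmark.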

  7. Earth Observing System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Hejduk, Matthew D.

    2016-01-01

    The purpose of covariance realism is to properly size a primary object's covariance in order to add validity to the calculation of the probability of collision. The covariance realism technique in this paper consists of three parts: collection/calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics. An empirical cumulative distribution function (ECDF) Goodness-of-Fit (GOF) method is employed to determine if a covariance is properly sized by comparing the empirical distribution of Mahalanobis distance calculations to the hypothesized parent 3-DoF chi-squared distribution. To realistically size a covariance for collision probability calculations, this study uses a state noise compensation algorithm that adds process noise to the definitive epoch covariance to account for uncertainty in the force model. Process noise is added until the GOF tests pass a group significance level threshold. The results of this study indicate that when outliers attributed to persistently high or extreme levels of solar activity are removed, the aforementioned covariance realism compensation method produces a tuned covariance with up to 80 to 90% of the covariance propagation timespan passing the GOF tests (against a 60% minimum passing threshold), a quite satisfactory and useful result.
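
A minimal version of the ECDF goodness-of-fit check might look like the following; the covariance matrix, sample size, and KS-style statistic are illustrative stand-ins for the paper's full procedure:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def chi2_3_cdf(x):
    """CDF of the chi-squared distribution with 3 degrees of freedom."""
    return (math.erf(math.sqrt(x / 2.0))
            - math.sqrt(2.0 / math.pi) * math.sqrt(x) * math.exp(-x / 2.0))

# Squared Mahalanobis distances of 3-D state errors drawn from the same
# covariance they are tested against, so the GOF check should pass here.
P = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])
errors = rng.multivariate_normal(np.zeros(3), P, size=2000)
d2 = np.sort(np.einsum('ij,jk,ik->i', errors, np.linalg.inv(P), errors))

# Kolmogorov-Smirnov-style distance between the ECDF and the chi2(3) CDF
ecdf = np.arange(1, d2.size + 1) / d2.size
ks = float(max(abs(ecdf[i] - chi2_3_cdf(d2[i])) for i in range(d2.size)))
```

If the covariance were undersized, the Mahalanobis distances would be inflated relative to chi2(3) and `ks` would grow; adding process noise until the statistic clears the significance threshold is the tuning loop the abstract describes.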

  8. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. 
This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality"of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
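
The root mean square error propagation used above can be sketched with illustrative category uncertainties chosen from within the typical ranges reported:

```python
import math

# Root mean square propagation of the four procedural uncertainty categories
# into a cumulative probable uncertainty; the percentages are illustrative
# values within the typical ranges reported above, not study results.
streamflow, collection, storage, laboratory = 10.0, 20.0, 5.0, 15.0  # +/- %

cumulative = math.sqrt(streamflow**2 + collection**2
                       + storage**2 + laboratory**2)
# roughly 27% cumulative probable uncertainty for this scenario
```

Because the categories add in quadrature, the largest contributor (here sample collection) dominates the total, which is why the authors argue quality control effort is often misdirected.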

  9. A Review on Family History of Breast Cancer: Screening and Counseling Proposals for Women with Familial (Non-Hereditary) Breast Cancer.

    ERIC Educational Resources Information Center

    Unic, Ivana; Stalmeier, Peep F. M.; Peer, Petronella G. M.; van Daal, Willem A. J.

    1997-01-01

    Studies of variables predicting familial breast cancer (N=59) were analyzed to develop screening recommendations for women with nonhereditary familial breast cancer present. The pooled relative risk (RR) and cumulative probability were used to estimate risk. Data and conclusions are presented. Recommendations for screening and counseling are…

  10. The priority heuristic: making choices without trade-offs.

    PubMed

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2006-04-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, the authors generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (a) the Allais paradox, (b) risk aversion for gains if probabilities are high, (c) risk seeking for gains if probabilities are low (e.g., lottery tickets), (d) risk aversion for losses if probabilities are low (e.g., buying insurance), (e) risk seeking for losses if probabilities are high, (f) the certainty effect, (g) the possibility effect, and (h) intransitivities. The authors test how accurately the heuristic predicts people's choices, compared with previously proposed heuristics and 3 modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. ((c) 2006 APA, all rights reserved).

  11. Experimental cosmic statistics - I. Variance

    NASA Astrophysics Data System (ADS)

    Colombi, Stéphane; Szapudi, István; Jenkins, Adrian; Colberg, Jörg

    2000-04-01

    Counts-in-cells are measured in the τCDM Virgo Hubble Volume simulation. This large N-body experiment has 10^9 particles in a cubic box of size 2000 h^-1 Mpc. The unprecedented combination of size and resolution allows, for the first time, a realistic numerical analysis of the cosmic errors and cosmic correlations of statistics related to counts-in-cells measurements, such as the probability distribution function P_N itself, its factorial moments F_k and the related cumulants ξ̄ and S_N. These statistics are extracted from the whole simulation cube, as well as from 4096 subcubes of size 125 h^-1 Mpc, each representing a virtual random realization of the local universe. The measurements and their scatter over the subvolumes are compared to the theoretical predictions of Colombi, Bouchet & Schaeffer for P_0, and of Szapudi & Colombi (SC) and Szapudi, Colombi & Bernardeau (SCB) for the factorial moments and the cumulants. The general behaviour of the experimental variance and cross-correlations as functions of scale and order is well described by theoretical predictions, with a few per cent accuracy in the weakly non-linear regime for the cosmic error on factorial moments. On highly non-linear scales, however, all variants of the hierarchical model used by SC and SCB to describe clustering appear to become increasingly approximate, which leads to a slight overestimation of the error, by about a factor of two in the worst case. Because of the needed supplementary perturbative approach, the theory is less accurate for non-linear estimators, such as cumulants, than for factorial moments. The cosmic bias is evaluated as well and, in agreement with SCB, is found to be insignificant compared with the cosmic variance in all regimes investigated. While higher order statistics were previously evaluated in several simulations, this work presents textbook quality measurements of S_N, 3 <= N <= 10, in an unprecedented dynamic range of 0.05 ≲ ξ̄ ≲ 50. 
In the weakly non-linear regime the results confirm previous findings and agree remarkably well with perturbation theory predictions including the one-loop corrections based on spherical collapse by Fosalba & Gaztañaga. Extended perturbation theory is confirmed on all scales.
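
The factorial moments F_k = <N(N-1)...(N-k+1)> at the heart of these statistics can be illustrated on synthetic counts; for an unclustered (Poisson) field F_k = nbar^k exactly, so departures from that value measure clustering. The Poisson draws below are an assumption for illustration, not simulation data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Factorial moments F_k = <N(N-1)...(N-k+1)> of counts-in-cells; synthetic
# Poisson counts (mean 5) stand in for the simulation's cell counts.
counts = rng.poisson(5.0, 200_000)

def factorial_moment(counts, k):
    vals = np.ones(counts.size, dtype=float)
    for i in range(k):
        vals *= counts - i       # accumulate the falling factorial
    return float(vals.mean())

f2 = factorial_moment(counts, 2)   # close to nbar**2 = 25 for Poisson counts
```

A clustered field would give f2 noticeably above nbar**2; the cumulants ξ̄ and S_N are then built from ratios of these moments.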

  12. A Geometric View of the Mean of a Set of Numbers

    ERIC Educational Resources Information Center

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2016-01-01

    The sample mean is sometimes depicted as a fulcrum placed under the Dot plot. We provide an alternative geometric visualization of the sample mean using the empirical cumulative distribution function or the cumulative histogram data.

  13. Evidence for criticality in financial data

    NASA Astrophysics Data System (ADS)

    Ruiz, G.; de Marcos, A. F.

    2018-01-01

    We provide evidence that the cumulative distributions of absolute normalized returns for the 100 American companies with the highest market capitalization uncover critical behavior across different time scales Δt. Such cumulative distributions, in accordance with a variety of complex systems, financial ones included, can be modeled by the cumulative distribution functions of q-Gaussians, the distribution that, in the context of nonextensive statistical mechanics, maximizes a non-Boltzmannian entropy. These q-Gaussians are characterized by two parameters, namely (q, β), that are uniquely defined by Δt. From these dependencies, we find a monotonic relationship between q and β, which can be seen as evidence of criticality. We numerically determine the various exponents which characterize this criticality.
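
A q-Gaussian cumulative distribution of the kind described can be sketched numerically; the (q, β) pair and the integration grid below are illustrative, not fitted to market data:

```python
import numpy as np

# Unnormalized q-Gaussian density for q > 1:
# p(x) ∝ [1 + (q-1) * beta * x**2] ** (-1/(q-1));
# the (q, beta) values are illustrative, not fitted to returns.
def q_gaussian(x, q=1.5, beta=1.0):
    return (1.0 + (q - 1.0) * beta * x**2) ** (-1.0 / (q - 1.0))

x = np.linspace(-20.0, 20.0, 40_001)
dx = x[1] - x[0]
pdf = q_gaussian(x)
pdf /= pdf.sum() * dx                 # numerical normalization
cdf = np.cumsum(pdf) * dx             # cumulative distribution function
```

For q > 1 the tails decay as a power law rather than exponentially, which is what allows the q-Gaussian CDF to track the fat-tailed cumulative return distributions across scales Δt.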

  14. Measurement of flow harmonics with multi-particle cumulants in Pb+Pb collisions at √s_NN = 2.76 TeV with the ATLAS detector

    DOE PAGES

    Aad, G.

    2014-11-26

    ATLAS measurements of the azimuthal anisotropy in lead-lead collisions at √s_NN = 2.76 TeV are shown using a dataset of approximately 7 μb^-1 collected at the LHC in 2010. The measurements are performed for charged particles with transverse momenta 0.5 < p_T < 20 GeV and in the pseudorapidity range |η| < 2.5. The anisotropy is characterized by the Fourier coefficients, v_n, of the charged-particle azimuthal angle distribution for n = 2-4. The Fourier coefficients are evaluated using multi-particle cumulants calculated with the generating function method. Results on the transverse momentum, pseudorapidity and centrality dependence of the v_n coefficients are presented. The elliptic flow, v_2, is obtained from the two-, four-, six- and eight-particle cumulants, while higher-order coefficients, v_3 and v_4, are determined with two- and four-particle cumulants. Flow harmonics v_n measured with four-particle cumulants are significantly reduced compared to the measurement involving two-particle cumulants. A comparison to v_n measurements obtained using different analysis methods and previously reported by the LHC experiments is also shown. Results of measurements of flow fluctuations evaluated with multi-particle cumulants are shown as a function of transverse momentum and the collision centrality. Models of the initial spatial geometry and its fluctuations fail to describe the flow fluctuation measurements.
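
The two-particle cumulant, the simplest member of the family used above, can be illustrated on toy events with a known input v2; the event sizes and the accept-reject sampling below are assumptions for illustration, not the ATLAS analysis:

```python
import cmath
import math
import random

random.seed(0)

# Toy events with azimuthal angles drawn from dN/dphi ∝ 1 + 2*v2*cos(2*phi),
# from which v2 is recovered via the two-particle cumulant c2{2} = <<2>>.
def make_event(m=500, v2=0.1):
    phis = []
    while len(phis) < m:                     # accept-reject sampling
        phi = random.uniform(0.0, 2.0 * math.pi)
        if random.random() * (1.0 + 2.0 * v2) < 1.0 + 2.0 * v2 * math.cos(2.0 * phi):
            phis.append(phi)
    return phis

num = den = 0.0
for _ in range(200):
    phis = make_event()
    m = len(phis)
    q2 = sum(cmath.exp(2j * phi) for phi in phis)   # flow vector Q_2
    num += abs(q2) ** 2 - m                         # sum over distinct pairs
    den += m * (m - 1)

v2_est = math.sqrt(num / den)   # v2{2}, close to the input v2 = 0.1
```

Higher cumulants such as v2{4} subtract the two-particle contribution squared, which suppresses few-particle ("non-flow") correlations and yields the reduced values the measurement reports.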

  15. Spectrum sensing based on cumulative power spectral density

    NASA Astrophysics Data System (ADS)

    Nasser, A.; Mansour, A.; Yao, K. C.; Abdallah, H.; Charara, H.

    2017-12-01

    This paper presents new spectrum sensing algorithms based on the cumulative power spectral density (CPSD). The proposed detectors examine the CPSD of the received signal to decide on the absence or presence of the primary user (PU) signal. These detectors require the noise to be white in the band of interest. The false alarm and detection probabilities are derived analytically and simulated under Gaussian and Rayleigh fading channels. The proposed detectors perform better than the energy detector (ED) or the cyclostationary detector (CSD). Moreover, in the presence of noise uncertainty (NU), they are shown to be more robust than the ED, with less performance loss. To cope with the NU, we also modified our algorithms to be independent of the noise variance.

  16. Validity of an adaptation of the Framingham cardiovascular risk function: the VERIFICA study

    PubMed Central

    Marrugat, Jaume; Subirana, Isaac; Comín, Eva; Cabezas, Carmen; Vila, Joan; Elosua, Roberto; Nam, Byung‐Ho; Ramos, Rafel; Sala, Joan; Solanas, Pascual; Cordón, Ferran; Gené‐Badia, Joan; D'Agostino, Ralph B

    2007-01-01

    Background To assess the reliability and accuracy of the Framingham coronary heart disease (CHD) risk function adapted by the Registre Gironí del Cor (REGICOR) investigators in Spain. Methods A 5‐year follow‐up study was completed in 5732 participants aged 35–74 years. The adaptation consisted of using in the function the average population risk factor prevalence and the cumulative incidence observed in Spain instead of those from Framingham in a Cox proportional hazards model. Reliability and accuracy in estimating the observed cumulative incidence were tested with area‐under‐the‐curve comparison and a goodness‐of‐fit test, respectively. Results The Kaplan–Meier CHD cumulative incidence during the follow‐up was 4.0% in men and 1.7% in women. The estimates were 10.4% (original Framingham function) and 4.8% (REGICOR adaptation) in men, and 3.6% and 2.0%, respectively, in women. The REGICOR‐adapted function's estimate did not differ from the observed cumulative incidence (goodness of fit: in men, p = 0.078; in women, p = 0.256), whereas all the original Framingham function estimates differed significantly (p<0.001). Reliabilities of the original Framingham function and of the best Cox model fit with the study data were similar in men (area under the receiver operating characteristic curve 0.68 and 0.69, respectively, p = 0.273), whereas the best Cox model fitted better in women (0.73 and 0.81, respectively, p<0.001). Conclusion The Framingham function adapted to local population characteristics accurately and reliably predicted the 5‐year CHD risk for patients aged 35–74 years, in contrast with the original function, which consistently overestimated the actual risk. PMID:17183014

  17. Impact of an effective multidrug-resistant tuberculosis control programme in the setting of an immature HIV epidemic: system dynamics simulation model.

    PubMed

    Atun, Rifat A; Lebcir, Reda; Drobniewski, Francis; Coker, Richard J

    2005-08-01

    This study sought to determine the impact of an effective programme of multidrug-resistant tuberculosis (MDRTB) control on a population that is witnessing an explosive HIV epidemic among injecting drug users (IDUs), where the prevalence of MDRTB is already high. A transmission model was constructed that represents the dynamics of drug-susceptible tuberculosis (DSTB), MDRTB and HIV spread among the adult population of Samara Oblast, Russia. Model inputs were drawn from official notifications of tuberculosis and of HIV infection, estimates of MDRTB derived from surveillance studies, population data from official regional statistics, data on transmission probabilities from peer-reviewed publications and informed estimates, and policy-makers' estimates of IDU populations. Two scenarios of programme effectiveness for MDRTB were modelled and run over a period of 10 years to predict cumulative deaths. In a population of 3.3 million with a high prevalence of MDRTB, an emerging epidemic of HIV among IDUs, and a functioning directly observed treatment, short-course (DOTS) programme, the model predicts that under low cure rates for MDRTB the expected cumulative deaths from tuberculosis will reach 6303 at 10 years, including 1900 deaths from MDRTB. Under high cure rates for MDRTB, 4465 deaths will occur, including 134 deaths from MDRTB. At 10 years there is little impact of the MDRTB epidemic on HIV-infected populations, but as the HIV epidemic matures the impact becomes substantial. When the model is extended to 20 years, cumulative deaths from MDRTB become very high if cure rates for MDRTB are low, and cumulative deaths in the HIV-infected population, likewise, are profoundly affected. In the presence of an immature HIV epidemic, failure to actively control MDRTB may result in approximately a third more deaths than if effective treatment is given. As the HIV epidemic matures, the impact of MDRTB grows substantially if MDRTB control strategies are ineffective. 
The epidemiological starting point for these scenarios is present in many regions within the former Soviet Union and this analysis suggests control of MDRTB should be an urgent priority.

  18. Corneal inflammatory events with daily silicone hydrogel lens wear.

    PubMed

    Szczotka-Flynn, Loretta; Jiang, Ying; Raghupathy, Sangeetha; Bielefeld, Roger A; Garvey, Matthew T; Jacobs, Michael R; Kern, Jami; Debanne, Sara M

    2014-01-01

    This study aimed to determine the probability and risk factors for developing a corneal inflammatory event (CIE) during daily wear of lotrafilcon A silicone hydrogel contact lenses. Eligible participants (n = 218) were fit with lotrafilcon A lenses for daily wear and followed up for 12 months. Participants were randomized to either a polyhexamethylene biguanide-preserved multipurpose solution or a one-step peroxide disinfection system. The main exposures of interest were bacterial contamination of lenses, cases, lid margins, and ocular surface. Kaplan-Meier (KM) plots were used to estimate the cumulative unadjusted probability of remaining free from a CIE, and multivariate Cox proportional hazards regression was used to model the hazard of experiencing a CIE. The KM unadjusted cumulative probability of remaining free from a CIE for both lens care groups combined was 92.3% (95% confidence interval [CI], 88.1 to 96.5%). There was one participant with microbial keratitis, five participants with asymptomatic infiltrates, and seven participants with contact lens peripheral ulcers, providing KM survival estimates of 92.8% (95% CI, 88.6 to 96.9%) and 98.1% (95% CI, 95.8 to 100.0%) for remaining free from noninfectious and symptomatic CIEs, respectively. The presence of substantial (>100 colony-forming units) coagulase-negative staphylococci bioburden on lid margins was associated with about a five-fold increased risk for the development of a CIE (p = 0.04). The probability of experiencing a CIE during daily wear of lotrafilcon A contact lenses is low, and symptomatic CIEs are rare. Patient factors, such as high levels of bacterial bioburden on lid margins, contribute to the development of noninfectious CIEs during daily wear of silicone hydrogel lenses.
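
The Kaplan-Meier estimate used in the study can be sketched minimally; the (months, event) pairs below are invented for illustration, not the study's data:

```python
# Minimal Kaplan-Meier estimator of the probability of remaining CIE-free;
# the (months, event) pairs are illustrative, not the study's data.
def kaplan_meier(times, events):
    """events: 1 = CIE observed, 0 = censored. Returns [(t, S(t)), ...]."""
    at_risk = len(times)
    curve, s = [], 1.0
    for t, e in sorted(zip(times, events)):
        if e == 1:                    # survival drops only at event times
            s *= (at_risk - 1) / at_risk
        curve.append((t, s))
        at_risk -= 1                  # each subject leaves the risk set
    return curve

curve = kaplan_meier([2, 3, 5, 7, 8, 10, 12, 12], [0, 1, 0, 1, 0, 0, 0, 0])
```

This per-subject form matches the standard product-limit estimator when event times are distinct, as they are here; censored subjects shrink the risk set without reducing the survival estimate, which is how the study's 92.3% CIE-free probability is obtained from incomplete follow-up.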

  19. Assessing Rates and Predictors of Tachyphylaxis During the Prevention of Recurrent Episodes of Depression With Venlafaxine ER for Two Years (PREVENT) Study

    PubMed Central

    Rothschild, Anthony J.; Dunlop, Boadie W.; Dunner, David L.; Friedman, Edward S.; Gelenberg, Alan; Holland, Peter; Kocsis, James H.; Kornstein, Susan G.; Shelton, Richard; Trivedi, Madhukar H.; Zajecka, John M.; Goldstein, Corey; Thase, Michael E.; Pedersen, Ron; Keller, Martin B.

    2013-01-01

    Background Antidepressant tachyphylaxis describes the return of apathetic depressive symptoms, such as fatigue and decreased motivation, despite continued use of a previously effective treatment. Methods Data were collected from a multiphase, double-blind, placebo-controlled study that assessed the efficacy of venlafaxine extended release (ER) during 2 sequential 1-year maintenance phases (A and B) in patients with recurrent major depressive disorder (MDD). The primary outcome was the cumulative probability of tachyphylaxis in patients receiving venlafaxine ER, fluoxetine, or placebo. Tachyphylaxis was defined as a Rothschild Scale for Antidepressant Tachyphylaxis (RSAT) score ≥ 7 in patients with a prior satisfactory therapeutic response. A Kaplan-Meier estimate was used to assess the cumulative probability of not experiencing tachyphylaxis, and a 2-sided Fisher exact test was used to assess the relationship between tachyphylaxis and recurrence. Results The maintenance phase A population comprised 337 patients (venlafaxine ER [n = 129], fluoxetine [n = 79], placebo [n = 129]), whereas 128 patients (venlafaxine ER [n = 43], fluoxetine [n = 45], placebo [n = 40]) were treated during maintenance phase B. No difference in the probability of experiencing tachyphylaxis was observed between the active treatment groups during either maintenance phase; however, a significant difference between venlafaxine ER and placebo was observed at the completion of maintenance phase A. A significant relationship between tachyphylaxis and recurrence was observed. Limitations Despite demonstrating psychometric validity and reliability, the current definition of tachyphylaxis has not been widely studied. Conclusions Although no significant differences were observed in the probability of tachyphylaxis among patients receiving active treatment, the relationship between tachyphylaxis and recurrence suggests that tachyphylaxis may be a prodrome of recurrence. PMID:19752838

  20. 40 CFR 230.11 - Factual determinations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... proposed disposal site, and the availability of contaminants. (e) Aquatic ecosystem and organism... individually and cumulatively, on the structure and function of the aquatic ecosystem and organisms... aquatic ecosystem. (1) Cumulative impacts are the changes in an aquatic ecosystem that are attributable to...

  1. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    PubMed

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
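The "simple model" described above combines the distribution of lifetime partner counts with a per-partnership acquisition probability. A minimal sketch of that combination, with made-up numbers rather than the authors' inputs:

```python
def lifetime_acquisition_prob(partner_dist, per_partner_prob):
    """P(acquire at least once) = 1 - sum_n f(n) * (1 - q)^n,
    where f(n) is the population fraction with n lifetime partners and
    q is the per-partnership acquisition probability."""
    q = per_partner_prob
    return 1.0 - sum(f * (1.0 - q) ** n for n, f in partner_dist.items())

# Hypothetical distribution of lifetime partner counts (fractions sum to 1)
dist = {1: 0.25, 2: 0.15, 4: 0.25, 8: 0.20, 15: 0.15}
p = lifetime_acquisition_prob(dist, 0.40)   # illustrative q; p is roughly 0.76
```

Even a moderate per-partnership probability pushes the lifetime probability high, because escape requires avoiding acquisition across every partnership.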

  2. Surface slip during large Owens Valley earthquakes

    USGS Publications Warehouse

    Haddon, E.K.; Amos, C.B.; Zielke, O.; Jayko, Angela S.; Burgmann, R.

    2016-01-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ∼1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ∼0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ∼6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7–11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ∼7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ∼0.6 and 1.6 mm/yr (1σ) over the late Quaternary.
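Stacking per-measurement PDFs into a cumulative offset probability distribution (COPD) can be sketched by summing distributions on an offset grid. Gaussians and the offsets below are stand-ins; the study derives uniquely shaped PDFs from cross-correlation of topographic features.

```python
import math

def stacked_copd(offsets, grid):
    """Sum per-measurement offset PDFs on a grid of slip values.
    Gaussian PDFs stand in for cross-correlation-derived ones."""
    def pdf(x, mu, sigma):
        z = (x - mu) / sigma
        return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
    return [sum(pdf(x, mu, s) for mu, s in offsets) for x in grid]

# Hypothetical lateral offsets (metres) with 1-sigma uncertainties
offsets = [(3.1, 0.3), (3.3, 0.2), (3.5, 0.3), (7.0, 0.5)]
grid = [i * 0.05 for i in range(201)]          # 0 to 10 m
copd = stacked_copd(offsets, grid)
peak = grid[copd.index(max(copd))]             # dominant single-event offset
```

Clustered measurements reinforce each other into a COPD peak (here near 3.3 m), while isolated larger offsets form the secondary, multi-event peaks.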

  3. Greenhouse-gas emission targets for limiting global warming to 2 degrees C.

    PubMed

    Meinshausen, Malte; Meinshausen, Nicolai; Hare, William; Raper, Sarah C B; Frieler, Katja; Knutti, Reto; Frame, David J; Allen, Myles R

    2009-04-30

More than 100 countries have adopted a global warming limit of 2 degrees C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000-50 period that would limit warming throughout the twenty-first century to below 2 degrees C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 degrees C relative to pre-industrial temperatures. Limiting cumulative CO(2) emissions over 2000-50 to 1,000 Gt CO(2) yields a 25% probability of warming exceeding 2 degrees C, and a limit of 1,440 Gt CO(2) yields a 50% probability, given a representative estimate of the distribution of climate system properties. As known 2000-06 CO(2) emissions were approximately 234 Gt CO(2), less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiqués envisage halved global GHG emissions by 2050, for which we estimate a 12-45% probability of exceeding 2 degrees C, assuming 1990 as the emission base year and a range of published climate sensitivity distributions. Emissions levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 degrees C rises to 53-87% if global GHG emissions are still more than 25% above 2000 levels in 2020.

  4. Single-Word Predictions of Upcoming Language During Comprehension: Evidence from the Cumulative Semantic Interference Task

    PubMed Central

    Kleinman, Daniel; Runnqvist, Elin; Ferreira, Victor S.

    2015-01-01

    Comprehenders predict upcoming speech and text on the basis of linguistic input. How many predictions do comprehenders make for an upcoming word? If a listener strongly expects to hear the word “sock”, is the word “shirt” partially expected as well, is it actively inhibited, or is it ignored? The present research addressed these questions by measuring the “downstream” effects of prediction on the processing of subsequently presented stimuli using the cumulative semantic interference paradigm. In three experiments, subjects named pictures (sock) that were presented either in isolation or after strongly constraining sentence frames (“After doing his laundry, Mark always seemed to be missing one…”). Naming sock slowed the subsequent naming of the picture shirt – the standard cumulative semantic interference effect. However, although picture naming was much faster after sentence frames, the interference effect was not modulated by the context (bare vs. sentence) in which either picture was presented. According to the only model of cumulative semantic interference that can account for such a pattern of data, this indicates that comprehenders pre-activated and maintained the pre-activation of best sentence completions (sock) but did not maintain the pre-activation of less likely completions (shirt). Thus, comprehenders predicted only the most probable completion for each sentence. PMID:25917550

  5. Estimation and comparison of cumulative incidences of biliary self-expandable metallic stent dysfunction accounting for competing risks.

    PubMed

    Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko

    2014-03-01

Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affects the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating a cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
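A cumulative incidence function that treats death as a competing event (rather than censoring it, as 1 minus KM does) can be sketched with an Aalen-Johansen-type estimator. The follow-up data below are hypothetical, and Gray's test itself is more involved and omitted:

```python
def cumulative_incidence(times, causes):
    """Nonparametric cumulative incidence of cause-1 events when cause-2
    events (e.g. death without stent dysfunction) compete.

    causes: 0 = censored, 1 = event of interest, 2 = competing event.
    Returns [(t, CIF_1(t)), ...] at each distinct observed time.
    """
    n = len(times)
    surv = 1.0          # overall event-free survival S(t-)
    cif = 0.0
    out = []
    for t in sorted(set(times)):
        d1 = sum(1 for ti, c in zip(times, causes) if ti == t and c == 1)
        d2 = sum(1 for ti, c in zip(times, causes) if ti == t and c == 2)
        lost = sum(1 for ti, c in zip(times, causes) if ti == t and c == 0)
        if d1:
            cif += surv * d1 / n        # increment weighted by S(t-)
        if d1 + d2:
            surv *= 1.0 - (d1 + d2) / n
        out.append((t, cif))
        n -= d1 + d2 + lost
    return out

# Hypothetical stent follow-up: months, and what ended each follow-up
months = [1, 2, 3, 4, 5, 6]
causes = [1, 2, 1, 2, 0, 1]
cif_curve = cumulative_incidence(months, causes)
```

Because the increment is weighted by overall event-free survival, the CIF cannot exceed the true event probability, whereas 1 minus KM (with deaths censored) drifts upward.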

  6. Cumulant generating function formula of heat transfer in ballistic systems with lead-lead coupling

    NASA Astrophysics Data System (ADS)

    Li, Huanan; Agarwalla, Bijay Kumar; Wang, Jian-Sheng

    2012-10-01

    Based on a two-time observation protocol, we consider heat transfer in a given time interval tM in a lead-junction-lead system taking coupling between the leads into account. In view of the two-time observation, consistency conditions are carefully verified in our specific family of quantum histories. Furthermore, its implication is briefly explored. Then using the nonequilibrium Green's function method, we obtain an exact formula for the cumulant generating function for heat transfer between the two leads, valid in both transient and steady-state regimes. Also, a compact formula for the cumulant generating function in the long-time limit is derived, for which the Gallavotti-Cohen fluctuation symmetry is explicitly verified. In addition, we briefly discuss Di Ventra's repartitioning trick regarding whether the repartitioning procedure of the total Hamiltonian affects the nonequilibrium steady-state current fluctuation. All kinds of properties of nonequilibrium current fluctuations, such as the fluctuation theorem in different time regimes, could be readily given according to these exact formulas.
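The cumulant generating function itself is generic: K(λ) = ln⟨e^{λQ}⟩, with cumulants as derivatives at λ = 0. A toy numeric check on a hypothetical discrete heat distribution (not the paper's lead-lead formula):

```python
import math

def cgf(dist, lam):
    """Cumulant generating function K(lam) = ln sum_q p(q) * e^(lam*q)."""
    return math.log(sum(p * math.exp(lam * q) for q, p in dist.items()))

def first_two_cumulants(dist, h=1e-3):
    """K'(0) and K''(0) by central finite differences: mean and variance."""
    c1 = (cgf(dist, h) - cgf(dist, -h)) / (2 * h)
    c2 = (cgf(dist, h) - 2 * cgf(dist, 0.0) + cgf(dist, -h)) / h ** 2
    return c1, c2

# Toy distribution of transferred heat quanta q with probabilities p
dist = {-1.0: 0.2, 0.0: 0.5, 2.0: 0.3}
c1, c2 = first_two_cumulants(dist)
mean = sum(p * q for q, p in dist.items())
var = sum(p * (q - mean) ** 2 for q, p in dist.items())
```

The finite-difference cumulants reproduce the directly computed mean and variance, which is the basic property the exact CGF formulas in the paper exploit for current fluctuations.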

  7. Constraints on rapidity-dependent initial conditions from charged-particle pseudorapidity densities and two-particle correlations

    NASA Astrophysics Data System (ADS)

    Ke, Weiyao; Moreland, J. Scott; Bernhard, Jonah E.; Bass, Steffen A.

    2017-10-01

    We study the initial three-dimensional spatial configuration of the quark-gluon plasma (QGP) produced in relativistic heavy-ion collisions using centrality and pseudorapidity-dependent measurements of the medium's charged particle density and two-particle correlations. A cumulant-generating function is first used to parametrize the rapidity dependence of local entropy deposition and extend arbitrary boost-invariant initial conditions to nonzero beam rapidities. The model is then compared to p +Pb and Pb + Pb charged-particle pseudorapidity densities and two-particle pseudorapidity correlations and systematically optimized using Bayesian parameter estimation to extract high-probability initial condition parameters. The optimized initial conditions are then compared to a number of experimental observables including the pseudorapidity-dependent anisotropic flows, event-plane decorrelations, and flow correlations. We find that the form of the initial local longitudinal entropy profile is well constrained by these experimental measurements.

  8. Singularity spectrum of intermittent seismic tremor at Kilauea Volcano, Hawaii

    USGS Publications Warehouse

    Shaw, H.R.; Chouet, B.

    1989-01-01

Fractal singularity analysis (FSA) is used to study a 22-yr record of deep seismic tremor (30-60 km depth) for regions below Kilauea Volcano on the assumption that magma transport and fracture can be treated as a system of coupled nonlinear oscillators. Tremor episodes range from 1 to 100 min (cumulative duration = 1.60 × 10^4 min; yearly average = 727 min yr^-1; mean gradient = 24.2 min yr^-1 km^-1). Partitioning of probabilities, P_i, in the phase space of normalized durations, x_i, is expressed in terms of a function f(α), where α is a variable exponent of a length scale, l. Plots of f(α) vs. α are called multifractal singularity spectra. The spectrum for deep tremor durations is bounded by α values of about 0.4 and 1.9 at f = 0; f_max ≈ 1.0 for α ≈ 1. Results for tremor are similar to those found for systems transitional between complete mode locking and chaos. -Authors

  9. Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code

    NASA Technical Reports Server (NTRS)

    Lemonds, Jeffrey; Kumar, Virendra

    1995-01-01

An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine has been chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed in the level, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative probability density functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to various random variables is also performed.

  10. On buffer overflow duration in a finite-capacity queueing system with multiple vacation policy

    NASA Astrophysics Data System (ADS)

    Kempa, Wojciech M.

    2017-12-01

A finite-buffer queueing system with Poisson arrivals and generally distributed processing times, operating under a multiple vacation policy, is considered. Each time the system becomes empty, the service station takes successive independent and identically distributed vacation periods until, at the completion epoch of one of them, at least one job waiting for service is detected in the buffer. Applying an analytical approach based on the idea of an embedded Markov chain, integral equations and linear algebra, a compact-form representation for the cumulative distribution function (CDF for short) of the first buffer overflow duration is found. Hence, the formula for the CDF of subsequent such periods is obtained. Moreover, probability distributions of the number of job losses in successive buffer overflow periods are found. The considered queueing system can be efficiently applied in modelling energy saving mechanisms in wireless network communication.

  11. User Guide to the Aircraft Cumulative Probability Chart Template

    DTIC Science & Technology

    2009-07-01

Defence Science and Technology Organisation / AeroStructures Technologies, DSTO-TR-2332. ABSTRACT: To ensure aircraft structural integrity is maintained to an acceptable level...cracking (or failure) which may be used to assess the life of aircraft structures. Approved for public release. DSTO, 506 Lorimer St, Fishermans Bend, Victoria 3207, Australia.

  12. 12 CFR Appendix A to Subpart A of... - Appendix A to Subpart A of Part 327

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... one year; • Minimum and maximum downgrade probability cutoff values, based on data from June 30, 2008... rate factor (Ai,T) is calculated by subtracting 0.4 from the four-year cumulative gross asset growth... weighted average of five component ratings excluding the “S” component. Delinquency and non-accrual data on...

  13. Understanding characteristics in multivariate traffic flow time series from complex network structure

    NASA Astrophysics Data System (ADS)

    Yan, Ying; Zhang, Shen; Tang, Jinjun; Wang, Xiaofei

    2017-07-01

Discovering dynamic characteristics in traffic flow is a significant step in designing effective traffic management and control strategies for relieving congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway during a year. In order to construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and Principal Component Analysis is implemented to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, normalized network structure entropy and cumulative probability of degree, are utilized to explore the hourly variation in traffic flow. The results demonstrate that these two statistical quantities express patterns similar to those of the traffic flow parameters, with morning and evening peak hours. Accordingly, we detect three traffic states, trough, peak and transitional hours, according to the correlation between the two aforementioned properties. The resulting state classification represents the hourly fluctuation in traffic flow, as verified by analyzing annual average hourly values of traffic volume, occupancy and speed in the corresponding hours.
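The two network statistics named above can be illustrated with one common definition of normalized network structure entropy (assuming the degree-based form E = -Σ p_i ln p_i / ln N; the paper may use a different normalization):

```python
import math

def normalized_structure_entropy(degrees):
    """E = -sum_i p_i ln p_i / ln N, with p_i = k_i / sum_j k_j.
    E = 1 for a regular network; lower E means more heterogeneous degrees."""
    total = sum(degrees)
    h = -sum((k / total) * math.log(k / total) for k in degrees if k)
    return h / math.log(len(degrees))

def cumulative_degree_probability(degrees):
    """P(K >= k) for each distinct degree k."""
    n = len(degrees)
    return {k: sum(1 for d in degrees if d >= k) / n
            for k in sorted(set(degrees))}

regular = [4, 4, 4, 4, 4, 4]   # homogeneous connectivity
star = [5, 1, 1, 1, 1, 1]      # hub-dominated, heterogeneous
e_regular = normalized_structure_entropy(regular)
e_star = normalized_structure_entropy(star)
ccdf_star = cumulative_degree_probability(star)
```

Comparing E and the cumulative degree probability across hourly networks is what lets the paper separate trough, peak, and transitional traffic states.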

  14. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions called VICx, PDMx, and TOPMODELx also are extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
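The SCS-CN runoff curve referenced above has a standard closed form; a sketch in customary US units, with an illustrative curve number:

```python
def scs_cn_runoff(p_inches, cn, ia_ratio=0.2):
    """SCS-CN direct runoff Q (inches) for storm rainfall P and curve number CN.

    S  = 1000/CN - 10                 potential maximum retention (inches)
    Ia = ia_ratio * S                 initial abstraction (commonly 0.2*S)
    Q  = (P - Ia)^2 / (P - Ia + S)    for P > Ia, else 0
    """
    s = 1000.0 / cn - 10.0
    ia = ia_ratio * s
    if p_inches <= ia:
        return 0.0
    return (p_inches - ia) ** 2 / (p_inches - ia + s)

q_wet = scs_cn_runoff(4.0, 75)   # 4-inch storm on CN 75: Q = 5/3 inches
q_dry = scs_cn_runoff(0.5, 75)   # below the initial abstraction: Q = 0
```

The antecedent wetness condition enters through CN (wetter conditions raise CN, shrinking S), which is the event-based behavior the probabilistic storage framework generalizes.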

  15. Relaxation Processes and Time Scale Transformation.

    DTIC Science & Technology

    1982-03-01

the response function may be immediately recognized as being of the Kubo-Green type in the classical regime. Given this general framework, it is now...discussions of the master equation, and has recently been applied in cumulative damage models with discrete time parameter. However, it does not seem to...development parameter is taken to be a positive, cumulative function that increases from an origin monotonically. Consider two continuous time scales e and t

  16. Landsat D Thematic Mapper image dimensionality reduction and geometric correction accuracy

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1986-01-01

    To characterize and quantify the performance of the Landsat thematic mapper (TM), techniques for dimensionality reduction by linear transformation have been studied and evaluated and the accuracy of the correction of geometric errors in TM images analyzed. Theoretical evaluations and comparisons for existing methods for the design of linear transformation for dimensionality reduction are presented. These methods include the discrete Karhunen Loeve (KL) expansion, Multiple Discriminant Analysis (MDA), Thematic Mapper (TM)-Tasseled Cap Linear Transformation and Singular Value Decomposition (SVD). A unified approach to these design problems is presented in which each method involves optimizing an objective function with respect to the linear transformation matrix. From these studies, four modified methods are proposed. They are referred to as the Space Variant Linear Transformation, the KL Transform-MDA hybrid method, and the First and Second Version of the Weighted MDA method. The modifications involve the assignment of weights to classes to achieve improvements in the class conditional probability of error for classes with high weights. Experimental evaluations of the existing and proposed methods have been performed using the six reflective bands of the TM data. It is shown that in terms of probability of classification error and the percentage of the cumulative eigenvalues, the six reflective bands of the TM data require only a three dimensional feature space. It is shown experimentally as well that for the proposed methods, the classes with high weights have improvements in class conditional probability of error estimates as expected.
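The "percentage of the cumulative eigenvalues" criterion used above for choosing a reduced dimensionality can be sketched as follows; the eigenvalues below are hypothetical, not those of the TM data:

```python
def cumulative_eigenvalue_fraction(eigenvalues, k):
    """Fraction of total variance carried by the k largest eigenvalues
    of the (estimated) band covariance matrix."""
    vals = sorted(eigenvalues, reverse=True)
    return sum(vals[:k]) / sum(vals)

# Hypothetical covariance eigenvalues for six reflective bands
eigs = [5.2, 1.1, 0.4, 0.15, 0.1, 0.05]
frac3 = cumulative_eigenvalue_fraction(eigs, 3)   # about 0.96 here
```

When the top three eigenvalues carry nearly all the variance, as in this toy case, a three-dimensional feature space suffices, matching the paper's conclusion for the six reflective TM bands.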

  17. [Theoretical evaluation of the risk of decompression illness during simulated extravehicular activity].

    PubMed

    Nikolaev, V P

    2008-01-01

Theoretical analysis of the risk of decompression illness (DI) during extravehicular activity following the Russian and NASA decompression protocols (D-R and D-US, respectively) was performed. In contrast to the traditional approach of evaluating decompression stress by the degree of tissue supersaturation with nitrogen, our probabilistic theory of decompression safety provides a fully reasoned evaluation and comparison of the levels of hazard of these decompression protocols. According to this theory, the function of cumulative DI risk is equal to the sum of the functions of cumulative risk of lesion of all body tissues by gas bubbles and their supersaturation by dissolved gases. Based on modeling of the dynamics of these functions, growth of the cumulative DI risk in the course of D-R and D-US follows essentially similar trajectories within a time frame of up to 330 minutes. However, further extension of D-US, but not D-R, raises the risk of DI drastically.
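The paper's actual risk functions are not reproduced here; as a generic illustration of how independent per-tissue contributions accumulate, constant tissue hazards h_i combine into a cumulative risk R(t) = 1 - exp(-Σ_i h_i t). The hazard values below are hypothetical:

```python
import math

def cumulative_di_risk(tissue_hazards, minutes):
    """Overall risk assuming independent tissues with constant per-minute
    hazards h_i: R(t) = 1 - exp(-sum_i h_i * t)."""
    return 1.0 - math.exp(-sum(tissue_hazards) * minutes)

hazards = [1e-5, 2e-5, 5e-6]               # hypothetical per-minute hazards
r_330 = cumulative_di_risk(hazards, 330)   # risk at the 330-minute mark
r_660 = cumulative_di_risk(hazards, 660)   # risk grows monotonically with time
```

Summing hazards (equivalently, multiplying per-tissue survival probabilities) is what makes the total risk curve monotone in exposure time, consistent with the divergence between D-US and D-R appearing only beyond 330 minutes.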

  18. Cumulative incidence and prevalence of childhood autism in children in Japan.

    PubMed

    Honda, H; Shimizu, Y; Misumi, K; Niimi, M; Ohashi, Y

    1996-08-01

An epidemiological survey of childhood autism as defined in ICD-10 Research Criteria was conducted in the northern part of Yokohama, Japan. The routine health checkup for 18-month-old children served as the initial mass-screening, and all facilities which provide child care services function to detect all cases of childhood autism and refer them to the Yokohama Rehabilitation Centre. Cumulative incidence of childhood autism up to 5 years of age among the birth cohort of 1988, and prevalence on 1 January 1994, among residents born in 1988 were estimated. Cumulative incidence and prevalence were 16.2 per 10,000 and 21.1 per 10,000, respectively. Children with high-functioning autism who had IQs of 70 and over constituted approximately half of all the children with childhood autism. CONCLUSION: It was confirmed through better detection of high-functioning cases that childhood autism in Japan is more common than formerly estimated.

  19. Long-term strength and damage accumulation in laminates

    NASA Astrophysics Data System (ADS)

    Dzenis, Yuris A.; Joshi, Shiv P.

    1993-04-01

A modified version of the probabilistic model developed by the authors for damage evolution analysis of laminates subjected to random loading is utilized to predict the long-term strength of laminates. The model assumes that each ply in a laminate consists of a large number of mesovolumes. Probabilistic variation functions for mesovolume stiffnesses as well as strengths are used in the analysis. Stochastic strains are calculated using lamination theory and random function theory. Deterioration of ply stiffnesses is calculated on the basis of the probabilities of mesovolume failures using the theory of excursions of a random process beyond limits. Long-term strength and damage accumulation in a Kevlar/epoxy laminate under tension and complex in-plane loading are investigated. Effects of the mean level and stochastic deviation of loading on damage evolution and time-to-failure of the laminate are discussed. Long-term cumulative damage at the time of final failure is greater at low loading levels than at high loading levels. The effect of the deviation in loading is more pronounced at lower mean loading levels.

  20. Selectivity Mechanism of the Nuclear Pore Complex Characterized by Single Cargo Tracking

    PubMed Central

    Lowe, Alan R.; Siegel, Jake J.; Kalab, Petr; Siu, Merek; Weis, Karsten; Liphardt, Jan T.

    2010-01-01

The Nuclear Pore Complex (NPC) mediates all exchange between the cytoplasm and the nucleus. Small molecules can passively diffuse through the NPC, while larger cargos require transport receptors to translocate. How the NPC facilitates the translocation of transport receptor/cargo complexes remains unclear. Here, we track single protein-functionalized Quantum Dot (QD) cargos as they translocate through the NPC. Import proceeds by successive sub-steps comprising cargo capture, filtering and translocation, and release into the nucleus. The majority of QDs are rejected at one of these steps and return to the cytoplasm, including very large cargos that abort at a size-selective barrier. Cargo movement in the central channel is subdiffusive, and cargos that can bind more transport receptors diffuse more freely. Without Ran, cargos still explore the entire NPC, but have a markedly reduced probability of exit into the nucleus, suggesting that NPC entry and exit steps are not equivalent and that the pore is functionally asymmetric to importing cargos. The overall selectivity of the NPC appears to arise from the cumulative action of multiple reversible sub-steps and a final irreversible exit step. PMID:20811366

  1. Do Holocaust survivors show increased vulnerability or resilience to post-Holocaust cumulative adversity?

    PubMed

    Shrira, Amit; Palgi, Yuval; Ben-Ezra, Menachem; Shmotkin, Dov

    2010-06-01

    Prior trauma can hinder coping with additional adversity or inoculate against the effect of recurrent adversity. The present study further addressed this issue by examining whether a subsample of Holocaust survivors and comparison groups, drawn from the Israeli component of the Survey of Health, Ageing, and Retirement in Europe, were differentially affected by post-Holocaust cumulative adversity. Post-Holocaust cumulative adversity had a stronger effect on the lifetime depression of Holocaust survivors than on that of comparisons. However, comparisons were more negatively affected by post-Holocaust cumulative adversity when examining markers of physical and cognitive functioning. Our findings suggest that previous trauma can both sensitize and immunize, as Holocaust survivors show general resilience intertwined with specific vulnerability when confronted with additional cumulative adversity.

  2. Defensive functioning of homeless youth in relation to experiences of child maltreatment and cumulative victimization.

    PubMed

    Mounier, Carrie; Andujo, Estela

    2003-10-01

    To determine the relationship between use of defense mechanisms and experiences of child maltreatment and cumulative victimization among homeless youth. Twenty-five homeless youth were individually interviewed regarding their victimization experiences and coping strategies. Use of defense mechanisms was assessed using the Defense Mechanism Rating Scale. Relationships were demonstrated between use of defenses and specific as well as cumulative victimization experiences. All levels of defenses became more pervasive in response to victimization, but this was not a predictor of overall immature defensive functioning. Clinical and program interventions to engage homeless youth need to incorporate an understanding of the relationship between defenses and victimization in order to be effective in maximizing upon the strengths of this population.

  3. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit.
Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.
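The classical Mahalanobis distance (chi-square statistic) that the new metric extends can be illustrated in the Gaussian 2-D case; the paper's generalized metric for the "banana"-shaped PDFs is not reproduced here:

```python
def mahalanobis_sq_2d(x, mean, cov):
    """Squared Mahalanobis distance (chi-square statistic with 2 dof):
    d^2 = (x - mu)^T C^{-1} (x - mu), with the 2x2 inverse written out."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # C^{-1} = (1/det) * [[d, -b], [-c, a]]
    return (dx * (d * dx - b * dy) + dy * (a * dy - c * dx)) / det

# Report at (2, 1) against a track with mean (0, 0) and diagonal covariance
d2 = mahalanobis_sq_2d((2.0, 1.0), (0.0, 0.0), ((4.0, 0.0), (0.0, 1.0)))
# For 2 degrees of freedom, a common ~95% association gate is d^2 < 5.99
```

A report falling inside the gate is a candidate for association with the track, which is the single-frame data association role the extended metric plays for the non-Gaussian orbital PDFs.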

  4. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...
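
The RPF normalization can be sketched as scaling each chemical's dose by its potency relative to an index chemical and summing the result; the factors and doses below are hypothetical, not values from the assessment:

```python
def index_equivalent_dose(doses, rpfs):
    """Combined dose expressed in index-chemical equivalents:
    D = sum_i RPF_i * dose_i (the index chemical has RPF = 1)."""
    return sum(rpfs[chem] * d for chem, d in doses.items())

# Hypothetical relative potency factors and doses (illustration only).
rpfs = {"index": 1.0, "chem_a": 0.1, "chem_b": 5.0}
doses = {"index": 2.0, "chem_a": 10.0, "chem_b": 0.4}
total = index_equivalent_dose(doses, rpfs)  # 2.0 + 1.0 + 2.0 = 5.0
assert total == 5.0
```
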

  5. The Neural Basis of Risky Choice with Affective Outcomes

    PubMed Central

    Suter, Renata S.; Pachur, Thorsten; Hertwig, Ralph; Endestad, Tor; Biele, Guido

    2015-01-01

    Both normative and many descriptive theories of decision making under risk are based on the notion that outcomes are weighted by their probability, with subsequent maximization of the (subjective) expected outcome. Numerous investigations from psychology, economics, and neuroscience have produced evidence consistent with this notion. However, this research has typically investigated choices involving relatively affect-poor, monetary outcomes. We compared choice in relatively affect-poor, monetary lottery problems with choice in relatively affect-rich medical decision problems. Computational modeling of behavioral data and model-based neuroimaging analyses provide converging evidence for substantial differences in the respective decision mechanisms. Relative to affect-poor choices, affect-rich choices yielded a more strongly curved probability weighting function of cumulative prospect theory, thus signaling that the psychological impact of probabilities is strongly diminished for affect-rich outcomes. Examining task-dependent brain activation, we identified a region-by-condition interaction indicating qualitative differences of activation between affect-rich and affect-poor choices. Moreover, brain activation in regions that were more active during affect-poor choices (e.g., the supramarginal gyrus) correlated with individual trial-by-trial decision weights, indicating that these regions reflect processing of probabilities. Formal reverse inference Neurosynth meta-analyses suggested that whereas affect-poor choices seem to be based on brain mechanisms for calculative processes, affect-rich choices are driven by the representation of outcomes’ emotional value and autobiographical memories associated with them. 
These results provide evidence that the traditional notion of expectation maximization may not apply in the context of outcomes laden with affective responses, and that understanding the brain mechanisms of decision making requires the domain of the decision to be taken into account. PMID:25830918

  6. The neural basis of risky choice with affective outcomes.

    PubMed

    Suter, Renata S; Pachur, Thorsten; Hertwig, Ralph; Endestad, Tor; Biele, Guido

    2015-01-01

    Both normative and many descriptive theories of decision making under risk are based on the notion that outcomes are weighted by their probability, with subsequent maximization of the (subjective) expected outcome. Numerous investigations from psychology, economics, and neuroscience have produced evidence consistent with this notion. However, this research has typically investigated choices involving relatively affect-poor, monetary outcomes. We compared choice in relatively affect-poor, monetary lottery problems with choice in relatively affect-rich medical decision problems. Computational modeling of behavioral data and model-based neuroimaging analyses provide converging evidence for substantial differences in the respective decision mechanisms. Relative to affect-poor choices, affect-rich choices yielded a more strongly curved probability weighting function of cumulative prospect theory, thus signaling that the psychological impact of probabilities is strongly diminished for affect-rich outcomes. Examining task-dependent brain activation, we identified a region-by-condition interaction indicating qualitative differences of activation between affect-rich and affect-poor choices. Moreover, brain activation in regions that were more active during affect-poor choices (e.g., the supramarginal gyrus) correlated with individual trial-by-trial decision weights, indicating that these regions reflect processing of probabilities. Formal reverse inference Neurosynth meta-analyses suggested that whereas affect-poor choices seem to be based on brain mechanisms for calculative processes, affect-rich choices are driven by the representation of outcomes' emotional value and autobiographical memories associated with them. 
These results provide evidence that the traditional notion of expectation maximization may not apply in the context of outcomes laden with affective responses, and that understanding the brain mechanisms of decision making requires the domain of the decision to be taken into account.
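
The probability weighting function of cumulative prospect theory referred to above is commonly written in the one-parameter Tversky-Kahneman (1992) form; the sketch below uses that standard parameterization as an assumption, not necessarily the exact form fitted in the study:

```python
def tk_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function:
    w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma)."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

# Smaller gamma -> more strongly curved weighting, i.e. reduced sensitivity
# to probabilities, as reported for affect-rich choices.
assert tk_weight(0.5, 1.0) == 0.5   # gamma = 1 recovers w(p) = p
assert tk_weight(0.1, 0.5) > 0.1    # small probabilities overweighted
assert tk_weight(0.9, 0.5) < 0.9    # large probabilities underweighted
```
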

  7. Relationship between virological response and FIB-4 index in chronic hepatitis B patients with entecavir therapy

    PubMed Central

    Li, Ni; Xu, Jing-Hang; Yu, Min; Wang, Sa; Si, Chong-Wen; Yu, Yan-Yan

    2015-01-01

    AIM: To investigate whether long-term low-level hepatitis B virus (HBV) DNA influences dynamic changes of the FIB-4 index in chronic hepatitis B (CHB) patients receiving entecavir (ETV) therapy with partial virological responses. METHODS: We retrospectively analyzed 231 nucleos(t)ide (NA) naïve CHB patients from our previous study (NCT01926288) who received continuous ETV or ETV maleate therapy for three years. The patients were divided into partial virological response (PVR) and complete virological response (CVR) groups according to serum HBV DNA levels at week 48. Seventy-six patients underwent biopsies at baseline and at 48 wk. The performance of the FIB-4 index and area under the receiver operating characteristic (AUROC) curve for predicting fibrosis were determined for the patients undergoing biopsy. The primary objective of the study was to compare the cumulative probabilities of virological responses between the two groups during the treatment period. The secondary outcome was to observe dynamic changes of the FIB-4 index between CVR patients and PVR patients. RESULTS: For hepatitis B e antigen (HBeAg)-positive patients (n = 178), the cumulative probability of achieving undetectable levels at week 144 was 95% and 69% for CVR and PVR patients, respectively (P < 0.001). In the Cox proportional hazards model, a lower pretreatment serum HBV DNA level was an independent factor predicting maintained viral suppression. The cumulative probability of achieving undetectable levels of HBV DNA for HBeAg-negative patients (n = 53) did not differ between the two groups. The FIB-4 index efficiently identified fibrosis, with an AUROC of 0.80 (95%CI: 0.69-0.89). For HBeAg-positive patients, the FIB-4 index was higher in CVR patients than in PVR patients at baseline (1.89 ± 1.43 vs 1.18 ± 0.69, P < 0.001). There was no significant difference in the reduction of the FIB-4 index between the CVR and PVR groups from weeks 48 to 144 (-0.11 ± 0.47 vs -0.13 ± 0.49, P = 0.71). 
At week 144, the FIB-4 index levels were similar between the two groups (1.24 ± 0.87 vs 1.02 ± 0.73, P = 0.06). After multivariate logistic regression analysis, a lower baseline serum HBV DNA level was associated with improvement of liver fibrosis. In HBeAg-negative patients, the FIB-4 index did not differ between the two groups. CONCLUSION: The cumulative probabilities of HBV DNA responses showed significant differences between CVR and PVR HBeAg-positive CHB patients undergoing entecavir treatment for 144 wk. However, long-term low-level HBV DNA did not deteriorate the FIB-4 index, which was used to evaluate liver fibrosis, at the end of three years. PMID:26604649
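
For reference, the FIB-4 index used in this study is computed from age, AST, ALT, and platelet count via the Sterling et al. formula; the input values in the sketch below are illustrative, not data from the study:

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 = (age [yr] * AST [U/L]) / (platelets [10^9/L] * sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Illustrative values only.
score = fib4(age_years=50, ast_u_l=40, alt_u_l=36, platelets_10e9_l=200)
assert abs(score - (50 * 40) / (200 * 6.0)) < 1e-12  # sqrt(36) = 6
```
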

  8. Associations Between Geriatric Syndromes and Mortality in Community-Dwelling Elderly: Results of a National Longitudinal Study in Taiwan.

    PubMed

    Huang, Chi-Chang; Lee, Jenq-Daw; Yang, Deng-Chi; Shih, Hsin-I; Sun, Chien-Yao; Chang, Chia-Ming

    2017-03-01

    Although geriatric syndromes have been studied extensively, their interactions with one another and their accumulated effects on life expectancy are less frequently discussed. This study examined whether geriatric syndromes and their cumulative effects are associated with risks of mortality in community-dwelling older adults. Data were collected from the Taiwan Longitudinal Study in Aging in 2003, and the participant survival status was followed until December 31, 2007. A total of 2744 participants aged ≥65 years were included in this retrospective cohort study; 634 died during follow-up. Demographic factors, comorbidities, health behaviors, and geriatric syndromes, including underweight, falls, functional impairment, depressive condition, and cognitive impairment, were assessed. Cox proportional hazard regression analysis was used to estimate the hazard ratios (HRs) and 95% confidence intervals (CIs) for the probability of survival according to the cumulative number of geriatric syndromes. The prevalence of geriatric syndromes increased with age. Mortality was significantly associated with age ≥75 years; male sex; ≤6 years of education; history of stroke, malignancy; smoking; not drinking alcohol; and not exercising regularly. Geriatric syndromes, such as underweight, functional disability, and depressive condition, contributed to the risk of mortality. The accumulative model of geriatric syndromes also predicted higher risks of mortality (N = 1, HR 1.50, 95% CI 1.19-1.89; N = 2, HR 1.69, 95% CI 1.25-2.29; N ≥ 3, HR 2.43, 95% CI 1.62-3.66). Community-dwelling older adults who were male, illiterate, receiving institutional care, underweight, experiencing a depressive condition, functionally impaired, and engaging in poor health behavior were more likely to have a higher risk of mortality. The identification of geriatric syndromes might help to improve comprehensive care for community-dwelling older adults. 
Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  9. Cognitive neuroscience 2.0: building a cumulative science of human brain function

    PubMed Central

    Yarkoni, Tal; Poldrack, Russell A.; Van Essen, David C.; Wager, Tor D.

    2010-01-01

    Cognitive neuroscientists increasingly recognize that continued progress in understanding human brain function will require not only the acquisition of new data, but also the synthesis and integration of data across studies and laboratories. Here we review ongoing efforts to develop a more cumulative science of human brain function. We discuss the rationale for an increased focus on formal synthesis of the cognitive neuroscience literature, provide an overview of recently developed tools and platforms designed to facilitate the sharing and integration of neuroimaging data, and conclude with a discussion of several emerging developments that hold even greater promise in advancing the study of human brain function. PMID:20884276

  10. Longitudinal lung function decline and wood dust exposure in the furniture industry.

    PubMed

    Jacobsen, G; Schlünssen, V; Schaumburg, I; Taudorf, E; Sigsgaard, T

    2008-02-01

    The aim of the present study was to investigate the relationship between change in lung function and cumulative exposure to wood dust. In total, 1,112 woodworkers (927 males, 185 females) and 235 reference workers (104 males, 185 females) participated in a 6-yr longitudinal study. Forced expiratory volume in one second (FEV1), forced vital capacity (FVC), height and weight were measured, and questionnaire data on respiratory symptoms, wood dust exposure and smoking habits were collected. Cumulative inhalable wood dust exposure was assessed using a study-specific job exposure matrix and exposure time. The median (range) cumulative wood dust exposure was 3.75 (0-7.55) mg·yr·m⁻³. A dose-response relationship between cumulative wood dust exposure and percent annual decrease in FEV1 was suggested for female workers. This was confirmed in a linear regression model adjusted for confounders, including smoking, height and age. An additional difference of -14.50 mL·yr⁻¹ and -27.97 mL·yr⁻¹ was revealed for females exposed to 3.75-4.71 mg·yr·m⁻³ or to >4.71 mg·yr·m⁻³, respectively, compared with non-/low-exposed females. For females, a positive trend between wood dust exposure and the cumulative incidence proportion of FEV1/FVC <70% was suggested. In conclusion, in the present low-exposed cohort, female woodworkers had an accelerated decline in lung function, which may be clinically relevant.

  11. Clinical research on liver reserve function by 13C-phenylalanine breath test in aged patients with chronic liver diseases

    PubMed Central

    2010-01-01

    Background The objective of this study was to investigate whether the 13C-phenylalanine breath test could be useful for the evaluation of hepatic function in elderly volunteers and patients with chronic hepatitis B and liver cirrhosis. Methods L-[1-13C] phenylalanine was administered orally at a dose of 100 mg to 55 elderly patients with liver cirrhosis, 30 patients with chronic hepatitis B and 38 elderly healthy subjects. The breath test was performed at 8 different time points (0, 10, 20, 30, 45, 60, 90, 120 min) to obtain the values of Delta over baseline, percentage 13CO2 exhalation rate and cumulative excretion (Cum). The relationships of the cumulative excretion with the 13C-%dose/h and blood biochemical parameters were investigated. Results The 13C-%dose/h at 20 min and 30 min combined with the cumulative excretion at 60 min and 120 min correlated with hepatic function tests, serum albumin, hemoglobin, platelet and Child-Pugh score. Prothrombin time, total and direct bilirubin were significantly increased, while serum albumin, hemoglobin and platelet, the cumulative excretion at 60 min and 120 min values decreased by degrees of intensity of the disease in Child-Pugh A, B, and C patients (P < 0.01). Conclusions The 13C-phenylalanine breath test can be used as a non-invasive assay to evaluate hepatic function in elderly patients with liver cirrhosis. The 13C-%dose/h at 20 min, at 30 min and cumulative excretion at 60 min may be the key value for determination at a single time-point. 13C-phenylalanine breath test is safe and helpful in distinguishing different stages of hepatic dysfunction for elderly cirrhosis patients. PMID:20459849
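
The cumulative excretion (Cum) values above are obtained by integrating the instantaneous 13C exhalation rate over time; a minimal trapezoidal-integration sketch using the study's sampling grid but entirely made-up rate values:

```python
def cumulative_excretion(times_min, rates_pct_dose_per_h):
    """Trapezoidal integral of %dose/h over time (in hours) -> cumulative %dose."""
    cum = 0.0
    for i in range(1, len(times_min)):
        dt_h = (times_min[i] - times_min[i - 1]) / 60.0
        cum += 0.5 * (rates_pct_dose_per_h[i] + rates_pct_dose_per_h[i - 1]) * dt_h
    return cum

# The study's 8 time points; the rate values here are purely illustrative.
t = [0, 10, 20, 30, 45, 60, 90, 120]
r = [0.0, 6.0, 8.0, 7.0, 5.0, 4.0, 2.5, 1.5]
cum_120 = cumulative_excretion(t, r)
assert abs(cum_120 - 49.0 / 6.0) < 1e-9
```
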

  12. Origins and implications of the relationship between warming and cumulative carbon emissions

    NASA Astrophysics Data System (ADS)

    Raupach, M. R.; Davis, S. J.; Peters, G. P.; Andrew, R. M.; Canadell, J.; Le Quere, C.

    2014-12-01

    A near-linear relationship between warming (T) and cumulative carbon emissions (Q) is a robust finding from numerous studies. This finding opens biophysical questions concerning (1) its theoretical basis, (2) the treatment of non-CO2 forcings, and (3) uncertainty specifications. Beyond these biophysical issues, a profound global policy question is raised: (4) how can a quota on cumulative emissions be shared? Here, an integrated survey of all four issues is attempted. (1) Proportionality between T and Q is an emergent property of a linear carbon-climate system forced by exponentially increasing CO2 emissions. This idealisation broadly explains past but not future near-proportionality between T and Q: in future, the roles of non-CO2 forcings and carbon-climate nonlinearities become important, and trajectory dependence becomes stronger. (2) The warming effects of short-lived non-CO2 forcers depend on instantaneous rather than cumulative fluxes. However, inertia in emissions trajectories reinstates some of the benefits of a cumulative emissions approach, with residual trajectory dependence comparable to that for CO2. (3) Uncertainties arise from several sources: climate projections, carbon-climate feedbacks, and residual trajectory dependencies in CO2 and other emissions. All of these can in principle be combined into a probability distribution P(T|Q) for the warming T from given cumulative CO2 emissions Q. Present knowledge of P(T|Q) allows quantification of the tradeoff between mitigation ambition and climate risk. (4) Cumulative emissions consistent with a given warming target and climate risk are a finite common resource that will inevitably be shared, creating a tragedy-of-the-commons dilemma. Sharing options range from "inertia" (present distribution of emissions is maintained) to "equity" (cumulative emissions are distributed equally per-capita). 
Both extreme options lead to emissions distributions that are unrealisable in practice, but a blend of the two extremes may be realisable. This perspective provides a means for nations to compare the global consequences of their own proposed emissions quotas if others were to act in a consistent way, a critical step towards achieving consensus.
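
The near-linear T-Q relationship is usually summarized by a single coefficient, the transient climate response to cumulative emissions (TCRE); a minimal sketch using an illustrative mid-range coefficient, not a value taken from this abstract:

```python
def warming_from_cumulative_emissions(q_tt_c, tcre_k_per_tt_c):
    """Linear approximation T = TCRE * Q (T in K, Q in Tt C = 1000 Gt C)."""
    return tcre_k_per_tt_c * q_tt_c

# Illustrative coefficient of 1.75 K per Tt C (an assumed mid-range value,
# not a number from the paper above).
assert warming_from_cumulative_emissions(1.0, 1.75) == 1.75

# Implied total cumulative-emissions quota for a 2 K target:
quota = 2.0 / 1.75  # ~ 1.14 Tt C
assert abs(quota - 1.142857142857143) < 1e-12
```
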

  13. Role of olivine cumulates in destabilizing the flanks of Hawaiian volcanoes

    USGS Publications Warehouse

    Clague, D.A.; Denlinger, R.P.

    1994-01-01

    The south flank of Kilauea Volcano is unstable and has the structure of a huge landslide; it is one of at least 17 enormous catastrophic landslides shed from the Hawaiian Islands. Mechanisms previously proposed for movement of the south flank invoke slip of the volcanic pile over seafloor sediments. Slip on a low friction décollement alone cannot explain why the thickest and widest sector of the flank moves more rapidly than the rest, or why this section contains a 300 km3 aseismic volume above the seismically defined décollement. It is proposed that this aseismic volume, adjacent to the caldera in the direction of flank slip, consists of olivine cumulates that creep outward, pushing the south flank seawards. Average primary Kilauea tholeiitic magma contains about 16.5 wt.% MgO compared with an average 10 wt.% MgO for erupted subaerial and submarine basalts. This difference requires fractionation of 17 wt.% (14 vol.%) olivine phenocrysts that accumulate near the base of the magma reservoir where they form cumulates. Submarine-erupted Kilauea lavas contain abundant deformed olivine xenocrysts derived from these cumulates. Deformed dunite formed during the tholeiitic shield stage is also erupted as xenoliths in subsequent alkalic lavas. The deformation structures in olivine xenocrysts suggest that the cumulus olivine was densely packed, probably with as little as 5-10 vol.% intercumulus liquid, before entrainment of the xenocrysts. The olivine cumulates were at magmatic temperatures (>1100 °C) when the xenocrysts were entrained. Olivine at 1100 °C has a rheology similar to ice, and the olivine cumulates should flow down and away from the summit of the volcano. Flow of the olivine cumulates places constant pressure on the unbuttressed seaward flank, leading to an extensional region that localizes deep intrusions behind the flank; these intrusions add to the seaward push. 
This mechanism ties the source of gravitational instability to the caldera complex and deep rift systems and, therefore, limits catastrophic sector failure of Hawaiian volcanoes to their active growth phase, when the core of olivine cumulates is still hot enough to flow. © 1994 Springer-Verlag.

  14. Metabolomics variable selection and classification in the presence of observations below the detection limit using an extension of ERp.

    PubMed

    van Reenen, Mari; Westerhuis, Johan A; Reinecke, Carolus J; Venter, J Hendrik

    2017-02-02

    ERp is a variable selection and classification method for metabolomics data. ERp uses minimized classification error rates, based on data from a control and experimental group, to test the null hypothesis of no difference between the distributions of variables over the two groups. If the associated p-values are significant they indicate discriminatory variables (i.e. informative metabolites). The p-values are calculated assuming a common continuous strictly increasing cumulative distribution under the null hypothesis. This assumption is violated when zero-valued observations can occur with positive probability, a characteristic of GC-MS metabolomics data, disqualifying ERp in this context. This paper extends ERp to address two sources of zero-valued observations: (i) zeros reflecting the complete absence of a metabolite from a sample (true zeros); and (ii) zeros reflecting a measurement below the detection limit. This is achieved by allowing the null cumulative distribution function to take the form of a mixture between a jump at zero and a continuous strictly increasing function. The extended ERp approach is referred to as XERp. XERp is no longer non-parametric, but its null distributions depend only on one parameter, the true proportion of zeros. Under the null hypothesis this parameter can be estimated by the proportion of zeros in the available data. XERp is shown to perform well with regard to bias and power. To demonstrate the utility of XERp, it is applied to GC-MS data from a metabolomics study on tuberculosis meningitis in infants and children. We find that XERp is able to provide an informative shortlist of discriminatory variables, while attaining satisfactory classification accuracy for new subjects in a leave-one-out cross-validation context. XERp takes into account the distributional structure of data with a probability mass at zero without requiring any knowledge of the detection limit of the metabolomics platform. 
XERp is able to identify variables that discriminate between two groups by simultaneously extracting information from the difference in the proportion of zeros and shifts in the distributions of the non-zero observations. XERp uses simple rules to classify new subjects and a weight pair to adjust for unequal sample sizes or sensitivity and specificity requirements.
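
The null cumulative distribution that XERp assumes, a jump at zero mixed with a continuous strictly increasing component, can be sketched directly, with the zero proportion estimated from the data as the abstract describes; the toy data below are purely illustrative:

```python
def mixture_cdf(x, zero_prop, continuous_cdf):
    """F(x) = pi * 1{x >= 0} + (1 - pi) * G(x): a jump of size pi at zero
    plus a continuous component G, as in the XERp null distribution."""
    jump = zero_prop if x >= 0 else 0.0
    return jump + (1.0 - zero_prop) * continuous_cdf(x)

# Toy data: pi is estimated by the observed proportion of zeros, and the
# empirical CDF of the non-zero values stands in for G (illustration only).
data = [0.0, 0.0, 1.2, 3.4, 0.0, 2.2, 5.0, 0.0]
nonzero = sorted(v for v in data if v > 0)
pi_hat = (len(data) - len(nonzero)) / len(data)  # 0.5

def ecdf(x):
    return sum(v <= x for v in nonzero) / len(nonzero)

assert mixture_cdf(-1.0, pi_hat, ecdf) == 0.0
assert mixture_cdf(0.0, pi_hat, ecdf) == 0.5   # the jump at zero
assert mixture_cdf(5.0, pi_hat, ecdf) == 1.0
```
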

  15. The 1/N Expansion of Tensor Models Beyond Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Gurau, Razvan

    2014-09-01

    We analyze in full mathematical rigor the most general quartically perturbed invariant probability measure for a random tensor. Using a version of the Loop Vertex Expansion (which we call the mixed expansion) we show that the cumulants can be written as explicit series in 1/N plus bounded rest terms. The mixed expansion recasts the problem of determining the subleading corrections in 1/N into a simple combinatorial problem of counting trees decorated by a finite number of loop edges. As an aside, we use the mixed expansion to show that the (divergent) perturbative expansion of the tensor models is Borel summable and to prove that the cumulants satisfy a uniform scaling bound. In particular the quartically perturbed measures fall, in the N → ∞ limit, into the universality class of Gaussian tensor models.

  16. Model for Cumulative Solar Heavy Ion Energy and LET Spectra

    NASA Technical Reports Server (NTRS)

    Xapsos, Mike; Barth, Janet; Stauffer, Craig; Jordan, Tom; Mewaldt, Richard

    2007-01-01

    A probabilistic model of cumulative solar heavy ion energy and linear energy transfer (LET) spectra is developed for spacecraft design applications. Spectra are given as a function of confidence level, mission time period during solar maximum and shielding thickness. It is shown that long-term solar heavy ion fluxes exceed galactic cosmic ray fluxes during solar maximum for shielding levels of interest. Cumulative solar heavy ion fluences should therefore be accounted for in single event effects rate calculations and in the planning of space missions.

  17. Analytical theory of mesoscopic Bose-Einstein condensation in an ideal gas

    NASA Astrophysics Data System (ADS)

    Kocharovsky, Vitaly V.; Kocharovsky, Vladimir V.

    2010-03-01

    We find the universal structure and scaling of the Bose-Einstein condensation (BEC) statistics and thermodynamics (Gibbs free energy, average energy, heat capacity) for a mesoscopic canonical-ensemble ideal gas in a trap with an arbitrary number of atoms, any volume, and any temperature, including the whole critical region. We identify a universal constraint-cutoff mechanism that makes BEC fluctuations strongly non-Gaussian and is responsible for all unusual critical phenomena of the BEC phase transition in the ideal gas. The main result is an analytical solution to the problem of critical phenomena. It is derived by, first, calculating analytically the universal probability distribution of the noncondensate occupation, or a Landau function, and then using it for the analytical calculation of the universal functions for the particular physical quantities via the exact formulas which express the constraint-cutoff mechanism. We find asymptotics of that analytical solution as well as its simple analytical approximations which describe the universal structure of the critical region in terms of the parabolic cylinder or confluent hypergeometric functions. The obtained results for the order parameter, all higher-order moments of BEC fluctuations, and thermodynamic quantities perfectly match the known asymptotics outside the critical region for both low and high temperature limits. We suggest two- and three-level trap models of BEC and find their exact solutions in terms of the cutoff negative binomial distribution (which tends to the cutoff gamma distribution in the continuous limit) and the confluent hypergeometric distribution, respectively. Also, we present an exactly solvable cutoff Gaussian model of BEC in a degenerate interacting gas. All these exact solutions confirm the universality and constraint-cutoff origin of the strongly non-Gaussian BEC statistics. 
We introduce a regular refinement scheme for the condensate statistics approximations on the basis of the infrared universality of higher-order cumulants and the method of superposition and show how to model BEC statistics in the actual traps. In particular, we find that the three-level trap model with matching the first four or five cumulants is enough to yield remarkably accurate results for all interesting quantities in the whole critical region. We derive an exact multinomial expansion for the noncondensate occupation probability distribution and find its high-temperature asymptotics (Poisson distribution) and corrections to it. Finally, we demonstrate that the critical exponents and a few known terms of the Taylor expansion of the universal functions, which were calculated previously from fitting the finite-size simulations within the phenomenological renormalization-group theory, can be easily obtained from the presented full analytical solutions for the mesoscopic BEC as certain approximations in the close vicinity of the critical point.

  18. Early Course of Inflammatory Bowel Disease in a Population-Based Inception Cohort Study From 8 Countries in Asia and Australia.

    PubMed

    Ng, Siew C; Zeng, Zhirong; Niewiadomski, Ola; Tang, Whitney; Bell, Sally; Kamm, Michael A; Hu, Pinjin; de Silva, H Janaka; Niriella, Madunil A; Udara, W S A A Yasith; Ong, David; Ling, Khoon Lin; Ooi, Choon Jin; Hilmi, Ida; Lee Goh, Khean; Ouyang, Qin; Wang, Yu Fang; Wu, Kaichun; Wang, Xin; Pisespongsa, Pises; Manatsathit, Sathaporn; Aniwan, Satimai; Limsrivilai, Julajak; Gunawan, Jeffri; Simadibrata, Marcellus; Abdullah, Murdani; Tsang, Steve W C; Lo, Fu Hang; Hui, Aric J; Chow, Chung Mo; Yu, Hon Ho; Li, Mo Fong; Ng, Ka Kei; Ching, Jessica Y L; Chan, Victor; Wu, Justin C Y; Chan, Francis K L; Chen, Minhu; Sung, Joseph J Y

    2016-01-01

    The incidence of inflammatory bowel disease (IBD) is increasing in Asia, but little is known about disease progression in this region. The Asia-Pacific Crohn's and Colitis Epidemiology Study was initiated in 2011, enrolling subjects from 8 countries in Asia (China, Hong Kong, Indonesia, Sri Lanka, Macau, Malaysia, Singapore, and Thailand) and Australia. We present data from this ongoing study. We collected data on 413 patients diagnosed with IBD (222 with ulcerative colitis [UC], 181 with Crohn's disease [CD], 10 with IBD unclassified; median age, 37 y) from 2011 through 2013. We analyzed the disease course and severity and mortality. Risks for medical and surgical therapies were assessed using Kaplan-Meier analysis. The cumulative probability that CD would change from inflammatory to stricturing or penetrating disease was 19.6%. The cumulative probabilities for use of immunosuppressants or anti-tumor necrosis factor agents were 58.9% and 12.0% for patients with CD, and 12.7% and 0.9% for patients with UC, respectively. Perianal CD was associated with an increased risk of anti-tumor necrosis factor therapy within 1 year of its diagnosis (hazard ratio, 2.97; 95% confidence interval, 1.09-8.09). The cumulative probabilities for surgery 1 year after diagnosis were 9.1% for patients with CD and 0.9% for patients with UC. Patients with CD and penetrating disease had a 7-fold increase for risk of surgery, compared with patients with inflammatory disease (hazard ratio, 7.67; 95% confidence interval, 3.93-14.96). The overall mortality for patients with IBD was 0.7%. In a prospective population-based study, we found that the early course of disease in patients with IBD in Asia was comparable with that of the West. Patients with CD frequently progress to complicated disease and have accelerated use of immunosuppressants. Few patients with early stage UC undergo surgery in Asia. 
Increasing our understanding of IBD progression in different populations can help optimize therapy and improve outcomes. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
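
The cumulative probabilities quoted above are Kaplan-Meier estimates; a minimal product-limit sketch on toy (time, event) data, where the cumulative probability of an event by time t is 1 - S(t):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; returns (time, S(t)) at event times.
    events[i] is 1 for an observed event, 0 for a censored observation."""
    pairs = sorted(zip(times, events))
    at_risk, surv, steps = len(pairs), 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        n_here = sum(1 for tt, _ in pairs if tt == t)
        if deaths:
            surv *= 1.0 - deaths / at_risk
            steps.append((t, surv))
        at_risk -= n_here
        i += n_here
    return steps

# Toy data: events at t=1 and t=3, censoring at t=2 and t=4.
steps = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])
# S(1) = 1 - 1/4 = 0.75; S(3) = 0.75 * (1 - 1/2) = 0.375
assert steps == [(1, 0.75), (3, 0.375)]
```
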

  19. Stochastic static fault slip inversion from geodetic data with non-negativity and bound constraints

    NASA Astrophysics Data System (ADS)

    Nocquet, J.-M.

    2018-07-01

    Although the surface displacements observed by geodesy are linear combinations of slip at faults in an elastic medium, determining the spatial distribution of fault slip remains an ill-posed inverse problem. A widely used approach to circumvent the ill-posedness of the inversion is to add regularization constraints in terms of smoothing and/or damping so that the linear system becomes invertible. However, the choice of regularization parameters is often arbitrary, and sometimes leads to significantly different results. Furthermore, the resolution analysis is usually empirical and cannot be made independently of the regularization. The stochastic approach to inverse problems provides a rigorous framework where the a priori information about the searched parameters is combined with the observations in order to derive posterior probabilities of the unknown parameters. Here, I investigate an approach where the prior probability density function (pdf) is a multivariate Gaussian function, with single truncation to impose positivity of slip or double truncation to impose positivity and upper bounds on slip for interseismic modelling. I show that the joint posterior pdf is similar to the linear untruncated Gaussian case and can be expressed as a truncated multivariate normal (TMVN) distribution. The TMVN form can then be used to obtain semi-analytical formulae for the single, 2-D or n-D marginal pdf. The semi-analytical formula involves the product of a Gaussian by an integral term that can be evaluated using recent developments in TMVN probability calculations. Posterior mean and covariance can also be efficiently derived. I show that the maximum posterior (MAP) can be obtained using a non-negative least-squares algorithm for the single truncated case or using the bounded-variable least-squares algorithm for the double truncated case. I show that the case of independent uniform priors can be approximated using TMVN. 
The numerical equivalence to Bayesian inversions using Markov chain Monte Carlo (MCMC) sampling is shown for a synthetic example and a real case of interseismic modelling in Central Peru. The TMVN method overcomes several limitations of the Bayesian approach using MCMC sampling. First, the required computing power is greatly reduced. Second, unlike the MCMC-based Bayesian approach, the marginal pdf, mean, variance and covariance are obtained independently of one another. Third, the probability density and cumulative distribution functions can be obtained with any density of points. Finally, determining the MAP is extremely fast.
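
The MAP under a non-negativity constraint solves a non-negative least-squares problem, min ||Gm - d||^2 subject to m >= 0. As a minimal sketch, a projected-gradient iteration on a tiny system is shown below; this is a stand-in for the dedicated NNLS/BVLS algorithms the paper refers to, not the paper's implementation:

```python
def nnls_projected_gradient(G, d, n_iter=5000):
    """Minimize ||G m - d||^2 subject to m >= 0 by projected gradient."""
    rows, cols = len(G), len(G[0])
    # Crude step size from a bound on the gradient's Lipschitz constant.
    step = 1.0 / (2.0 * sum(G[i][j] ** 2 for i in range(rows) for j in range(cols)))
    m = [0.0] * cols
    for _ in range(n_iter):
        r = [sum(G[i][j] * m[j] for j in range(cols)) - d[i] for i in range(rows)]
        grad = [2.0 * sum(G[i][j] * r[i] for i in range(rows)) for j in range(cols)]
        m = [max(0.0, m[j] - step * grad[j]) for j in range(cols)]  # project onto m >= 0
    return m

# Tiny example: the unconstrained solution would be m = (-1, 2);
# the constraint clips the first component to zero.
G = [[1.0, 0.0], [0.0, 1.0]]
d = [-1.0, 2.0]
m = nnls_projected_gradient(G, d)
assert abs(m[0] - 0.0) < 1e-6 and abs(m[1] - 2.0) < 1e-6
```
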

  20. Toward the Probabilistic Forecasting of High-latitude GPS Phase Scintillation

    NASA Technical Reports Server (NTRS)

    Prikryl, P.; Jayachandran, P.T.; Mushini, S. C.; Richardson, I. G.

    2012-01-01

    The phase scintillation index was obtained from L1 GPS data collected with the Canadian High Arctic Ionospheric Network (CHAIN) during the extended solar minimum years 2008-2010. Phase scintillation occurs predominantly on the dayside in the cusp and in the nightside auroral oval. We set forth a probabilistic forecast method for phase scintillation in the cusp based on the arrival time of either solar wind corotating interaction regions (CIRs) or interplanetary coronal mass ejections (ICMEs). CIRs on the leading edge of high-speed streams (HSS) from coronal holes are known to cause recurrent geomagnetic and ionospheric disturbances that can be forecast one or several solar rotations in advance. Superposed epoch analysis of phase scintillation occurrence showed a sharp increase in scintillation occurrence just after the arrival of high-speed solar wind, and a peak associated with weak to moderate CMEs during the solar minimum. Cumulative probability distribution functions for phase scintillation occurrence in the cusp are obtained from statistical data for days before and after CIR and ICME arrivals. The probability curves are also specified for low and high (below- and above-median) values of various solar wind plasma parameters. The initial results are used to demonstrate a forecasting technique on two example periods of CIRs and ICMEs.
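The superposed epoch analysis mentioned above can be sketched on synthetic data; the event times and the step response below are invented purely for illustration:

```python
import numpy as np

def superposed_epoch(signal, event_idx, before, after):
    """Average `signal` in windows around each event index
    (superposed epoch analysis).  Windows that would run off either
    end of the series are skipped."""
    windows = [signal[i - before:i + after] for i in event_idx
               if i - before >= 0 and i + after <= len(signal)]
    return np.mean(windows, axis=0)

# Synthetic stand-in for scintillation occurrence: a response that
# switches on just after each (hypothetical) CIR arrival time.
rng = np.random.default_rng(0)
series = 0.1 * rng.standard_normal(1000)
arrivals = [100, 300, 500, 700, 900]
for a in arrivals:
    series[a:a + 20] += 1.0
epoch = superposed_epoch(series, arrivals, before=50, after=50)
# the averaged epoch jumps sharply at zero lag (index `before`)
```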

  1. The Tail Exponent for Stock Returns in Bursa Malaysia for 2003-2008

    NASA Astrophysics Data System (ADS)

    Rusli, N. H.; Gopir, G.; Usang, M. D.

    2010-07-01

    Econophysics is a developing discipline that applies mathematical tools, usually applied to physical models, to the study of financial systems. In this study, we analyse the time-series behaviour of several blue-chip and penny-stock companies listed on the Main Market of Bursa Malaysia. The basic quantity used is the relative price change, called the stock price return; the data are daily samples from the beginning of 2003 until the end of 2008, covering 1555 recorded trading days. The aim of this paper is to investigate the tail exponent of the financial-return distributions for blue-chip and penny stocks over this six-year period. Using a standard regression method, we find that the distribution exhibits double scaling on a log-log plot of the cumulative probability of the normalized returns, so we calculate α separately for small-scale and large-scale returns. The probability density functions of the absolute stock price returns follow the power-law behaviour P(z) ~ z^-α, with α values lying both inside and outside the Lévy stable regime, including values α > 2. All the results are discussed in detail.
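The tail-exponent estimate via regression on the log-log cumulative distribution can be sketched as follows; the sketch is validated on synthetic Pareto draws with a known α rather than on the Bursa Malaysia data:

```python
import numpy as np

def tail_exponent(returns, tail_frac=0.1):
    """Estimate a power-law tail exponent alpha from the cumulative
    distribution P(|z| > x) ~ x^(-alpha), via linear regression on the
    log-log plot of the empirical complementary CDF (upper tail only)."""
    z = np.sort(np.abs(returns))[::-1]            # largest first
    n_tail = max(int(len(z) * tail_frac), 10)
    x = z[:n_tail]
    ccdf = (np.arange(n_tail) + 1) / len(z)       # empirical P(|Z| >= x)
    slope, _ = np.polyfit(np.log(x), np.log(ccdf), 1)
    return -slope

# Demo on Pareto-distributed draws with known alpha = 3
rng = np.random.default_rng(1)
samples = rng.pareto(3.0, 100_000) + 1.0          # classic Pareto, x_min = 1
alpha_hat = tail_exponent(samples)
```

Splitting the sorted returns at the crossover point and regressing each segment separately would reproduce the two-regime (double-scaling) fit described in the abstract.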

  2. Fourth-Order Vibrational Transition State Theory and Chemical Kinetics

    NASA Astrophysics Data System (ADS)

    Stanton, John F.; Matthews, Devin A.; Gong, Justin Z.

    2015-06-01

    Second-order vibrational perturbation theory (VPT2) is an enormously successful and well-established theory for treating anharmonic effects on the vibrational levels of semi-rigid molecules. Partially as a consequence of the fact that the theory is exact for the Morse potential (which provides an appropriate qualitative model for stretching anharmonicity), VPT2 calculations for such systems with appropriate ab initio potential functions tend to give fundamental and overtone levels that fall within a handful of wavenumbers of experimentally measured positions. As a consequence, the next non-vanishing level of perturbation theory -- VPT4 -- offers only slight improvements over VPT2 and is not practical for most calculations, since it requires information about force constants up through sextic. However, VPT4 (as well as VPT2) can be used for other applications such as the next vibrational correction to rotational constants (the ``gammas'') and other spectroscopic parameters. In addition, the marriage of VPT with the semi-classical transition state theory of Miller (SCTST) has recently proven to be a powerful and accurate treatment for chemical kinetics. In this talk, VPT4-based SCTST tunneling probabilities and cumulative reaction probabilities are given for the first time for selected low-dimensional model systems. The prospects for VPT4, both practical and intrinsic, will also be discussed.
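The flavor of an SCTST-style calculation can be conveyed with the simplest parabolic-barrier limit; the expressions below omit the anharmonic (VPT-derived) corrections that are the point of the paper, so treat this as a baseline sketch with made-up barrier parameters:

```python
import numpy as np

def p_parabolic(E, V0, hw_imag):
    """Semiclassical transmission probability through a parabolic barrier
    of height V0 and imaginary barrier frequency hw_imag (same energy
    units): P(E) = 1 / (1 + exp(-2*pi*(E - V0)/hw_imag))."""
    return 1.0 / (1.0 + np.exp(-2.0 * np.pi * (E - V0) / hw_imag))

def cumulative_reaction_probability(E, V0, hw_imag, hw_ts, n_states=50):
    """N(E): sum of transmission probabilities over transition-state
    vibrational levels (one harmonic TS mode, levels (n + 1/2)*hw_ts)."""
    n = np.arange(n_states)
    levels = (n + 0.5) * hw_ts
    return np.sum(p_parabolic(E - levels, V0, hw_imag))
```

In full SCTST the exponent is replaced by the VPT-derived barrier action, which is where the quartic (and here, sextic) force-constant information enters.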

  3. Cumulative exposure to dust causes accelerated decline in lung function in tunnel workers

    PubMed Central

    Ulvestad, B; Bakke, B; Eduard, W; Kongerud, J; Lund, M

    2001-01-01

    OBJECTIVES—To examine whether underground construction workers exposed to tunnelling pollutants over a follow up period of 8 years have an increased risk of decline in lung function and respiratory symptoms compared with reference subjects working outside the tunnel atmosphere, and relate the findings to job groups and cumulative exposure to dust and gases.
METHODS—96 tunnel workers and a reference group of 249 other heavy construction workers were examined in 1991 and re-examined in 1999. Exposure measurements were carried out to estimate personal cumulative exposure to total dust, respirable dust, α-quartz, oil mist, and nitrogen dioxide. The subjects answered a questionnaire on respiratory symptoms and smoking habits, performed spirometry, and had chest radiographs taken. Radiological signs of silicosis were evaluated (International Labour Organisation (ILO) classification). Atopy was determined by a multiple radioallergosorbent test (RAST).
RESULTS—The mean exposure in tunnel workers ranged from 1.2 to 3.6 mg/m³ for respirable dust and from 0.019 to 0.044 mg/m³ for α-quartz, depending on the job task performed. Decrease in forced expiratory volume in 1 second (FEV1) was associated with cumulative exposure to respirable dust (p<0.001) and α-quartz (p=0.02). The multiple regression model predicted that in a worker 40 years of age, the annual decrease in FEV1 would be 25 ml in a non-exposed non-smoker, 35 ml in a non-exposed smoker, and 50-63 ml in a non-smoking tunnel worker (depending on job). Compared with the reference group, the odds ratio for the occurrence of new respiratory symptoms during the follow up period was increased in the tunnel workers and was associated with cumulative exposure to respirable dust.
CONCLUSIONS—Cumulative exposures to respirable dust and α-quartz are the most important risk factors for airflow limitation in underground heavy construction workers, and cumulative exposure to respirable dust is the most important risk factor for respiratory symptoms. The finding of accelerated decline in lung function in tunnel workers suggests that better control of exposures is needed.


Keywords: heavy construction; respirable dust; lung function. PMID: 11555688

  4. Assessing Quality in Higher Education: New Criteria for Evaluating Students' Satisfaction

    ERIC Educational Resources Information Center

    Zineldin, Mosad; Akdag, Hatice Camgoz; Vasicheva, Valentina

    2011-01-01

    The aim of this research is to present a new quality assurance model (5Qs) and to examine the major factors affecting students' perception of cumulative satisfaction. The model includes behavioural dimensions of student satisfaction. The factors included in this cumulative summation are technical, functional, infrastructure, interaction and…

  5. Are Earthquake Clusters/Supercycles Real or Random?

    NASA Astrophysics Data System (ADS)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2016-12-01

    Long records of earthquakes at plate boundaries such as the San Andreas or Cascadia often show that large earthquakes occur in temporal clusters, also termed supercycles, separated by less active intervals. These are intriguing because the boundary is presumably being loaded by steady plate motion. If so, earthquakes resulting from seismic cycles - in which their probability is small shortly after the past one, and then increases with time - should occur quasi-periodically rather than be more frequent in some intervals than others. We are exploring this issue with two approaches. One is to assess whether the clusters result purely by chance from a time-independent process that has no "memory." Thus a future earthquake is equally likely immediately after the past one and much later, so earthquakes can cluster in time. We analyze the agreement between such a model and inter-event times for Parkfield, Pallet Creek, and other records. A useful tool is transformation by the inverse cumulative distribution function, so the inter-event times have a uniform distribution when the memorylessness property holds. The second is via a time-variable model in which earthquake probability increases with time between earthquakes and decreases after an earthquake. The probability of an event increases with time until one happens, after which it decreases, but not to zero. Hence after a long period of quiescence, the probability of an earthquake can remain higher than the long-term average for several cycles. Thus the probability of another earthquake is path dependent, i.e. depends on the prior earthquake history over multiple cycles. Time histories resulting from simulations give clusters with properties similar to those observed. The sequences of earthquakes result from both the model parameters and chance, so two runs with the same parameters look different. 
The model parameters control the average time between events and the variation of the actual times around this average, so models can be strongly or weakly time-dependent.
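The first approach above, transforming inter-event times by the inverse cumulative distribution function to test memorylessness, can be sketched as follows; the exponential model and the synthetic gap data are illustrative choices, not the Parkfield or Pallet Creek records:

```python
import numpy as np

def ks_uniform_distance(inter_event_times):
    """Transform inter-event times by the fitted exponential CDF: if the
    process is memoryless, the transformed values are uniform on [0, 1].
    Returns the one-sample Kolmogorov-Smirnov distance from uniformity."""
    t = np.asarray(inter_event_times, dtype=float)
    u = np.sort(1.0 - np.exp(-t / t.mean()))       # inverse-CDF transform
    n = len(u)
    i = np.arange(1, n + 1)
    return np.max(np.maximum(i / n - u, u - (i - 1) / n))

rng = np.random.default_rng(2)
d_exp = ks_uniform_distance(rng.exponential(100.0, 500))      # memoryless gaps
clustered = np.concatenate([rng.exponential(5.0, 250),
                            rng.exponential(500.0, 250)])      # two regimes
d_clu = ks_uniform_distance(clustered)
# d_clu is much larger: strongly clustered gaps are far from memoryless
```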

  6. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analysed a reliable homogeneous earthquake catalogue for the period 1900-2015 with magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions: the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
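A sketch of the Weibull part of such an analysis is below; it uses probability-plot regression instead of the EasyFit/Matlab maximum-likelihood workflow, and synthetic waiting times with known parameters, so it is a stand-in rather than a reproduction:

```python
import numpy as np

def weibull_plot_fit(samples):
    """Fit a two-parameter Weibull by regression on the Weibull
    probability plot: log(-log(1 - F)) is linear in log(t), with
    slope = shape k and intercept = -k * log(scale)."""
    t = np.sort(np.asarray(samples, dtype=float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.5) / n           # median-rank positions
    y = np.log(-np.log(1.0 - F))
    k, c = np.polyfit(np.log(t), y, 1)
    return k, np.exp(-c / k)

def cond_prob(t0, dt, k, s):
    """Conditional probability of an event within dt given elapsed time t0,
    from the fitted Weibull survival function S(t) = exp(-(t/s)^k)."""
    S = lambda t: np.exp(-(t / s) ** k)
    return 1.0 - S(t0 + dt) / S(t0)

rng = np.random.default_rng(3)
waits = 30.0 * rng.weibull(2.0, 2000)   # synthetic: shape 2, scale 30 (years)
k_hat, scale_hat = weibull_plot_fit(waits)
```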

  7. Associations between parenting, media use, cumulative risk, and children's executive functioning.

    PubMed

    Linebarger, Deborah L; Barr, Rachel; Lapierre, Matthew A; Piotrowski, Jessica T

    2014-01-01

    This study was designed to examine how parenting style, media exposure, and cumulative risk were associated with executive functioning (EF) during early childhood. A nationally representative group of US parents/caregivers (N = 1156) with 1 child between 2 and 8 years participated in a telephone survey. Parents were asked to report on their child's exposure to television, music, and book reading through a 24-hour time diary. Parents also reported a host of demographic and parenting variables as well as questions on their child's EF. Separate multiple regressions for preschool (2-5 years) and school-aged (6-8 years) children grouped by cumulative risk were conducted. Parenting style moderated the risks of exposure to background television on EF for high-risk preschool-age children. Educational TV exposure served as a buffer for high-risk school-aged children. Cumulative risk, age, and parenting quality interacted with a number of the exposure effects. The study showed a complex pattern of associations between cumulative risk, parenting, and media exposure with EF during early childhood. Consistent with the American Academy of Pediatrics, these findings support the recommendation that background television should be turned off when a child is in the room and suggest that exposure to high-quality content across multiple media platforms may be beneficial.

  8. Combined statistical analysis of landslide release and propagation

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Rohmaneo, Mohammad; Chu, Hone-Jay

    2016-04-01

    Statistical methods - often coupled with stochastic concepts - are commonly employed to relate areas affected by landslides with environmental layers, and to estimate spatial landslide probabilities by applying these relationships. However, such methods only concern the release of landslides, disregarding their motion. Conceptual models for mass flow routing are used for estimating landslide travel distances and possible impact areas. Automated approaches combining release and impact probabilities are rare. The present work attempts to fill this gap by a fully automated procedure combining statistical and stochastic elements, building on the open source GRASS GIS software: (1) The landslide inventory is subset into release and deposition zones. (2) We employ a traditional statistical approach to estimate the spatial release probability of landslides. (3) We back-calculate the probability distribution of the angle of reach of the observed landslides, employing the software tool r.randomwalk. One set of random walks is routed downslope from each pixel defined as release area. Each random walk stops when leaving the observed impact area of the landslide. (4) The cumulative probability function (cdf) derived in (3) is used as input to route a set of random walks downslope from each pixel in the study area through the DEM, assigning the probability gained from the cdf to each pixel along the path (impact probability). The impact probability of a pixel is defined as the average impact probability of all sets of random walks impacting a pixel. Further, the average release probabilities of the release pixels of all sets of random walks impacting a given pixel are stored along with the area of the possible release zone. (5) We compute the zonal release probability by increasing the release probability according to the size of the release zone - the larger the zone, the larger the probability that a landslide will originate from at least one pixel within this zone. 
We quantify this relationship by a set of empirical curves. (6) Finally, we multiply the zonal release probability with the impact probability in order to estimate the combined impact probability for each pixel. We demonstrate the model with a 167 km² study area in Taiwan, using an inventory of landslides triggered by Typhoon Morakot. Analyzing the model results leads us to a set of key conclusions: (i) The average composite impact probability over the entire study area corresponds well to the density of observed landslide pixels. Therefore we conclude that the method is valid in general, even though the concept of the zonal release probability bears some conceptual issues that have to be kept in mind. (ii) The parameters used as predictors cannot fully explain the observed distribution of landslides. The size of the release zone influences the composite impact probability to a larger degree than the pixel-based release probability. (iii) The prediction rate increases considerably when excluding the largest, deep-seated, landslides from the analysis. We conclude that such landslides are mainly related to geological features hardly reflected in the predictor layers used.
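The zonal release probability of step (5) can be sketched under an independence assumption; the paper instead calibrates a set of empirical curves, so this is a deliberate simplification:

```python
import numpy as np

def zonal_release_probability(pixel_probs):
    """Probability that at least one pixel of a release zone releases,
    assuming independent pixels: 1 - prod(1 - p_i).  Grows with zone size,
    matching the intuition described in step (5)."""
    return 1.0 - np.prod(1.0 - np.asarray(pixel_probs, dtype=float))

def combined_impact_probability(zonal_p, impact_p):
    """Step (6): zonal release probability times impact probability."""
    return zonal_p * impact_p

p_zone = zonal_release_probability([0.1] * 10)   # 10-pixel zone, p = 0.1 each
```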

  9. Nonparametric analysis of bivariate gap time with competing risks.

    PubMed

    Huang, Chiung-Yu; Wang, Chenguang; Wang, Mei-Cheng

    2016-09-01

    This article considers nonparametric methods for studying recurrent disease and death with competing risks. We first point out that comparisons based on the well-known cumulative incidence function can be confounded by different prevalence rates of the competing events, and that comparisons of the conditional distribution of the survival time given the failure event type are more relevant for investigating the prognosis of different patterns of recurrent disease. We then propose nonparametric estimators for the conditional cumulative incidence function as well as the conditional bivariate cumulative incidence function for the bivariate gap times, that is, the time to disease recurrence and the residual lifetime after recurrence. To quantify the association between the two gap times in the competing risks setting, a modified Kendall's tau statistic is proposed. The proposed estimators for the conditional bivariate cumulative incidence distribution and the association measure account for the induced dependent censoring for the second gap time. Uniform consistency and weak convergence of the proposed estimators are established. Hypothesis testing procedures for two-sample comparisons are discussed. Numerical simulation studies with practical sample sizes are conducted to evaluate the performance of the proposed nonparametric estimators and tests. An application to data from a pancreatic cancer study is presented to illustrate the methods developed in this article. © 2016, The International Biometric Society.
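The contrast the authors draw between the cumulative incidence function and the conditional distribution given event type can be illustrated with minimal empirical estimators; the versions below ignore censoring entirely (the paper's estimators additionally handle induced dependent censoring), and the tiny data set is invented:

```python
import numpy as np

def cumulative_incidence(times, causes, cause, grid):
    """Empirical cumulative incidence of one failure cause (no censoring):
    CIF_k(t) = P(T <= t, cause = k).  Plateaus at the cause's prevalence."""
    t_all = np.asarray(times, dtype=float)
    c_all = np.asarray(causes)
    n = len(t_all)
    return np.array([np.sum((t_all <= t) & (c_all == cause)) / n for t in grid])

def conditional_cif(times, causes, cause, grid):
    """Conditional distribution P(T <= t | cause = k): restricted to
    subjects failing from cause k, so it reaches 1 and is not confounded
    by the prevalence of competing events."""
    t_k = np.asarray(times, dtype=float)[np.asarray(causes) == cause]
    return np.array([np.mean(t_k <= t) for t in grid])

times = [1.0, 2.0, 3.0, 4.0]
causes = [1, 2, 1, 2]
cif_1 = cumulative_incidence(times, causes, 1, [2.5, 4.0])   # capped at 0.5
ccif_1 = conditional_cif(times, causes, 1, [2.5, 4.0])       # reaches 1.0
```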

  10. Can anti-Mullerian hormone (AMH) predict the outcome of intrauterine insemination with controlled ovarian stimulation?

    PubMed

    Bakas, Panagiotis; Boutas, Ioannis; Creatsa, Maria; Vlahos, Nicos; Gregoriou, Odysseas; Creatsas, George; Hassiakos, Dimitrios

    2015-10-01

    To assess whether the levels of anti-Mullerian hormone (AMH) are related to the outcome of intrauterine insemination (IUI) in patients treated with gonadotropins. A total of 195 patients underwent controlled ovarian stimulation (COS) with recombinant follicle stimulating hormone (rFSH) (50-150 IU/d). All patients underwent up to three cycles of IUI. The primary outcome was the ability of AMH levels to predict clinical pregnancy at the first attempt and the cumulative clinical pregnancy probability over up to three IUI cycles. Secondary outcomes were the relations of AMH, LH, FSH, BMI, age, parity and basal estradiol levels with each other and with the outcome of IUI. The area under the receiver operating characteristic (ROC) curve in predicting clinical pregnancy was 0.53 for AMH at the first attempt and 0.76 for cumulative clinical pregnancy. AMH levels were positively correlated with the clinical pregnancy rate at the first attempt and with the cumulative clinical pregnancy rate, but negatively correlated with patient's age and FSH levels. FSH and LH levels were negatively correlated with the cumulative clinical pregnancy rate. AMH levels had a positive correlation, and patient's age and LH levels a negative correlation, with the outcome of IUI and COS with gonadotropins. AMH concentration was significantly higher and LH significantly lower in patients with a clinical pregnancy after three cycles of IUI treatment compared with those who did not achieve pregnancy.
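The area under the ROC curve reported above can be computed nonparametrically as a Mann-Whitney probability; the sketch below uses made-up score values, not the study's AMH measurements:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case's score exceeds a
    randomly chosen negative case's score (ties count one half)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return np.mean(pos > neg) + 0.5 * np.mean(pos == neg)
```

An AUC near 0.5 (as for AMH at the first attempt) means the marker barely separates the groups; 0.76 (as for cumulative pregnancy) indicates useful discrimination.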

  11. Cost-effectiveness analysis of liver resection versus transplantation for early hepatocellular carcinoma within the Milan criteria.

    PubMed

    Lim, Kheng Choon; Wang, Vivian W; Siddiqui, Fahad J; Shi, Luming; Chan, Edwin S Y; Oh, Hong Choon; Tan, Say Beng; Chow, Pierce K H

    2015-01-01

    Both liver resection (LR) and cadaveric liver transplantation (CLT) are potentially curative treatments for patients with hepatocellular carcinoma (HCC) within the Milan criteria and with adequate liver function. Adopting either as a first-line therapy carries major cost and resource implications. The objective of this study was to estimate the relative cost-effectiveness of LR against CLT for patients with HCC within the Milan criteria using a decision analytic model. A Markov cohort model was developed to simulate a cohort of patients aged 55 years with HCC within the Milan criteria and Child-Pugh A/B cirrhosis, undergoing LR or CLT, and followed up over their remaining life expectancy. Analysis was performed in different geographical cost settings: the USA, Switzerland and Singapore. Transition probabilities were obtained from systematic literature reviews, supplemented by databases from Singapore and the Organ Procurement and Transplantation Network (USA). Utility and cost data were obtained from open sources. LR produced 3.9 quality-adjusted life years (QALYs) while CLT had an additional 1.4 QALYs. The incremental cost-effectiveness ratio (ICER) of CLT versus LR ranged from $111,821/QALY in Singapore to $156,300/QALY in Switzerland, and was above thresholds for cost-effectiveness in all three countries. Sensitivity analysis revealed that CLT-related 5-year cumulative survival, one-time cost of CLT, and post-LR 5-year cumulative recurrence rates were the most sensitive parameters in all cost scenarios. ICERs were reduced below threshold when CLT-related 5-year cumulative survival exceeded 84.9% and 87.6% in Singapore and the USA, respectively. For Switzerland, the ICER remained above the cost-effectiveness threshold regardless of the variations. In patients with HCC within the Milan criteria and Child-Pugh A/B cirrhosis, LR is more cost-effective than CLT across three different costing scenarios: the USA, Switzerland, Singapore. 
© 2014 by the American Association for the Study of Liver Diseases.
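A Markov cohort model of the kind described can be sketched as follows. The three states, transition probabilities, utilities and costs below are purely illustrative placeholders, not the study's calibrated inputs:

```python
import numpy as np

def run_markov(P, utilities, costs, horizon, discount=0.03):
    """Discounted QALYs and costs for a cohort starting in state 0.
    P is a yearly transition matrix; the last state is absorbing (death)."""
    dist = np.zeros(P.shape[0])
    dist[0] = 1.0
    qaly = cost = 0.0
    for year in range(horizon):
        w = 1.0 / (1.0 + discount) ** year
        qaly += w * dist @ utilities
        cost += w * dist @ costs
        dist = dist @ P
    return qaly, cost

# Hypothetical 3-state model: well, recurrence, dead (numbers invented)
P_lr = np.array([[0.85, 0.10, 0.05],
                 [0.00, 0.80, 0.20],
                 [0.00, 0.00, 1.00]])
qaly_lr, cost_lr = run_markov(P_lr, np.array([0.80, 0.50, 0.0]),
                              np.array([2000.0, 8000.0, 0.0]), horizon=30)
P_clt = np.array([[0.92, 0.03, 0.05],
                  [0.00, 0.80, 0.20],
                  [0.00, 0.00, 1.00]])
qaly_clt, cost_clt = run_markov(P_clt, np.array([0.85, 0.50, 0.0]),
                                np.array([5000.0, 8000.0, 0.0]), horizon=30)
icer = (cost_clt - cost_lr) / (qaly_clt - qaly_lr)   # cost per extra QALY
```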

  12. Age-dependent associations between androgenetic alopecia and prostate cancer risk.

    PubMed

    Muller, David C; Giles, Graham G; Sinclair, Rod; Hopper, John L; English, Dallas R; Severi, Gianluca

    2013-02-01

    Both prostate cancer and androgenetic alopecia are strongly age-related conditions that are considered to be androgen dependent, but studies of the relationship between them have yielded inconsistent results. We aimed to assess whether androgenetic alopecia at ages 20 and 40 years are associated with risk of prostate cancer. At a follow-up of the Melbourne Collaborative Cohort Study, men were asked to assess their hair pattern at ages 20 and 40 years relative to eight categories in showcards. Cases were men notified to the Victorian Cancer Registry with prostate cancer diagnosed between cohort enrollment (1990-1994) and follow-up attendance (2003-2009). Flexible parametric survival models were used to estimate age-varying HRs and predicted cumulative probabilities of prostate cancer by androgenetic alopecia categories. Of 9,448 men that attended follow-up and provided data on androgenetic alopecia, we identified 476 prostate cancer cases during a median follow-up of 11 years four months. Cumulative probability of prostate cancer was greater at all ages up to 76 years, for men with vertex versus no androgenetic alopecia at age of 40 years. At age of 76 years, the estimated probabilities converged to 0.15. Vertex androgenetic alopecia at 40 years was also associated with younger age of diagnosis for prostate cancer cases. Vertex androgenetic alopecia at age of 40 years might be a marker of increased risk of early-onset prostate cancer. If confirmed, these results suggest that the apparently conflicting findings of previous studies might be explained by failure to adequately model the age-varying nature of the association between androgenetic alopecia and prostate cancer.

  13. Rock size-frequency distributions on Mars and implications for Mars Exploration Rover landing safety and operations

    NASA Astrophysics Data System (ADS)

    Golombek, M. P.; Haldemann, A. F. C.; Forsberg-Taylor, N. K.; DiMaggio, E. N.; Schroeder, R. D.; Jakosky, B. M.; Mellon, M. T.; Matijevic, J. R.

    2003-10-01

    The cumulative fractional area covered by rocks versus diameter measured at the Pathfinder site was predicted by a rock distribution model that follows simple exponential functions that approach the total measured rock abundance (19%), with a steep decrease in rocks with increasing diameter. The distribution of rocks >1.5 m diameter visible in rare boulder fields also follows this steep decrease with increasing diameter. The effective thermal inertia of rock populations calculated from a simple empirical model of the effective inertia of rocks versus diameter shows that most natural rock populations have cumulative effective thermal inertias of 1700-2100 J m-2 s-0.5 K-1 and are consistent with the model rock distributions applied to total rock abundance estimates. The Mars Exploration Rover (MER) airbags have been successfully tested against extreme rock distributions with a higher percentage of potentially hazardous triangular buried rocks than observed at the Pathfinder and Viking landing sites. The probability of the lander impacting a >1 m diameter rock in the first 2 bounces is <3% and <5% for the Meridiani and Gusev landing sites, respectively, and is <0.14% and <0.03% for rocks >1.5 m and >2 m diameter, respectively. Finally, the model rock size-frequency distributions indicate that rocks >0.1 m and >0.3 m in diameter, large enough to place contact sensor instruments against and abrade, respectively, should be plentiful within a single sol's drive at the Meridiani and Gusev landing sites.
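The exponential rock size-frequency model can be sketched as follows. The coefficient q(k) = 1.79 + 0.152/k is the Pathfinder-era fit associated with this model family; treat it as an assumption here rather than as the exact form used in the paper:

```python
import numpy as np

def cum_fractional_area(D, k, q=None):
    """Cumulative fractional area covered by rocks of diameter >= D (m)
    for total rock abundance k, using the exponential model
    F_k(D) = k * exp(-q(k) * D), which approaches k as D -> 0."""
    if q is None:
        q = 1.79 + 0.152 / k          # assumed published coefficient
    return k * np.exp(-q * D)

# At the Pathfinder abundance of 19%, fractional area of rocks >= 1 m:
f1 = cum_fractional_area(1.0, 0.19)
```

The steep exponential falloff with diameter is what makes the probability of striking a >1 m rock small even at high total abundance.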

  14. Comparison of cumulant expansion and q-space imaging estimates for diffusional kurtosis in brain.

    PubMed

    Mohanty, Vaibhav; McKinnon, Emilie T; Helpern, Joseph A; Jensen, Jens H

    2018-05-01

    To compare estimates of the diffusional kurtosis in brain as obtained from a cumulant expansion (CE) of the diffusion MRI (dMRI) signal and from q-space (QS) imaging. For the CE estimates of the kurtosis, the CE was truncated at quadratic order in the b-value and fit to the dMRI signal for b-values from 0 up to 2000 s/mm². For the QS estimates, b-values ranging from 0 up to 10,000 s/mm² were used to determine the diffusion displacement probability density function (dPDF) via Stejskal's formula. The kurtosis was then calculated directly from the second- and fourth-order moments of the dPDF. These two approximations were studied for in vivo human data obtained on a 3T MRI scanner using three orthogonal diffusion encoding directions. The whole-brain mean values for the CE and QS kurtosis estimates differed by 16% or less in each of the considered diffusion encoding directions, and the Pearson correlation coefficients all exceeded 0.85. Nonetheless, there were large discrepancies in many voxels, particularly those with either very high or very low kurtoses relative to the mean values. Estimates of the diffusional kurtosis in brain obtained using the CE and QS approximations are strongly correlated, suggesting that they encode similar information. However, for the choice of b-values employed here, there may be substantial differences, depending on the properties of the diffusion microenvironment in each voxel. Copyright © 2018 Elsevier Inc. All rights reserved.
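The CE estimate can be illustrated on a synthetic signal: fit the quadratic cumulant expansion ln S(b) = ln S0 - b·D + (1/6)·K·D²·b² and read off D and K. This is a sketch of the general approach on noiseless synthetic data, not the authors' fitting pipeline:

```python
import numpy as np

def ce_kurtosis(bvals, signal):
    """Estimate diffusivity D and kurtosis K from the quadratic cumulant
    expansion of the log signal: ln S(b) = ln S0 - b*D + (1/6)*K*D^2*b^2."""
    c2, c1, _ = np.polyfit(bvals, np.log(signal), 2)
    D = -c1
    K = 6.0 * c2 / D ** 2
    return D, K

# Synthetic signal with known D = 1e-3 mm^2/s and K = 1.0
b = np.linspace(0.0, 2000.0, 21)
D_true, K_true = 1.0e-3, 1.0
S = np.exp(-b * D_true + (K_true * D_true ** 2 * b ** 2) / 6.0)
D_hat, K_hat = ce_kurtosis(b, S)
```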

  15. Quantifying Diffuse Contamination: Method and Application to Pb in Soil.

    PubMed

    Fabian, Karl; Reimann, Clemens; de Caritat, Patrice

    2017-06-20

    A new method for detecting and quantifying diffuse contamination at the continental to regional scale is based on the analysis of cumulative distribution functions (CDFs). It uses cumulative probability (CP) plots for spatially representative data sets, preferably containing >1000 determinations. Simulations demonstrate how different types of contamination influence elemental CDFs of different sample media. It is found that diffuse contamination is characterized by a distinctive shift of the low-concentration end of the distribution of the studied element in its CP plot. Diffuse contamination can be detected and quantified via either (1) comparing the distribution of the contaminating element to that of an element with a geochemically comparable behavior but no contamination source (e.g., Pb vs Rb), or (2) comparing the top soil distribution of an element to the distribution of the same element in subsoil samples from the same area, taking soil forming processes into consideration. Both procedures are demonstrated for geochemical soil data sets from Europe, Australia, and the U.S.A. Several different data sets from Europe deliver comparable results at different scales. Diffuse Pb contamination in surface soil is estimated to be <0.5 mg/kg for Australia, 1-3 mg/kg for Europe, and 1-2 mg/kg, or at least <5 mg/kg, for the U.S.A. The analysis presented here also allows recognition of local contamination sources and can be used to efficiently monitor diffuse contamination at the continental to regional scale.
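Procedure (2), comparing top- and subsoil distributions of the same element, can be caricatured numerically. The quantile-difference estimator below is a simplification of the CP-plot analysis, and the lognormal background with a uniform 2 mg/kg diffuse addition is entirely synthetic:

```python
import numpy as np

def low_end_shift(topsoil, subsoil, pct=25):
    """Estimate a diffuse-contamination offset as the median difference
    between matched lower quantiles (up to `pct`) of top- and subsoil
    concentrations, where a uniform diffuse addition shows up clearly."""
    qs = np.linspace(1, pct, 25)
    return np.median(np.percentile(topsoil, qs) - np.percentile(subsoil, qs))

rng = np.random.default_rng(4)
background = rng.lognormal(np.log(15.0), 0.6, 5000)   # synthetic natural Pb, mg/kg
topsoil = background + 2.0                             # uniform diffuse addition
shift = low_end_shift(topsoil, background)             # recovers ~2 mg/kg
```

Local (point-source) contamination would instead distort the high-concentration end of the CP plot, which is how the method separates the two.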

  16. Glauber's Ising chain between two thermostats

    NASA Astrophysics Data System (ADS)

    Cornu, F.; Hilhorst, H. J.

    2017-04-01

    We consider a one-dimensional Ising model with N spins, each in contact with two thermostats at distinct temperatures, T1 and T2. Under Glauber dynamics the stationary state happens to coincide with the equilibrium state at an effective intermediate temperature T(T1, T2). The system nevertheless carries a nontrivial energy current between the thermostats. By means of the fermionization technique, for a chain initially in equilibrium at an arbitrary temperature T0 we calculate the Fourier transform of the probability P(Q; τ) of the time-integrated energy current Q during a finite time interval τ. In the long-time limit we determine the corresponding generating function for the cumulants per site and per unit of time, ⟨Q^n⟩_c/(Nτ), and explicitly give those with n = 1, 2, 3, 4. We exhibit various phenomena in specific regimes: kinetic mean-field effects when one thermostat flips any spin less often than the other one, as well as dissipation towards a thermostat at zero temperature. Moreover, when the system size N goes to infinity while the effective temperature T vanishes, the cumulants of Q per unit of time grow linearly with N and are equal to those of a random walk process. In two adequate scaling regimes involving T and N we exhibit the dependence of the first correction on the ratio of the spin-spin correlation length ξ(T) to the size N.

  17. Interactive Contributions of Cumulative Peer Stress and Executive Function Deficits to Depression in Early Adolescence

    ERIC Educational Resources Information Center

    Agoston, Anna M.; Rudolph, Karen D.

    2016-01-01

    Exposure to peer stress contributes to adolescent depression, yet not all youth experience these effects. Thus, it is important to identify individual differences that shape the consequences of peer stress. This research investigated the interactive contribution of cumulative peer stress during childhood (second-fifth grades) and executive…

  18. Genetic algorithm-based improved DOA estimation using fourth-order cumulants

    NASA Astrophysics Data System (ADS)

    Ahmed, Ammar; Tufail, Muhammad

    2017-05-01

    Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, resulting in a Multiple Invariance Cumulant ESPRIT algorithm. In existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs; the unused multiple invariances (MIs) should be exploited simultaneously in order to improve estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed that incorporates the MIs present in the sensor array, and better DOA estimation is achieved by minimising this fitness function. The effectiveness of both Newton's method and the GA for this optimisation problem is illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared with existing algorithms, especially at low SNR, with few snapshots, with closely spaced sources, and with high signal and noise correlation. Moreover, it is observed that optimisation using Newton's method is more likely to converge to false local optima, yielding erroneous results, whereas GA-based optimisation is attractive because of its global optimisation capability.
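The GA side of such an optimisation can be sketched generically. The cumulant-matrix fitness itself is not reproduced here; a smooth surrogate fitness with a known minimum stands in for it, and the operators (tournament selection, arithmetic crossover, Gaussian mutation, elitism) are common textbook choices rather than the paper's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(5)

def ga_minimize(fitness, bounds, pop_size=60, n_gen=80, sigma=0.02):
    """Minimal real-coded genetic algorithm: tournament selection,
    arithmetic crossover, Gaussian mutation, and elitism."""
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, len(lo)))
    for _ in range(n_gen):
        fit = np.array([fitness(p) for p in pop])
        elite = pop[np.argmin(fit)].copy()
        # tournament selection between random pairs
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])
        # arithmetic crossover between consecutive parents
        w = rng.random((pop_size, 1))
        children = w * parents + (1.0 - w) * np.roll(parents, 1, axis=0)
        # Gaussian mutation, clipped to the search bounds
        children += sigma * (hi - lo) * rng.standard_normal(children.shape)
        pop = np.clip(children, lo, hi)
        pop[0] = elite                 # keep the best individual unmutated
    fit = np.array([fitness(p) for p in pop])
    return pop[np.argmin(fit)]

# Surrogate fitness with its minimum at (1, -2)
surrogate = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
best = ga_minimize(surrogate, [(-5.0, 5.0), (-5.0, 5.0)])
```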

  19. Mid-21st- century climate changes increase predicted fire occurrence and fire season length, Northern Rocky Mountains, United States

    Treesearch

    Karin L. Riley; Rachel A. Loehman

    2016-01-01

    Climate changes are expected to increase fire frequency, fire season length, and cumulative area burned in the western United States. We focus on the potential impact of mid-21st- century climate changes on annual burn probability, fire season length, and large fire characteristics including number and size for a study area in the Northern Rocky Mountains....

  20. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    PubMed

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
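    The Pareto-style rank-order and cumulative-share analysis described above can be sketched in a few lines; the category counts below are hypothetical placeholders, not the clinic's data.

```python
from collections import Counter

# Hypothetical visit counts per diagnostic category (not the paper's data).
counts = Counter({"dysphonia": 1320, "UVFP": 290, "cough": 226,
                  "stenosis": 150, "papillomatosis": 90, "other": 1147})

total = sum(counts.values())
running = 0.0
pareto = []  # (category, share of visits, cumulative share), rank-ordered
for name, n in counts.most_common():
    running += n / total
    pareto.append((name, n / total, running))
```

Plotting the third column against rank gives the cumulative distribution curve used in a Pareto analysis.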

  1. Cumulative exposure to work-related traumatic events and current post-traumatic stress disorder in New York City's first responders.

    PubMed

    Geronazzo-Alman, Lupo; Eisenberg, Ruth; Shen, Sa; Duarte, Cristiane S; Musa, George J; Wicks, Judith; Fan, Bin; Doan, Thao; Guffanti, Guia; Bresnahan, Michaeline; Hoven, Christina W

    2017-04-01

    Cumulative exposure to work-related traumatic events (CE) is a foreseeable risk for psychiatric disorders in first responders (FRs). Our objective was to examine which indexes of work-related CE could serve as predictors of posttraumatic stress disorder (PTSD) and/or depression in FRs. Cross-sectional examination of previous CE and past-month PTSD outcomes and depression in 209 FRs. Logistic regressions (probable PTSD; probable depression) and Poisson regressions (PTSD score) of the outcomes on work-related CE indexes were performed, adjusting for demographic variables. Differences across occupational groups were also examined. Receiver operating characteristic analysis determined the sensitivity and specificity of the CE indexes. All indexes were significantly and differently associated with PTSD; associations with depression were non-significant. The index capturing the sheer number of different incidents experienced regardless of frequency ('Variety') showed conceptual, practical and statistical advantages compared to the other indexes. In general, the indexes showed poor to fair discrimination accuracy. Work-related CE is specifically associated with PTSD. Focusing on the variety of exposures may be a simple and effective strategy to predict PTSD in FRs. Further research on the sensitivity and specificity of exposure indexes, preferably examined prospectively, is needed and could lead to early identification of individuals at risk. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Developing a new nonbinary SNP fluorescent multiplex detection system for forensic application in China.

    PubMed

    Liu, Yanfang; Liao, Huidan; Liu, Ying; Guo, Juanjuan; Sun, Yi; Fu, Xiaoliang; Xiao, Ding; Cai, Jifeng; Lan, Lingmei; Xie, Pingli; Zha, Lagabaiyila

    2017-04-01

    Nonbinary single-nucleotide polymorphisms (SNPs) are potential forensic genetic markers because their discrimination power is greater than that of normal binary SNPs and they can be used to type highly degraded samples. We previously developed a nonbinary SNP multiplex typing assay. In this study, we selected an additional 20 nonbinary SNPs from the NCBI SNP database and verified them through pyrosequencing. These 20 nonbinary SNPs were analyzed using the fluorescent-labeled SNaPshot multiplex SNP typing method. The allele frequencies and genetic parameters of these 20 nonbinary SNPs were determined among 314 unrelated individuals from Han populations in China. The total power of discrimination was 0.9999999999994, and the cumulative probability of exclusion was 0.9986. Moreover, combining this 20 nonbinary SNP assay with the 20 nonbinary SNP assay we previously developed gave a cumulative probability of exclusion for the 40 nonbinary SNPs of 0.999991, and no significant linkage disequilibrium was observed among the 40 nonbinary SNPs. Thus, we conclude that this new system of 20 nonbinary SNPs can provide highly informative polymorphic data for forensic applications and can serve as a valuable supplement to forensic DNA analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
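    The cumulative probability of exclusion reported above combines per-marker exclusion powers under the usual independence assumption, CPE = 1 - Π(1 - PE_i). A sketch with made-up per-SNP values (the study's per-marker figures are not given in the abstract):

```python
def cumulative_probability_of_exclusion(pe_values):
    """CPE = 1 - product(1 - PE_i): the probability that at least one of a
    set of independent markers excludes a falsely alleged contributor."""
    remaining = 1.0
    for pe in pe_values:
        remaining *= 1.0 - pe
    return 1.0 - remaining

# Illustrative per-SNP exclusion powers; 0.25 each is a made-up round number.
cpe = cumulative_probability_of_exclusion([0.25] * 20)
```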

  3. Nanosecond multiple pulse measurements and the different types of defects

    NASA Astrophysics Data System (ADS)

    Wagner, Frank R.; Natoli, Jean-Yves; Beaudier, Alexandre; Commandré, Mireille

    2017-11-01

    Laser damage measurements with multiple pulses at constant fluence (S-on-1 measurements) are of high practical importance for the design and validation of high power photonic instruments. For nanosecond lasers, it has long been recognized that single pulse laser damage is linked to fabrication-related defects. Models describing the laser damage probability as the probability of encounter between the high fluence region of the laser beam and the fabrication-related defects are thus widely used to analyze the measurements. Nanosecond S-on-1 tests often reveal the "fatigue effect", i.e. a decrease of the laser damage threshold with increasing pulse number. Most authors attribute this effect to cumulative material modifications operated by the first pulses. In this paper we discuss the different situations observed in nanosecond S-on-1 measurements of several different materials at different wavelengths, focusing in particular on the defects involved in the laser damage mechanism. These defects may be fabrication-related or laser-induced, stable or evolutive, cumulative or of short lifetime. We show that the type of defect that dominates an S-on-1 experiment depends on the wavelength and the material under test, and give examples from measurements of nonlinear optical crystals, fused silica and oxide mixture coatings.

  4. Stochastic seismic inversion based on an improved local gradual deformation method

    NASA Astrophysics Data System (ADS)

    Yang, Xiuwei; Zhu, Peimin

    2017-12-01

    A new stochastic seismic inversion method based on the local gradual deformation method is proposed, which can incorporate seismic data, well data, geology and their spatial correlations into the inversion process. Geological information, such as sedimentary facies and structures, can provide significant a priori information to constrain an inversion and arrive at reasonable solutions. The local a priori conditional cumulative distributions at each node of the model to be inverted are first established by indicator cokriging, which integrates well data as hard data and geological information as soft data. Probability field simulation is used to simulate different realizations consistent with the spatial correlations and the local conditional cumulative distributions. The corresponding probability field is generated by the fast Fourier transform moving average method. Optimization is then performed to match the seismic data via an improved local gradual deformation method. Two strategies are proposed to make the method suitable for seismic inversion. The first is to select and update local areas where the fit between synthetic and real seismic data is poor. The second is to divide each seismic trace into several parts and obtain the optimal parameters for each part individually. Applications to a synthetic example and a real case study demonstrate that our approach can effectively find fine-scale acoustic impedance models and provide uncertainty estimations.

  5. Risk of false decision on conformity of a multicomponent material when test results of the components' content are correlated.

    PubMed

    Kuselman, Ilya; Pennecchi, Francesca R; da Silva, Ricardo J N B; Hibbert, D Brynn

    2017-11-01

    The probability of a false decision on conformity of a multicomponent material due to measurement uncertainty is discussed when test results are correlated. Specification limits of the components' content of such a material generate a multivariate specification interval/domain. When true values of components' content and corresponding test results are modelled by multivariate distributions (e.g. by multivariate normal distributions), a total global risk of a false decision on the material conformity can be evaluated based on calculation of integrals of their joint probability density function. No transformation of the raw data is required for that. A total specific risk can be evaluated as the joint posterior cumulative function of true values of a specific batch or lot lying outside the multivariate specification domain, when the vector of test results, obtained for the lot, is inside this domain. It was shown, using a case study of four components under control in a drug, that the correlation influence on the risk value is not easily predictable. To assess this influence, the evaluated total risk values were compared with those calculated for independent test results and also with those assuming much stronger correlation than that observed. While the observed statistically significant correlation did not lead to a visible difference in the total risk values in comparison to the independent test results, the stronger correlation among the variables caused either the total risk decreasing or its increasing, depending on the actual values of the test results. Copyright © 2017 Elsevier B.V. All rights reserved.
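    The total global risk described above can be illustrated for a single component with a simple Monte Carlo sketch: count the cases where the test result falls inside the specification interval while the true value lies outside it. A full treatment would integrate the joint multivariate density with correlations, as in the paper; all numbers below are hypothetical.

```python
import random

def total_global_risk(mu, sigma_true, sigma_meas, spec, n=200_000, seed=7):
    """Monte Carlo estimate of the global risk of a false conformity decision
    for one component: P(test result inside spec AND true value outside spec).
    True content ~ N(mu, sigma_true); measurement error ~ N(0, sigma_meas)."""
    rng = random.Random(seed)
    lo, hi = spec
    false_accept = 0
    for _ in range(n):
        true = rng.gauss(mu, sigma_true)
        result = true + rng.gauss(0.0, sigma_meas)
        if lo <= result <= hi and not (lo <= true <= hi):
            false_accept += 1
    return false_accept / n

# Hypothetical numbers: content centred in a (95, 105) specification interval.
risk = total_global_risk(mu=100.0, sigma_true=2.0, sigma_meas=1.0,
                         spec=(95.0, 105.0))
```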

  6. New insights into galaxy structure from GALPHAT- I. Motivation, methodology and benchmarks for Sérsic models

    NASA Astrophysics Data System (ADS)

    Yoon, Ilsang; Weinberg, Martin D.; Katz, Neal

    2011-06-01

    We introduce a new galaxy image decomposition tool, GALPHAT (GALaxy PHotometric ATtributes), which is a front-end application of the Bayesian Inference Engine (BIE), a parallel Markov chain Monte Carlo package, to provide full posterior probability distributions and reliable confidence intervals for all model parameters. The BIE relies on GALPHAT to compute the likelihood function. GALPHAT generates scale-free cumulative image tables for the desired model family with precise error control. Interpolation of this table yields accurate pixellated images with any centre, scale and inclination angle. GALPHAT then rotates the image by position angle using a Fourier shift theorem, yielding high-speed, accurate likelihood computation. We benchmark this approach using an ensemble of simulated Sérsic model galaxies over a wide range of observational conditions: the signal-to-noise ratio S/N, the ratio of galaxy size to the point spread function (PSF) and the image size, and errors in the assumed PSF; and a range of structural parameters: the half-light radius re and the Sérsic index n. We characterize the strength of parameter covariance in the Sérsic model, which increases with S/N and n, and the results strongly motivate the need for the full posterior probability distribution in galaxy morphology analyses and later inferences. The test results for simulated galaxies successfully demonstrate that, with a careful choice of Markov chain Monte Carlo algorithms and fast model image generation, GALPHAT is a powerful analysis tool for reliably inferring morphological parameters from a large ensemble of galaxies over a wide range of different observational conditions.
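    The Sérsic model benchmarked above has the standard surface-brightness profile; a minimal sketch, using the common b_n ≈ 2n - 1/3 approximation (that approximation is an assumption here, not taken from the paper):

```python
import math

def sersic(r, r_e, n, I_e=1.0):
    """Sersic surface-brightness profile:
    I(r) = I_e * exp(-b_n * ((r / r_e) ** (1 / n) - 1)),
    with the common approximation b_n ~ 2n - 1/3 (reasonable for n ~ 1-10)."""
    b_n = 2.0 * n - 1.0 / 3.0
    return I_e * math.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

# By construction the profile equals I_e at the half-light radius r_e.
at_re = sersic(r=1.0, r_e=1.0, n=4.0)
```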

  7. Do observed or perceived characteristics of the neighborhood environment mediate associations between neighborhood poverty and cumulative biological risk?

    PubMed Central

    Schulz, Amy J.; Mentz, Graciela; Lachance, Laurie; Zenk, Shannon N.; Johnson, Jonetta; Stokes, Carmen; Mandell, Rebecca

    2013-01-01

    Objective To examine contributions of observed and perceived neighborhood characteristics in explaining associations between neighborhood poverty and cumulative biological risk (CBR) in an urban community. Methods Multilevel regression analyses were conducted using cross-sectional data from a probability sample survey (n=919), and observational and census data. Dependent variable: CBR. Independent variables: Neighborhood disorder, deterioration and characteristics; perceived neighborhood social environment, physical environment, and neighborhood environment. Covariates: Neighborhood and individual demographics, health-related behaviors. Results Observed and perceived indicators of neighborhood conditions were significantly associated with CBR, after accounting for both neighborhood and individual level socioeconomic indicators. Observed and perceived neighborhood environmental conditions mediated associations between neighborhood poverty and CBR. Conclusions Findings were consistent with the hypothesis that neighborhood conditions associated with economic divestment mediate associations between neighborhood poverty and CBR. PMID:24100238

  8. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    NASA Astrophysics Data System (ADS)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative density function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
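    The contract-family construction described above can be sketched directly: each contract price is read as an exceedance probability, and the best estimate is interpolated from the implied cumulative distribution. The prices below are invented for illustration, not real market quotes.

```python
# Hypothetical contract prices: each price is read as the market probability
# that the anomaly EXCEEDS the given threshold (degrees C).
thresholds = [0.45, 0.55, 0.65, 0.75, 0.85]
exceedance = [0.95, 0.80, 0.50, 0.20, 0.05]

cdf = [1.0 - p for p in exceedance]  # P(anomaly <= threshold), increasing

def level_crossing(level, ys, xs):
    """Linearly interpolate the x at which the piecewise-linear CDF hits `level`."""
    for (y0, x0), (y1, x1) in zip(zip(ys, xs), zip(ys[1:], xs[1:])):
        if y0 <= level <= y1:
            return x0 + (x1 - x0) * (level - y0) / (y1 - y0)
    raise ValueError("level outside tabulated range")

median_estimate = level_crossing(0.5, cdf, thresholds)  # market best estimate
```

The spread between, say, the 0.05 and 0.95 crossings would give the market-based uncertainty estimate mentioned above.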

  9. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    NASA Astrophysics Data System (ADS)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most such studies in China concern basins in the humid and semi-humid south and east of the country; for the inland river basins, which occupy about 35% of the country's area, such studies remain scarce, partly due to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, the stability features of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m^3/s is finally determined for high flow extremes in Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a slight but not significant decreasing trend from 1978 to 2008, while the intensity of such extremes is comparatively increasing, especially at the higher return levels.
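    The POT/GPD return-level calculation underlying such a study can be sketched with the standard formula; the 340 m^3/s threshold is taken from the abstract, while the scale, shape and exceedance-rate values below are hypothetical.

```python
import math

def gpd_return_level(u, sigma, xi, rate, T):
    """T-year return level for a peaks-over-threshold / GPD model:
    x_T = u + (sigma / xi) * ((rate * T) ** xi - 1),
    where `u` is the threshold, `sigma` the GPD scale, `xi` the shape and
    `rate` the mean number of threshold exceedances per year. The xi -> 0
    case reduces to the exponential (Gumbel-type) limit."""
    if abs(xi) < 1e-12:
        return u + sigma * math.log(rate * T)
    return u + (sigma / xi) * ((rate * T) ** xi - 1.0)

# Threshold from the study; sigma, xi and rate are invented for illustration.
x100 = gpd_return_level(u=340.0, sigma=120.0, xi=0.1, rate=2.5, T=100)
```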

  10. Cumulative evidence for relationships between multiple variants in the VTI1A and TCF7L2 genes and cancer incidence.

    PubMed

    Zhang, Min; Tang, Mingshuang; Fang, Yanfei; Cui, Huijie; Chen, Siyu; Li, Junlong; Xiong, Hongyan; Lu, Jiachun; Gu, Dongqing; Zhang, Ben

    2018-02-01

    Genetic studies have linked the VTI1A-TCF7L2 region with risk of multiple cancers. However, findings from these studies were generally inconclusive. We aimed to provide a synopsis of current understanding of associations between variants in the VTI1A-TCF7L2 region and cancer susceptibility. We conducted a comprehensive research synopsis and meta-analysis to evaluate associations between 17 variants in this region and risk of seven cancers using data from 32 eligible articles totaling 224,656 cancer cases and 324,845 controls. We graded cumulative evidence of significant associations using Venice criteria and false-positive report probability tests. We also conducted analyses to evaluate potential function of these variants using data from the Encyclopedia of DNA Elements (ENCODE) Project. Eight variants showed a nominally significant association with risk of individual cancer (p < 0.05). Cumulative epidemiological evidence of an association was graded as strong for rs7903146 [odds ratio (OR) = 1.05, p = 4.13 × 10^-5] and rs7904519 (OR = 1.07, p = 2.02 × 10^-14) in breast cancer, rs11196172 (OR = 1.11, p = 2.22 × 10^-16), rs12241008 (OR = 1.13, p = 1.36 × 10^-10) and rs10506868 (OR = 1.10, p = 3.98 × 10^-9) in colorectal cancer, rs7086803 in lung cancer (OR = 1.30, p = 3.54 × 10^-18) and rs11196067 (OR = 1.18, p = 3.59 × 10^-13) in glioma, moderate for rs12255372 (OR = 1.12, p = 2.52 × 10^-4) in breast cancer and weak for rs7903146 (OR = 1.11, p = 0.007) in colorectal cancer. Data from ENCODE suggested that seven variants with strong evidence and other correlated variants might fall within putative functional regions. Collectively, our study provides summary evidence that common variants in the VTI1A and TCF7L2 genes are associated with risk of breast, colorectal, lung cancer and glioma and highlights the significant role of the VTI1A-TCF7L2 region in the pathogenesis of human cancers. © 2017 UICC.

  11. A balanced solution to the cumulative threat of industrialized wind farm development on cinereous vultures (Aegypius monachus) in south-eastern Europe.

    PubMed

    Vasilakis, Dimitris P; Whitfield, D Philip; Kati, Vassiliki

    2017-01-01

    Wind farm development can combat climate change but may also threaten bird populations' persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. 
Our 'win-win' approach is appropriate to other potential conflicts where wind farms may cumulatively threaten wildlife populations.

  13. Long-Term Trends in Glaucoma-Related Blindness in Olmsted County, Minnesota

    PubMed Central

    Malihi, Mehrdad; Moura Filho, Edney R.; Hodge, David O.; Sit, Arthur J.

    2013-01-01

    Objective To determine the longitudinal trends in the probability of blindness due to open-angle glaucoma (OAG) in Olmsted County, Minnesota from 1965 to 2009. Design Retrospective, population-based cohort study. Participants All residents of Olmsted County, Minnesota (40 years of age and over) who were diagnosed with OAG between January 1, 1965 and December 31, 2000. Methods All available medical records of every incident case of OAG were reviewed until December 31, 2009 to identify progression to blindness, defined as visual acuity of 20/200 or worse and/or visual field constriction to 20° or less. Kaplan–Meier analysis was used to estimate the cumulative probability of glaucoma-related blindness. Population incidence of blindness within 10 years of diagnosis was calculated using United States Census data. Rates for subjects diagnosed in the period 1965–1980 were compared with rates for subjects diagnosed in the period 1981–2000 using log-rank tests and Poisson regression models. Main Outcome Measures Cumulative probability of OAG-related blindness, and population incidence of blindness within 10 years of diagnosis. Results Probability of glaucoma-related blindness in at least one eye at 20 years decreased from 25.8% (95% confidence interval [CI]: 18.5–32.5) for subjects diagnosed in 1965–1980 to 13.5% (95% CI: 8.8–17.9) for subjects diagnosed in 1981–2000 (P=0.01). The population incidence of blindness within 10 years of diagnosis decreased from 8.7 per 100,000 (95% CI: 5.9–11.5) for subjects diagnosed in 1965–1980 to 5.5 per 100,000 (95% CI: 3.9–7.2) for subjects diagnosed in 1981–2000 (P=0.02). Higher age at diagnosis was associated with increased risk of progression to blindness (P < 0.001). Conclusions The 20-year probability and the population incidence of blindness due to OAG in at least one eye both decreased over the 45-year period from 1965 to 2009.
However, a significant proportion of patients still progress to blindness despite recent diagnostic and therapeutic advancements. PMID:24823760
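    The Kaplan–Meier estimate of cumulative probability used above can be sketched with a minimal estimator; tie handling is simplified here, and the follow-up data are invented, not the study's.

```python
def kaplan_meier(times, events):
    """Minimal Kaplan-Meier estimator. `times` are follow-up times in years;
    `events` flags 1 if the endpoint (e.g. blindness) was observed, 0 if the
    subject was censored. Returns (time, cumulative event probability) at
    each event time. Ties are resolved by stable sort order only."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk   # survival drops at each event
            curve.append((times[i], 1.0 - surv))
        at_risk -= 1                          # events and censorings both leave
    return curve

# Invented follow-up data for six subjects; illustrative only.
curve = kaplan_meier([2, 5, 5, 8, 12, 20], [1, 0, 1, 0, 1, 0])
```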

  14. Probability of Achieving Glycemic Control with Basal Insulin in Patients with Type 2 Diabetes in Real-World Practice in the USA.

    PubMed

    Blonde, Lawrence; Meneghini, Luigi; Peng, Xuejun Victor; Boss, Anders; Rhee, Kyu; Shaunik, Alka; Kumar, Supriya; Balodi, Sidhartha; Brulle-Wohlhueter, Claire; McCrimmon, Rory J

    2018-06-01

    Basal insulin (BI) plays an important role in treating type 2 diabetes (T2D), especially when oral antidiabetic (OAD) medications are insufficient for glycemic control. We conducted a retrospective, observational study using electronic medical records (EMR) data from the IBM® Explorys database to evaluate the probability of achieving glycemic control over 24 months after BI initiation in patients with T2D in the USA. A cohort of 6597 patients with T2D who started BI following OAD(s) and had at least one valid glycated hemoglobin (HbA1c) result recorded both within 90 days before and 720 days after BI initiation were selected. We estimated the changes from baseline in HbA1c every 6 months, the quarterly conditional probabilities of reaching HbA1c < 7% if a patient had not achieved glycemic control prior to each quarter (Q), and the cumulative probability of reaching glycemic control over 24 months. Our cohort was representative of patients with T2D who initiated BI from OADs in the USA. The average HbA1c was 9.1% at BI initiation, and decreased robustly (1.5%) in the first 6 months after initiation with no further reductions thereafter. The conditional probability of reaching glycemic control decreased rapidly in the first year (26.6% in Q2; 17.6% in Q3; 8.6% in Q4), and then remained low (≤ 6.1%) for each quarter in the second year. Cumulatively, about 38% of patients reached HbA1c < 7% in the first year; only approximately 8% more did so in the second year. Our study of real-world data from a large US EMR database suggested that among patients with T2D who initiated BI after OADs, the likelihood of reaching glycemic control diminished over time, and remained low from 12 months onwards. Additional treatment options should be considered if patients do not reach glycemic control within 12 months of BI initiation. Sanofi Corporation.
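    The relationship between the quarterly conditional probabilities and the cumulative probability quoted above is the usual survival-style product. A sketch (the Q1 value is not reported in the abstract, so the 0.20 below is a made-up placeholder):

```python
def cumulative_from_conditional(quarterly):
    """Turn quarterly conditional probabilities of first reaching control
    (given not yet controlled before that quarter) into the cumulative
    probability over time: C_k = 1 - product_{j<=k} (1 - q_j)."""
    not_yet = 1.0
    cumulative = []
    for q in quarterly:
        not_yet *= 1.0 - q
        cumulative.append(1.0 - not_yet)
    return cumulative

# Q2-Q4 conditional probabilities are quoted in the abstract (26.6%, 17.6%,
# 8.6%); the Q1 value of 0.20 is a placeholder, so the resulting curve is
# illustrative rather than a reproduction of the study's 38% figure.
curve = cumulative_from_conditional([0.20, 0.266, 0.176, 0.086])
```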

  15. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e and w(1) = 1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. The relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are also derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
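    Prelec's weighting function is easy to evaluate and check against its fixed points; a minimal sketch (the value α = 0.65 is an illustrative choice, not one estimated in the paper):

```python
import math

def prelec_weight(p, alpha=0.65):
    """Prelec (1998) probability weighting: w(p) = exp(-(-ln p) ** alpha).
    For 0 < alpha < 1 this produces the usual inverse-S shape that
    overweights small probabilities and underweights large ones."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# The fixed points w(1/e) = 1/e and w(1) = 1 hold for every alpha.
at_inv_e = prelec_weight(1.0 / math.e)
```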

  16. Cumulative positive fluid balance is a risk factor for acute kidney injury and requirement for renal replacement therapy after liver transplantation.

    PubMed

    Codes, Liana; de Souza, Ygor Gomes; D'Oliveira, Ricardo Azevedo Cruz; Bastos, Jorge Luiz Andrade; Bittencourt, Paulo Lisboa

    2018-04-24

    To analyze whether fluid overload is an independent risk factor for adverse outcomes after liver transplantation (LT). One hundred and twenty-one patients who underwent LT were retrospectively evaluated. Data regarding perioperative and postoperative variables previously associated with adverse outcomes after LT were reviewed. Cumulative fluid balance (FB) in the first 12 h and 4 d after surgery was compared with major adverse outcomes after LT. Most patients received a liberal fluid administration strategy, with a mean cumulative FB over 5 L and 10 L, respectively, in the first 12 h and 4 d after LT. Cumulative FB over 4 d was independently associated with both the occurrence of acute kidney injury (AKI) and the requirement for renal replacement therapy (RRT) (OR = 2.3; 95%CI: 1.37-3.86, P = 0.02 and OR = 2.89; 95%CI: 1.52-5.49, P = 0.001, respectively). Other variables associated with AKI and RRT on multivariate analysis were, respectively, male sex and Acute Physiology and Chronic Health Evaluation (APACHE II) scores and sepsis or septic shock. Mortality was independently related to aspartate aminotransferase (AST) and APACHE II levels (OR = 2.35; 95%CI: 1.1-5.05, P = 0.02 and 2.63; 95%CI: 1.0-6.87, P = 0.04, respectively), probably reflecting the degree of graft dysfunction and the severity of the early postoperative course of LT. No effect of FB on mortality after LT was found. Cumulative positive FB over 4 d after LT is independently associated with the development of AKI and the requirement for RRT. Survival was not independently related to FB, but to surrogate markers of graft dysfunction and severity of the postoperative course of LT.

  17. Cumulative watershed effects: a research perspective

    Treesearch

    Leslie M. Reid; Robert R. Ziemer

    1989-01-01

    A cumulative watershed effect (CWE) is any response to multiple land-use activities that is caused by, or results in, altered watershed function. The CWE issue is politically defined, as is the significance of particular impacts. But the processes generating CWEs are the traditional focus of geomorphology and ecology, and have thus been studied for decades. The CWE...

  18. Cumulative childhood risk and adult functioning in abused and neglected children grown up

    PubMed Central

    HORAN, JACQUELINE M.; WIDOM, CATHY SPATZ

    2017-01-01

    This paper examines the relationship between childhood exposure to cumulative risk and three indicators of psychosocial adjustment in adulthood (educational attainment, mental health, and criminal behavior) and tests three different models (linear, quadratic, and interaction). Data were collected over several time points from individuals who were part of a prospective cohort design study that matched children with documented cases of abuse and/or neglect with children without such histories and followed them into adulthood. Hierarchical multiple regressions compared linear and quadratic models and then examined potential moderating effects of child abuse/neglect and gender. Exposure to a greater number of childhood risk factors was significantly related to fewer years of education, more anxiety and depression symptomatology, and more criminal arrests in adulthood. The relationship between cumulative risk and years of education demonstrated a curvilinear pattern, whereas the relationship between cumulative risk and both mental health and criminal arrests was linear. Child abuse/neglect did not moderate these relationships, although there were direct effects for both child abuse/neglect and gender on criminal arrests, with more arrests for abused/neglected individuals than controls and more for males than females. Gender interacted with cumulative risk to impact educational attainment and criminal behavior, suggesting that interventions may be more effective if tailored differently for males and females. Interventions may need to be multifaceted and designed to address these different domains of functioning. PMID:25196178

  19. Cumulative childhood risk and adult functioning in abused and neglected children grown up.

    PubMed

    Horan, Jacqueline M; Widom, Cathy Spatz

    2015-08-01

    This paper examines the relationship between childhood exposure to cumulative risk and three indicators of psychosocial adjustment in adulthood (educational attainment, mental health, and criminal behavior) and tests three different models (linear, quadratic, and interaction). Data were collected over several time points from individuals who were part of a prospective cohort design study that matched children with documented cases of abuse and/or neglect with children without such histories and followed them into adulthood. Hierarchical multiple regressions compared linear and quadratic models and then examined potential moderating effects of child abuse/neglect and gender. Exposure to a greater number of childhood risk factors was significantly related to fewer years of education, more anxiety and depression symptomatology, and more criminal arrests in adulthood. The relationship between cumulative risk and years of education demonstrated a curvilinear pattern, whereas the relationship between cumulative risk and both mental health and criminal arrests was linear. Child abuse/neglect did not moderate these relationships, although there were direct effects for both child abuse/neglect and gender on criminal arrests, with more arrests for abused/neglected individuals than controls and more for males than females. Gender interacted with cumulative risk to impact educational attainment and criminal behavior, suggesting that interventions may be more effective if tailored differently for males and females. Interventions may need to be multifaceted and designed to address these different domains of functioning.

  20. Cumulative effects of mothers' risk and promotive factors on daughters' disruptive behavior.

    PubMed

    van der Molen, Elsa; Hipwell, Alison E; Vermeiren, Robert; Loeber, Rolf

    2012-07-01

    Little is known about the ways in which the accumulation of maternal factors increases or reduces risk for girls' disruptive behavior during preadolescence. In the current study, maternal risk and promotive factors and the severity of girls' disruptive behavior were assessed annually among girls ages 7-12 in an urban community sample (N = 2043). Maternal risk and promotive factors were operative at different time points in girls' development. Maternal warmth explained variance in girls' disruptive behavior, even after controlling for maternal risk factors and relevant child and neighborhood factors. In addition, findings supported the cumulative hypothesis that the number of risk factors increased the probability of girls' disruptive behavior disorder (DBD), while the number of promotive factors decreased this probability. Daughters of mothers with a history of Conduct Disorder (CD) were exposed to more risk factors and fewer promotive factors compared to daughters of mothers without prior CD. The identification of malleable maternal factors that can serve as targets for intervention has important implications for intergenerational intervention. Cumulative effects show that the focus of prevention efforts should not be on single factors, but on the multiple factors associated with girls' disruptive behavior.

  1. Cumulative Effects of Mothers’ Risk and Promotive Factors on Daughters’ Disruptive Behavior

    PubMed Central

    Hipwell, Alison E.; Vermeiren, Robert; Loeber, Rolf

    2012-01-01

    Little is known about the ways in which the accumulation of maternal factors increases or reduces risk for girls’ disruptive behavior during preadolescence. In the current study, maternal risk and promotive factors and the severity of girls’ disruptive behavior were assessed annually among girls ages 7–12 in an urban community sample (N=2043). Maternal risk and promotive factors were operative at different time points in girls’ development. Maternal warmth explained variance in girls’ disruptive behavior, even after controlling for maternal risk factors and relevant child and neighborhood factors. In addition, findings supported the cumulative hypothesis that the number of risk factors increased the probability of girls’ disruptive behavior disorder (DBD), while the number of promotive factors decreased this probability. Daughters of mothers with a history of Conduct Disorder (CD) were exposed to more risk factors and fewer promotive factors compared to daughters of mothers without prior CD. The identification of malleable maternal factors that can serve as targets for intervention has important implications for intergenerational intervention. Cumulative effects show that the focus of prevention efforts should not be on single factors, but on the multiple factors associated with girls’ disruptive behavior. PMID:22127641

  2. Hunter-Gatherer Inter-Band Interaction Rates: Implications for Cumulative Culture

    PubMed Central

    Hill, Kim R.; Wood, Brian M.; Baggio, Jacopo; Hurtado, A. Magdalena; Boyd, Robert T.

    2014-01-01

    Our species exhibits spectacular success due to cumulative culture. While cognitive evolution of social learning mechanisms may be partially responsible for adaptive human culture, features of early human social structure may also play a role by increasing the number of potential models from which to learn innovations. We present interview data on interactions between same-sex adult dyads of Ache and Hadza hunter-gatherers living in multiple distinct residential bands (20 Ache bands; 42 Hadza bands; 1201 dyads) throughout a tribal home range. Results show high probabilities (5%–29% per year) of cultural and cooperative interactions between randomly chosen adults. Multiple regression suggests that ritual relationships increase interaction rates more than kinship, and that affinal kin interact more often than dyads with no relationship. These may be important features of human sociality. Finally, yearly interaction rates along with survival data allow us to estimate expected lifetime partners for a variety of social activities, and compare those to chimpanzees. Hadza and Ache men are estimated to observe over 300 men making tools in a lifetime, whereas male chimpanzees interact with only about 20 other males in a lifetime. High intergroup interaction rates in ancestral humans may have promoted the evolution of cumulative culture. PMID:25047714

  3. Hunter-gatherer inter-band interaction rates: implications for cumulative culture.

    PubMed

    Hill, Kim R; Wood, Brian M; Baggio, Jacopo; Hurtado, A Magdalena; Boyd, Robert T

    2014-01-01

    Our species exhibits spectacular success due to cumulative culture. While cognitive evolution of social learning mechanisms may be partially responsible for adaptive human culture, features of early human social structure may also play a role by increasing the number of potential models from which to learn innovations. We present interview data on interactions between same-sex adult dyads of Ache and Hadza hunter-gatherers living in multiple distinct residential bands (20 Ache bands; 42 Hadza bands; 1201 dyads) throughout a tribal home range. Results show high probabilities (5%-29% per year) of cultural and cooperative interactions between randomly chosen adults. Multiple regression suggests that ritual relationships increase interaction rates more than kinship, and that affinal kin interact more often than dyads with no relationship. These may be important features of human sociality. Finally, yearly interaction rates along with survival data allow us to estimate expected lifetime partners for a variety of social activities, and compare those to chimpanzees. Hadza and Ache men are estimated to observe over 300 men making tools in a lifetime, whereas male chimpanzees interact with only about 20 other males in a lifetime. High intergroup interaction rates in ancestral humans may have promoted the evolution of cumulative culture.

  4. Cumulative incidence of functional decline after minor injuries in previously independent older Canadian individuals in the emergency department.

    PubMed

    Sirois, Marie-Josée; Émond, Marcel; Ouellet, Marie-Christine; Perry, Jeffrey; Daoust, Raoul; Morin, Jacques; Dionne, Clermont; Camden, Stéphanie; Moore, Lynne; Allain-Boulé, Nadine

    2013-10-01

    To estimate the cumulative incidence of functional decline in independent older adults 3 and 6 months after a minor injury treated in the emergency department (ED) and to identify predictors of this functional decline. Prospective cohort study. Three Canadian teaching EDs. Individuals aged 65 and older who were independent in basic activities of daily living before their injury and were evaluated in the ED for minor injuries (N = 335). Functional decline was defined as a loss of 2 or more out of 28 points on the self-reported Older Americans Resources Services scale. Sociodemographic, mobility, and clinical risk factors for functional decline in non-ED studies were measured at the ED visit and 3 and 6 months after the injury. Generalized linear mixed models were used to explore differences in functional decline between groups determined according to the different factors. The cumulative incidence of decline was 14.9% (95% confidence interval (CI) = 7.6-29.1%) at 3 months and 17.3% (95% CI = 9.7-30.9%) at 6 months. Predictors of functional decline were occasional use of a walking aid (relative risk (RR)=2.4, 95% CI = 1.4-4.2), needing help in instrumental activities of daily living (IADLs) before the injury (RR = 3.1, 95% CI=1.7-5.5), taking five or more daily medications (RR = 1.8, 95% CI = 1.0-3.2), and the emergency physicians' assessment of functional decline (RR = 2.8, 95% CI = 1.5-5.3). Minor injuries in independent older adults treated in EDs are associated with a 15% cumulative incidence of functional decline 3 months after the injury that persisted 6 months later. Simple-to-measure factors such as occasional use of a walking aid, daily medication, need for help with IADLs, and physician assessment of decline may help identify independent older adults at risk of functional decline during their consultation. These results confirm the need to improve risk assessment and management of this population in EDs. 
© 2013, Copyright the Authors Journal compilation © 2013, The American Geriatrics Society.

  5. Ovarian Suppression With Triptorelin During Adjuvant Breast Cancer Chemotherapy and Long-term Ovarian Function, Pregnancies, and Disease-Free Survival: A Randomized Clinical Trial.

    PubMed

    Lambertini, Matteo; Boni, Luca; Michelotti, Andrea; Gamucci, Teresa; Scotto, Tiziana; Gori, Stefania; Giordano, Monica; Garrone, Ornella; Levaggi, Alessia; Poggio, Francesca; Giraudi, Sara; Bighin, Claudia; Vecchio, Carlo; Sertoli, Mario Roberto; Pronzato, Paolo; Del Mastro, Lucia

    Whether the administration of luteinizing hormone-releasing hormone analogues (LHRHa) during chemotherapy is a reliable strategy to preserve ovarian function is controversial owing to both the lack of data on long-term ovarian function and pregnancies and the safety concerns about the potential negative interactions between endocrine therapy and chemotherapy. To evaluate long-term results of LHRHa-induced ovarian suppression during breast cancer chemotherapy. Parallel, randomized, open-label, phase 3 superiority trial conducted at 16 Italian sites. Between October 2003 and January 2008, 281 premenopausal women with stage I to III hormone receptor-positive or hormone receptor-negative breast cancer were enrolled. Last annual follow-up was June 3, 2014. Patients were randomized to receive adjuvant or neoadjuvant chemotherapy alone (control group) or chemotherapy plus triptorelin (LHRHa group). The primary planned end point was incidence of chemotherapy-induced early menopause. Post hoc end points were long-term ovarian function (evaluated by yearly assessment of menstrual activity and defined as resumed by the occurrence of at least 1 menstrual cycle), pregnancies, and disease-free survival (DFS). A total of 281 women (median age, 39 [range, 24-45] years) were randomized. Median follow-up was 7.3 years (interquartile range, 6.3-8.2 years). The 5-year cumulative incidence estimate of menstrual resumption was 72.6% (95% CI, 65.7%-80.3%) among the 148 patients in the LHRHa group and 64.0% (95% CI, 56.2%-72.8%) among the 133 patients in the control group (hazard ratio [HR], 1.28 [95% CI, 0.98-1.68]; P = .07; age-adjusted HR, 1.48 [95% CI, 1.12-1.95]; P = .006). 
Eight pregnancies (5-year cumulative incidence estimate of pregnancy, 2.1% [95% CI, 0.7%-6.3%]) occurred in the LHRHa group and 3 (5-year cumulative incidence estimate of pregnancy, 1.6% [95% CI, 0.4%-6.2%]) in the control group (HR, 2.56 [95% CI, 0.68-9.60]; P = .14; age-adjusted HR, 2.40 [95% CI, 0.62-9.22]; P = .20). Five-year DFS was 80.5% (95% CI, 73.1%-86.1%) in the LHRHa group and 83.7% (95% CI, 76.1%-89.1%) in the control group (LHRHa vs control: HR, 1.17 [95% CI, 0.72-1.92]; P = .52). Among premenopausal women with either hormone receptor-positive or hormone receptor-negative breast cancer, concurrent administration of triptorelin and chemotherapy, compared with chemotherapy alone, was associated with higher long-term probability of ovarian function recovery, without a statistically significant difference in pregnancy rate. There was no statistically significant difference in DFS for women assigned to triptorelin and those assigned to chemotherapy alone, although study power was limited. clinicaltrials.gov Identifier:NCT00311636.

  6. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated from all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
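
    The GMM-plus-inverse-transform step described above can be sketched as follows, assuming a hypothetical two-component mixture already fitted to the forecasting errors (the weights, means, and standard deviations below are illustrative, not values from the paper):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical 2-component Gaussian mixture "fitted" to forecasting errors
weights = np.array([0.7, 0.3])
means = np.array([0.00, 0.05])
stds = np.array([0.02, 0.08])

def gmm_cdf(x):
    """Analytical mixture CDF: weighted sum of the component normal CDFs."""
    return sum(w * norm.cdf(x, m, s) for w, m, s in zip(weights, means, stds))

# Inverse-transform sampling: draw u ~ U(0,1), then invert the CDF
# numerically on a fine grid to obtain error scenarios.
grid = np.linspace(-0.5, 0.7, 4001)
cdf_on_grid = gmm_cdf(grid)
u = rng.uniform(size=100_000)
scenarios = np.interp(u, cdf_on_grid, grid)
# The sample mean approaches the mixture mean 0.7*0.00 + 0.3*0.05 = 0.015.
```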

  7. Degradation data analysis based on a generalized Wiener process subject to measurement error

    NASA Astrophysics Data System (ADS)

    Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2017-09-01

    Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that takes unit-to-unit variation, time-correlated structure and measurement error into considerations simultaneously. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Then model parameters can be estimated based on a maximum likelihood estimation (MLE) method. The cumulative distribution function (CDF) and the probability distribution function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study is accomplished to demonstrate the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach can derive a reasonable result and an enhanced inference precision.
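
    A simplified simulation of the model class described above, assuming a linear drift on a power-law transformed time scale, Brownian diffusion, and additive Gaussian measurement error (all parameter values below are hypothetical), might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_observed_degradation(t, mu, sigma_b, sigma_eps, n_paths, power=1.0):
    """Simulate observed degradation Y(t) = mu*L(t) + sigma_b*B(L(t)) + eps on
    a transformed time scale L(t) = t**power, where B is standard Brownian
    motion and eps is i.i.d. Gaussian measurement error."""
    lam = np.asarray(t, float) ** power
    dlam = np.diff(np.concatenate(([0.0], lam)))
    inc = rng.normal(mu * dlam, sigma_b * np.sqrt(dlam), size=(n_paths, lam.size))
    latent = inc.cumsum(axis=1)               # true Wiener degradation path
    return latent + rng.normal(0.0, sigma_eps, size=latent.shape)

t = np.linspace(0.5, 50.0, 100)
paths = simulate_observed_degradation(t, mu=0.2, sigma_b=0.5, sigma_eps=0.3,
                                      n_paths=2000)
# Crude empirical failure probability by the end of the test for threshold D;
# the FHT-based failure time distribution is its analytical counterpart.
D = 6.0
p_fail = (paths.max(axis=1) >= D).mean()
```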

  8. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) ismore » analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.« less

  9. Temporary wetlands: Challenges and solutions to conserving a ‘disappearing’ ecosystem

    USGS Publications Warehouse

    Calhoun, Aram J.K.; Mushet, David M.; Bell, Kathleen P.; Boix, Dani; Fitzsimons, James A.; Isselin-Nondedeu, Francis

    2017-01-01

    Frequent drying of ponded water, and support of unique, highly specialized assemblages of often rare species, characterize temporary wetlands, such as vernal pools, gilgais, and prairie potholes. As small aquatic features embedded in a terrestrial landscape, temporary wetlands enhance biodiversity and provide aesthetic, biogeochemical, and hydrologic functions. Challenges to conserving temporary wetlands include the need to: (1) integrate freshwater and terrestrial biodiversity priorities; (2) conserve entire ‘pondscapes’ defined by connections to other aquatic and terrestrial systems; (3) maintain natural heterogeneity in environmental gradients across and within wetlands, especially gradients in hydroperiod; (4) address economic impact on landowners and developers; (5) act without complete inventories of these wetlands; and (6) work within limited or non-existent regulatory protections. Because temporary wetlands function as integral landscape components, not singly as isolated entities, their cumulative loss is ecologically detrimental yet not currently part of the conservation calculus. We highlight approaches that use strategies for conserving temporary wetlands in increasingly human-dominated landscapes that integrate top-down management and bottom-up collaborative approaches. Diverse conservation activities (including education, inventory, protection, sustainable management, and restoration) that reduce landowner and manager costs while achieving desired ecological objectives will have the greatest probability of success in meeting conservation goals.

  10. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-06-30

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
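
    A Monte Carlo sketch of the received-SNR statistics under plain Rician fading (the shadowed-Rician channel in the paper adds a shadowing term omitted here for brevity; the Rice factor and mean SNR are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def rician_snr_samples(k_factor, mean_snr, n):
    """Draw instantaneous-SNR samples for a Rician fading channel with Rice
    factor K (LOS-to-scattered power ratio) and unit-mean channel gain,
    scaled to the given mean SNR. (Simplified: no shadowing term.)"""
    los = np.sqrt(k_factor / (k_factor + 1.0))
    scatter = np.sqrt(1.0 / (k_factor + 1.0))
    h = los + scatter * (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2.0)
    return mean_snr * np.abs(h) ** 2          # E[|h|^2] = 1 by construction

snr = rician_snr_samples(k_factor=4.0, mean_snr=10.0, n=200_000)
# The empirical CDF evaluated at a threshold gives an outage probability:
outage = (snr < 2.0).mean()
```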

  11. Disentangling rotational velocity distribution of stars

    NASA Astrophysics Data System (ADS)

    Curé, Michel; Rial, Diego F.; Cassetti, Julia; Christen, Alejandra

    2017-11-01

    Rotational speed is an important physical parameter of stars: knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. However, rotational speed cannot be measured directly; the observable is v sin(i), the product of the rotational speed and the sine of the inclination angle, so the measured distribution is a convolution of the true speed distribution with the projection effect. The problem can be described by a Fredholm integral equation of the first kind. A new method (Curé et al. 2014) to deconvolve this inverse problem and obtain the cumulative distribution function of stellar rotational velocities is based on the work of Chandrasekhar & Münch (1950). Another method, which yields the probability density function, is the Tikhonov regularization method (Christen et al. 2016). The proposed methods can also be applied to the mass-ratio distribution of extrasolar planets and brown dwarfs (in binary systems; Curé et al. 2015). For stars in a cluster, where all members are gravitationally bound, the standard assumption that rotational axes are uniformly distributed over the sphere is questionable. On the basis of the proposed techniques, a simple approach to model this anisotropy of rotational axes has been developed, making it possible to disentangle simultaneously both the rotational speed distribution and the orientation of the rotational axes.
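
    The projection effect at the heart of this inverse problem is easy to simulate in the forward direction: for isotropically oriented axes, cos(i) is uniform on [0, 1] and E[sin i] = π/4. A sketch with a hypothetical "true" speed distribution (not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)

def project_speeds(v_true, n_draws):
    """Project true equatorial speeds onto the line of sight assuming
    isotropically oriented rotation axes: p(i) di = sin(i) di, which is
    equivalent to cos(i) uniform on [0, 1]."""
    cos_i = rng.uniform(0.0, 1.0, n_draws)
    sin_i = np.sqrt(1.0 - cos_i ** 2)
    v = rng.choice(np.asarray(v_true, float), size=n_draws)
    return v * sin_i

# Hypothetical "true" speed distribution (km/s), for illustration only.
v_true = rng.normal(150.0, 30.0, 10_000).clip(min=0.0)
vsini = project_speeds(v_true, 100_000)
# For isotropic axes E[sin i] = pi/4, so mean(v sin i) ~ (pi/4) * mean(v);
# deconvolution methods like those above invert this smoothing.
```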

  12. Diffusion of non-Gaussianity in heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Kitazawa, Masakiyo; Asakawa, Masayuki; Ono, Hirosato

    2014-05-01

    We investigate the time evolution of higher order cumulants of bulk fluctuations of conserved charges in the hadronic stage in relativistic heavy ion collisions. The dynamical evolution of non-Gaussian fluctuations is modeled by the diffusion master equation. Using this model we predict that the fourth-order cumulant of net-electric charge is suppressed compared with the recently observed second-order one at ALICE for a reasonable parameter range. Significance of the measurements of various cumulants as functions of rapidity window to probe dynamical history of the hot medium created by heavy ion collisions is emphasized.

  13. Volume dependence of baryon number cumulants and their ratios

    DOE PAGES

    Almási, Gábor A.; Pisarski, Robert D.; Skokov, Vladimir V.

    2017-03-17

    Here, we explore the influence of finite-volume effects on cumulants of baryon/quark number fluctuations in a nonperturbative chiral model. In order to account for soft modes, we use the functional renormalization group in a finite volume, using a smooth regulator function in momentum space. We compare the results for a smooth regulator with those for a sharp (or Litim) regulator, and show that in a finite volume the latter produces spurious artifacts. In a finite volume there are only apparent critical points, about which we compute the ratio of the fourth- to the second-order cumulant of quark number fluctuations. Finally, when the volume is sufficiently small the system has two apparent critical points; as the system size decreases, the location of the apparent critical point can move to higher temperature and lower chemical potential.
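
    The fourth-to-second cumulant ratio discussed above can be estimated from samples via central moments (C2 = m2, C4 = m4 − 3·m2²). A sketch using two standard benchmark distributions, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def cumulant_ratio_c4_c2(samples):
    """Estimate the 4th and 2nd cumulants from central moments
    (C2 = m2, C4 = m4 - 3*m2**2) and return the ratio C4/C2."""
    x = np.asarray(samples, float)
    d = x - x.mean()
    c2 = (d ** 2).mean()
    c4 = (d ** 4).mean() - 3.0 * c2 ** 2
    return c4 / c2

# Benchmarks: for a Poisson distribution every cumulant equals the mean,
# so C4/C2 -> 1; for a Gaussian, C4 = 0, so the ratio -> 0.
r_poisson = cumulant_ratio_c4_c2(rng.poisson(5.0, 1_000_000))
r_gauss = cumulant_ratio_c4_c2(rng.normal(0.0, 1.0, 1_000_000))
```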

  14. Revision of 'Cumulative effect of the filamentation and Weibel instabilities in counterstreaming thermal plasmas' [Phys. Plasmas 13, 102107 (2006)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stockem, A.; Lazar, M.; Department of Physics and Engineering Physics, University of Saskatchewan, Saskatoon

    2008-01-15

    The dispersion formalism reported in Lazar et al. [Phys. Plasmas 13, 102107 (2006)] is affected by errors due to a mismatch between the distribution function (1), used to describe the counterstreaming plasmas, and the general dispersion relations (4) and (5), into which distribution function (1) was inserted to find the unstable solutions. The analytical approach is reviewed here, providing a correct analytical and numerical description of the cumulative effect of the filamentation and Weibel instabilities arising in initially counterstreaming plasmas with temperature anisotropies. The growth rates are plotted again, and for the cumulative mode they are orders of magnitude larger than those obtained in Lazar et al. [Phys. Plasmas 13, 102107 (2006)]. Physically, this can be understood as an increase in the efficiency of magnetic field generation, and it enhances the potential role of magnetic instabilities in fast magnetization scenarios in astrophysical applications.

  15. Spatial Interpolation of Rain-field Dynamic Time-Space Evolution in Hong Kong

    NASA Astrophysics Data System (ADS)

    Liu, P.; Tung, Y. K.

    2017-12-01

    Accurate and reliable measurement and prediction of the spatial and temporal distribution of rain fields over a wide range of scales are important topics in hydrologic investigations. In this study, a geostatistical treatment of the precipitation field is adopted. To estimate the rainfall intensity over a study domain from the sample values and the spatial structure of the radar data, the cumulative distribution functions (CDFs) at all unsampled locations were estimated. Indicator Kriging (IK) was used to estimate the exceedance probabilities for different pre-selected cutoff levels, and a procedure was implemented for interpolating CDF values between the thresholds derived from the IK. Different interpolation schemes for the CDF were proposed and their influence on performance was investigated. The performance measures and a visual comparison between the observed rain field and the IK-based estimation suggest that the proposed method provides accurate estimates of the indicator variables and is capable of producing realistic images.
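
    The thresholding-and-interpolation step can be sketched as follows: IK exceedance estimates at a handful of cutoffs are converted to a CDF, corrected for order-relation violations, and interpolated linearly between thresholds (the cutoffs and probabilities below are hypothetical, not values from the study):

```python
import numpy as np

def cdf_from_indicator_estimates(cutoffs, exceed_probs):
    """Convert IK-estimated exceedance probabilities P(Z > z_k) at cutoff
    levels z_k into a monotone CDF F(z_k) = 1 - P(Z > z_k), clipping to
    [0, 1] and enforcing the order relations, then return a callable that
    interpolates F linearly between the cutoffs."""
    F = np.maximum.accumulate(
        np.clip(1.0 - np.asarray(exceed_probs, float), 0.0, 1.0))
    cutoffs = np.asarray(cutoffs, float)
    return lambda z: np.interp(z, cutoffs, F)

# Hypothetical cutoff levels (mm/h) and IK exceedance estimates at one location
cutoffs = [1.0, 5.0, 10.0, 20.0, 50.0]
exceed = [0.85, 0.55, 0.32, 0.10, 0.01]
cdf = cdf_from_indicator_estimates(cutoffs, exceed)
p_le_7_5 = cdf(7.5)   # P(Z <= 7.5) interpolated between the 5 and 10 cutoffs
```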

  16. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  17. 5-Fluorouracil:carnauba wax microspheres for chemoembolization: an in vitro evaluation.

    PubMed

    Benita, S; Zouai, O; Benoit, J P

    1986-09-01

    5-Fluorouracil:carnauba wax microspheres were prepared using a meltable dispersion process with the aid of a surfactant as a wetting agent. It was noted that only hydrophilic surfactants were able to wet the 5-fluorouracil and substantially increased its content in the microspheres. No marked effect was observed in the particle size distribution of the solid microspheres as a function of the nature of the surfactant. Increasing the stirring rate in the preparation process decreased, first, the mean droplet size of the emulsified melted dispersion in the vehicle during the heating process, and, consequently, the mean particle size of the solidified microspheres during the cooling process. 5-Fluorouracil cumulative release from the microspheres followed first-order kinetics, as shown by nonlinear regression analysis. Although the kinetic results were not indicative of the true release mechanism from a single microsphere, it was believed that 5-fluorouracil release from the microspheres was probably governed by a dissolution process, rather than by a leaching process through the carnauba wax microspheres.

  18. Field information links permafrost carbon to physical vulnerabilities of thawing

    USGS Publications Warehouse

    Harden, Jennifer W.; Koven, Charles; Ping, Chien-Lu; Hugelius, Gustaf; McGuire, A. David; Camill, P.; Jorgenson, Torre; Kuhry, Peter; Michaelson, Gary; O'Donnell, Jonathan A.; Schuur, Edward A.G.; Tarnocai, Charles; Johnson, Kevin; Grosse, G.

    2012-01-01

    Deep soil profiles containing permafrost (Gelisols) were characterized for organic carbon (C) and total nitrogen (N) stocks to 3 m depth. Using the Community Climate System Model (CCSM4), we calculate probability density functions (PDFs) for active layer depths under current and future climates. The difference in PDFs over time was multiplied by the C and N contents of soil horizons in Gelisol suborders to calculate newly thawed C and N. Thawing ranged from 147 PgC with 10 PgN by 2050 (representative concentration pathway RCP scenario 4.5) to 436 PgC with 29 PgN by 2100 (RCP 8.5). Organic horizons that thaw are vulnerable to combustion, and all horizon types are vulnerable to shifts in hydrology and decomposition. The rates and extent of such losses are unknown and can be further constrained by linking field and modelling approaches. These changes have the potential for strong additional loading of our atmosphere, water resources, and ecosystems.

  19. Photo-assisted electron emission from illuminated monolayer graphene

    NASA Astrophysics Data System (ADS)

    Upadhyay Kahaly, M.; Misra, Shikha; Mishra, S. K.

    2017-05-01

    We establish a formalism to address co-existing and complementary thermionic and photoelectric emission from a monolayer graphene sheet illuminated by monochromatic laser radiation and operating at a finite temperature. Taking into account the two-dimensional Fermi-Dirac statistics applicable to a graphene sheet, the electron energy redistribution due to thermal agitation via laser irradiation, and Fowler's approach to electron emission, along with the Born approximation to evaluate the tunneling probability, expressions for the photoelectric and thermionic emission flux have been derived. The cumulative emission flux is observed to be sensitive to the parametric tuning of the laser and material specifications. Based on the parametric analysis, the photoemission flux is found to dominate over its coexisting thermionic counterpart for smaller values of the material work function, surface temperature, and laser wavelength; the analytical estimates are in reasonably good agreement with recent experimental observations [Massicotte et al., Nat. Commun. 7, 12174 (2016)]. The results evince the efficient utilization of a graphene layer as a photo-thermionic emitter.

  20. Evaluation of an Ensemble Dispersion Calculation.

    NASA Astrophysics Data System (ADS)

    Draxler, Roland R.

    2003-02-01

    A Lagrangian transport and dispersion model was modified to generate multiple simulations from a single meteorological dataset. Each member of the simulation was computed by assuming a ±1-gridpoint shift in the horizontal direction and a ±250-m shift in the vertical direction of the particle position, with respect to the meteorological data. The configuration resulted in 27 ensemble members. Each member was assumed to have an equal probability. The model was tested by creating an ensemble of daily average air concentrations for 3 months at 75 measurement locations over the eastern half of the United States during the Across North America Tracer Experiment (ANATEX). Two generic graphical displays were developed to summarize the ensemble prediction and the resulting concentration probabilities for a specific event: a probability-exceed plot and a concentration-probability plot. Although a cumulative distribution of the ensemble probabilities compared favorably with the measurement data, the resulting distribution was not uniform. This result was attributed to release height sensitivity. The trajectory ensemble approach accounts for about 41%-47% of the variance in the measurement data. This residual uncertainty is caused by other model and data errors that are not included in the ensemble design.
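
    The 27-member construction and the resulting exceedance probability can be illustrated with a toy concentration field; the gamma-distributed values and the small multiplicative vertical perturbation below are invented stand-ins for re-running the dispersion model with shifted particle positions.

```python
import itertools
import numpy as np

# Toy daily-average concentration field on a 10x10 grid (arbitrary
# units); a stand-in for model output, not ANATEX data.
rng = np.random.default_rng(0)
field = rng.gamma(2.0, 1.0, size=(10, 10))

def member_concentrations(field, i, j):
    """Concentrations attributed to the 27 equal-probability members
    at receptor (i, j): every combination of a -1/0/+1 shift in the
    two horizontal indices plus a +/-1 vertical index (mimicked here
    by a 5% multiplicative perturbation)."""
    vals = []
    for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3):
        vals.append(field[(i + dx) % 10, (j + dy) % 10] * (1 + 0.05 * dz))
    return np.array(vals)

vals = member_concentrations(field, 5, 5)
# With equal member weights, the probability that the concentration
# exceeds a threshold is simply the fraction of members exceeding it.
p_exceed = float(np.mean(vals > 1.0))
```

This is the essence of a concentration-probability summary for a single receptor and event: a threshold sweep over `vals` yields the full exceedance curve.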

  1. Social consequences of multiple sclerosis. Part 2. Divorce and separation: a historical prospective cohort study.

    PubMed

    Pfleger, C C H; Flachs, E M; Koch-Henriksen, Nils

    2010-07-01

    There is a need for follow-up studies of the familial situation of multiple sclerosis (MS) patients. The aim was to evaluate the probability that MS patients remain in a marriage or relationship with the same partner after onset of MS, in comparison with the general population. All 2538 Danes with onset of MS in 1980-1989, retrieved from the Danish MS-Registry, and 50,760 matched, randomly drawn control persons were included. Information on family status was retrieved from Statistics Denmark. Cox analyses were used with onset as the starting point. Five years after onset, the cumulative probability of remaining in the same relationship was 86% in patients vs. 89% in controls. The probabilities continued to diverge, and at 24 years the probability was 33% in patients vs. 53% in controls (p < 0.001). Among patients with young onset (< 36 years of age), those with no children had a higher risk of divorce than those with children younger than 7 years (hazard ratio 1.51; p < 0.0001), and men had a higher risk of divorce than women (hazard ratio 1.33; p < 0.01). MS significantly reduces the probability of remaining in the same relationship compared with the background population.

  2. Modelling detectability of kiore (Rattus exulans) on Aguiguan, Mariana Islands, to inform possible eradication and monitoring efforts

    USGS Publications Warehouse

    Adams, A.A.Y.; Stanford, J.W.; Wiewel, A.S.; Rodda, G.H.

    2011-01-01

    Estimating the detection probability of introduced organisms during the pre-monitoring phase of an eradication effort can be extremely helpful in informing eradication and post-eradication monitoring efforts, but this step is rarely taken. We used data collected during 11 nights of mark-recapture sampling on Aguiguan, Mariana Islands, to estimate introduced kiore (Rattus exulans Peale) density and detection probability, and evaluated factors affecting detectability to help inform possible eradication efforts. Modelling of 62 captures of 48 individuals resulted in a model-averaged density estimate of 55 kiore/ha. Kiore detection probability was best explained by a model allowing neophobia to diminish linearly (i.e. capture probability increased linearly) until occasion 7, with additive effects of sex and cumulative rainfall over the prior 48 hours. Detection probability increased with increasing rainfall, and females were up to three times more likely than males to be trapped. In this paper, we illustrate the type of information that can be obtained by modelling mark-recapture data collected during pre-eradication monitoring and discuss the potential of using these data to inform eradication and post-eradication monitoring efforts. © New Zealand Ecological Society.

  3. Infant Parasympathetic and Sympathetic Activity during Baseline, Stress and Recovery: Interactions with Prenatal Adversity Predict Physical Aggression in Toddlerhood.

    PubMed

    Suurland, J; van der Heijden, K B; Huijbregts, S C J; van Goozen, S H M; Swaab, H

    2018-05-01

    Exposure to prenatal adversity is associated with aggression later in life. Individual differences in autonomic nervous system (ANS) functioning, specifically nonreciprocal activation of the parasympathetic (PNS) and sympathetic (SNS) nervous systems, increase susceptibility to aggression, especially in the context of adversity. Previous work examining interactions between early adversity and ANS functioning in infancy is scarce and has not examined the interaction between the PNS and SNS. This study examined whether the PNS and SNS moderate the relation between cumulative prenatal risk and early physical aggression in 124 children (57% male). Cumulative risk (e.g., maternal psychiatric disorder, substance (ab)use, and social adversity) was assessed during pregnancy. Parasympathetic respiratory sinus arrhythmia (RSA) and sympathetic pre-ejection period (PEP) at baseline, in response to, and during recovery from emotional challenge were measured at 6 months. Physical aggression and non-physical aggression/oppositional behavior were measured at 30 months. The results showed that cumulative prenatal risk predicted elevated physical aggression and non-physical aggression/oppositional behavior in toddlerhood; however, the effects on physical aggression were moderated by PNS and SNS functioning. Specifically, the effects of cumulative risk on physical aggression were particularly evident in children characterized by low baseline PNS activity and/or by nonreciprocal activity of the PNS and SNS, characterized by decreased activity (i.e., coinhibition) or increased activity (i.e., coactivation) of both systems at baseline and/or in response to emotional challenge. These findings extend our understanding of the interaction between perinatal risk and infant ANS functioning on developmental outcome.

  4. The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.

    PubMed

    Kühberger; Schulte-Mecklenbeck; Perner

    1999-06-01

    A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.

  5. Estimating soil moisture exceedance probability from antecedent rainfall

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Kalansky, J.; Stock, J. D.; Collins, B. D.

    2016-12-01

    The first storms of the rainy season in coastal California, USA, add moisture to soils but rarely trigger landslides. Previous workers proposed that antecedent rainfall, the cumulative seasonal rain from October 1 onwards, had to exceed specific amounts in order to trigger landsliding. Recent monitoring of soil moisture upslope of historic landslides in the San Francisco Bay Area shows that storms can cause positive pressure heads once soil moisture values exceed a threshold of volumetric water content (VWC). We propose that antecedent rainfall could be used to estimate the probability that VWC exceeds this threshold. A major challenge to estimating the probability of exceedance is that rain gauge records are frequently incomplete. We developed a stochastic model to impute (infill) missing hourly precipitation data. This model uses nearest neighbor-based conditional resampling of the gauge record using data from nearby rain gauges. Using co-located VWC measurements, imputed data can be used to estimate the probability that VWC exceeds a specific threshold for a given antecedent rainfall. The stochastic imputation model can also provide an estimate of uncertainty in the exceedance probability curve. Here we demonstrate the method using soil moisture and precipitation data from several sites located throughout Northern California. Results show a significant variability between sites in the sensitivity of VWC exceedance probability to antecedent rainfall.
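
    The core estimate (the probability that VWC exceeds a threshold, conditional on antecedent rainfall) can be sketched empirically; the synthetic rainfall-VWC relationship and the 0.30 threshold below are invented for illustration and are not the study's data or calibration.

```python
import numpy as np

# Synthetic example: seasonal antecedent-rainfall totals (mm) paired
# with co-located volumetric water content (VWC) observations.
rng = np.random.default_rng(42)
antecedent = rng.uniform(0, 600, size=2000)
vwc = 0.15 + 0.0004 * antecedent + rng.normal(0, 0.03, size=2000)
threshold = 0.30  # hypothetical VWC threshold for positive pressure heads

def exceedance_curve(antecedent, vwc, threshold, bins):
    """Empirical P(VWC > threshold) within antecedent-rainfall bins."""
    probs = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (antecedent >= lo) & (antecedent < hi)
        probs.append(np.mean(vwc[mask] > threshold) if mask.any() else np.nan)
    return np.array(probs)

bins = np.arange(0, 700, 100)  # 100 mm antecedent-rainfall bins
curve = exceedance_curve(antecedent, vwc, threshold, bins)
```

In the study's setting, repeating this calculation over many stochastic imputations of the incomplete gauge record would yield a family of such curves, and their spread estimates the uncertainty in the exceedance probability.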

  6. The Geothermal Probabilistic Cost Model with an Application to a Geothermal Reservoir at Heber, California

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.

    1981-01-01

    A financial accounting model that incorporates physical and institutional uncertainties was developed for geothermal projects. Among the uncertainties it can handle are well depth, flow rate, fluid temperature, and permit and construction times. The outputs of the model are cumulative probability distributions of financial measures such as capital cost, levelized cost, and profit. These outputs are well suited for use in an investment decision incorporating risk. The model has the powerful feature that conditional probability distributions can be used to account for correlations among any of the input variables. The model has been applied to a geothermal reservoir at Heber, California, for a 45-MW binary electric plant. Under the assumptions made, the reservoir appears to be economically viable.
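
    A Monte Carlo sketch of the model's central idea: draw correlated physical uncertainties and report the cumulative distribution of a cost measure. Every distribution, parameter, and the toy cost formula below is invented for illustration; none comes from the Heber study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Correlated well depth (m) and fluid temperature (deg C): deeper wells
# tend to be hotter, handled via a joint (multivariate normal) draw --
# a simple stand-in for the model's conditional-distribution feature.
mean = [2000.0, 170.0]
cov = [[150.0**2, 0.6 * 150.0 * 15.0],
       [0.6 * 150.0 * 15.0, 15.0**2]]
depth, temp = rng.multivariate_normal(mean, cov, size=n).T
flow = rng.lognormal(np.log(80.0), 0.2, size=n)  # flow rate, kg/s

# Toy cost measure: drilling cost scales with depth, while hotter,
# higher-flow wells deliver more power per dollar.
cost_per_kw = 2.5 * depth / (flow * (temp - 100.0))

# Cumulative probability distribution of the cost measure.
sorted_cost = np.sort(cost_per_kw)
cum_prob = np.arange(1, n + 1) / n
median_cost = sorted_cost[n // 2]
```

Plotting `sorted_cost` against `cum_prob` gives exactly the kind of cumulative probability distribution of capital or levelized cost the model outputs for investment decisions under risk.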

  7. Respiratory morbidity of pattern and model makers exposed to wood, plastic, and metal products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robins, T.G.; Haboubi, G.; Demers, R.Y.

    Pattern and model makers are skilled tradespersons who may be exposed to hardwoods, softwoods, phenol-formaldehyde resin-impregnated woods, epoxy and polyester/styrene resin systems, and welding and metal-casting fumes. The relationship of respiratory symptoms (wheezing, chronic bronchitis, dyspnea) and pulmonary function (FVC% predicted, FEV1% predicted, FEV1/FVC% predicted) with interview-derived cumulative exposure estimates to specific workplace agents and to all work with wood, plastic, or metal products was investigated in 751 pattern and model makers in southeast Michigan. In stratified analyses and age- and smoking-adjusted linear and logistic regression models, measures of cumulative wood exposures were associated with decrements in pulmonary function and dyspnea, but not with other symptoms. In similar analyses, measures of cumulative plastic exposures were associated with wheezing, chronic bronchitis, and dyspnea, but not with decrements in pulmonary function. Prior studies of exposure levels among pattern and model makers and of respiratory health effects of specific agents among other occupational groups support the plausibility of wood-related effects more strongly than that of plastic-related effects.

  8. Predicting Academic Achievement from Cumulative Home Risk: The Mediating Roles of Effortful Control, Academic Relationships, and School Avoidance

    ERIC Educational Resources Information Center

    Swanson, Jodi; Valiente, Carlos; Lemery-Chalfant, Kathryn

    2012-01-01

    Components of the home environment are associated with children's academic functioning. The accumulation of risks in the home are expected to prove more detrimental to achievement than any one risk alone, but the processes accounting for this relation are unclear. Using an index of cumulative home risk (CHR) inclusive of protective factors, as…

  9. Risk identification and prediction of coal workers' pneumoconiosis in Kailuan Colliery Group in China: a historical cohort study.

    PubMed

    Shen, Fuhai; Yuan, Juxiang; Sun, Zhiqian; Hua, Zhengbing; Qin, Tianbang; Yao, Sanqiao; Fan, Xueyun; Chen, Weihong; Liu, Hongbo; Chen, Jie

    2013-01-01

    Prior to 1970, coal mining technology and prevention measures in China were poor. Mechanized coal mining equipment and advanced protection measures were continuously installed in the mines after 1970. All these improvements may have changed the incidence of coal workers' pneumoconiosis (CWP). It is therefore important to identify the characteristics of CWP today and trends in its future incidence. A total of 17,023 coal workers from the Kailuan Colliery Group were studied. A life-table method was used to calculate the cumulative incidence rate of CWP and to predict the number of new CWP patients in the future. The probability of developing CWP was estimated with a multilayer perceptron artificial neural network for each coal worker without CWP. The results showed that the cumulative incidence rates of CWP for tunneling, mining, combining, and helping workers were 31.8%, 27.5%, 24.2%, and 2.6%, respectively, over the same 40-year observation period. It was estimated that there would be 844 new CWP cases among the 16,185 coal workers without CWP within their life expectancy: 273.1, 273.1, 227.6, and 69.9 new CWP patients in the next <10, 10-19, 20-29, and ≥30 years, respectively. Coal workers whose risk probabilities were over 0.2 were identified as being at high risk for CWP, and those whose risk probabilities were under 0.1 as being at low risk. The present and future incidence trends of CWP remain high among coal workers. We suggest that coal workers at high risk of CWP undergo a physical examination for pneumoconiosis every year, and that coal workers at low risk be examined every 5 years.
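
    The actuarial life-table calculation behind a cumulative incidence rate can be sketched in a few lines; the interval counts below are invented, not the Kailuan cohort's data.

```python
def cumulative_incidence(rows):
    """Actuarial life-table estimate. For each exposure interval,
    q_i = cases / (entering - withdrawals/2), and the cumulative
    incidence is 1 - prod_i (1 - q_i)."""
    surv = 1.0
    for _label, entering, cases, withdrawn in rows:
        at_risk = entering - withdrawn / 2.0  # half-interval correction
        surv *= 1.0 - cases / at_risk
    return 1.0 - surv

# Hypothetical intervals of dust exposure:
# (label, workers entering, new CWP cases, withdrawals in interval)
rows = [
    ("0-9 yr",   1000, 10,  50),
    ("10-19 yr",  940, 40,  80),
    ("20-29 yr",  820, 90, 120),
    ("30-39 yr",  610, 70, 200),
]
ci = cumulative_incidence(rows)  # roughly 0.28 for these toy counts
```

The product form is what lets the method accumulate risk across intervals despite workers leaving observation at different exposure durations.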

  10. Engraftment kinetics and graft failure after single umbilical cord blood transplantation using a myeloablative conditioning regimen.

    PubMed

    Ruggeri, Annalisa; Labopin, Myriam; Sormani, Maria Pia; Sanz, Guillermo; Sanz, Jaime; Volt, Fernanda; Michel, Gerard; Locatelli, Franco; Diaz De Heredia, Cristina; O'Brien, Tracey; Arcese, William; Iori, Anna Paola; Querol, Sergi; Kogler, Gesine; Lecchi, Lucilla; Pouthier, Fabienne; Garnier, Federico; Navarrete, Cristina; Baudoux, Etienne; Fernandes, Juliana; Kenzey, Chantal; Eapen, Mary; Gluckman, Eliane; Rocha, Vanderson; Saccardi, Riccardo

    2014-09-01

    Umbilical cord blood transplant recipients are exposed to an increased risk of graft failure, a complication leading to a higher rate of transplant-related mortality. The decision and timing to offer a second transplant after graft failure is challenging. With the aim of addressing this issue, we analyzed engraftment kinetics and outcomes of 1268 patients (73% children) with acute leukemia (64% acute lymphoblastic leukemia, 36% acute myeloid leukemia) in remission who underwent single-unit umbilical cord blood transplantation after a myeloablative conditioning regimen. The median follow-up was 31 months. The overall survival rate at 3 years was 47%; the 100-day cumulative incidence of transplant-related mortality was 16%. Longer time to engraftment was associated with increased transplant-related mortality and shorter overall survival. The cumulative incidence of neutrophil engraftment at day 60 was 86%, while the median time to achieve engraftment was 24 days. Probability density analysis showed that the likelihood of engraftment after umbilical cord blood transplantation increased after day 10, peaked on day 21 and slowly decreased to 21% by day 31. Beyond day 31, the probability of engraftment dropped rapidly, and the residual probability of engrafting after day 42 was 5%. Graft failure was reported in 166 patients, and 66 of them received a second graft (allogeneic, n=45). Rescue actions, such as the search for another graft, should be considered starting after day 21. A diagnosis of graft failure can be established in patients who have not achieved neutrophil recovery by day 42. Moreover, subsequent transplants should not be postponed after day 42. Copyright© Ferrata Storti Foundation.

  11. Role of HIV Infection Duration and CD4 Cell Level at Initiation of Combination Anti-Retroviral Therapy on Risk of Failure

    PubMed Central

    Lodi, Sara; Phillips, Andrew; Fidler, Sarah; Hawkins, David; Gilson, Richard; McLean, Ken; Fisher, Martin; Post, Frank; Johnson, Anne M.; Walker-Nthenda, Louise; Dunn, David; Porter, Kholoud

    2013-01-01

    Background: The development of HIV drug resistance and subsequent virological failure are often cited as potential disadvantages of early cART initiation. However, their long-term probability is not known, and neither is the role of duration of infection at the time of initiation. Methods: Patients enrolled in the UK Register of HIV seroconverters were followed up from cART initiation to the last HIV-RNA measurement. Through survival analysis we examined predictors of virological failure (two consecutive measurements of HIV-RNA ≥400 copies/ml while on cART), including CD4 count and HIV duration at initiation. We also estimated the cumulative probabilities of failure and drug resistance (from the available HIV nucleotide sequences) for early initiators (cART within 12 months of seroconversion). Results: Of 1075 patients starting cART at a median (IQR) CD4 count of 272 (190-370) cells/mm3 and HIV duration of 3 (1-6) years, virological failure occurred in 163 (15%). A higher CD4 count at initiation, but not the duration of HIV infection at cART initiation, was independently associated with a lower risk of failure (p=0.033 and 0.592, respectively). Among 230 patients initiating cART early, 97 (42%) discontinued it after a median of 7 months; the cumulative probabilities of resistance and failure by 8 years were 7% (95% CI 4-11) and 19% (13-25), respectively. Conclusion: Although the rate of discontinuation of early cART in our cohort was high, the long-term rate of virological failure was low. Our data do not support early cART initiation being associated with an increased risk of failure and drug resistance. PMID:24086588
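
    The "cumulative probability of failure by year t" in such survival analyses is typically 1 minus a Kaplan-Meier survival estimate. A minimal hand-rolled sketch, with invented event times and censoring flags (not the cohort's data):

```python
import numpy as np

# Hypothetical follow-up times (years) and failure indicators
# (1 = virological failure observed, 0 = censored).
times  = np.array([2, 3, 3, 5, 7, 8, 8, 10, 12, 15], dtype=float)
failed = np.array([1, 0, 1, 1, 0, 1, 0,  0,  1,  0])

def km_failure_prob(times, events, t):
    """1 - S(t), with S(t) the Kaplan-Meier product-limit estimate."""
    surv = 1.0
    for u in np.unique(times[events == 1]):  # distinct failure times
        if u > t:
            break
        at_risk = np.sum(times >= u)                     # risk set at u
        d = np.sum((times == u) & (events == 1))         # failures at u
        surv *= 1.0 - d / at_risk
    return 1.0 - surv

p8 = km_failure_prob(times, failed, 8.0)  # cumulative failure by year 8
```

Censored patients leave the risk set without contributing a failure, which is what makes the cumulative probability valid despite the high discontinuation rate.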

  12. A zonation technique for landslide susceptibility in southern Taiwan

    NASA Astrophysics Data System (ADS)

    Chiang, Jie-Lun; Tian, Yu-Qing; Chen, Yie-Ruey; Tsai, Kuang-Jung

    2016-04-01

    In recent years, the global climate has changed dramatically, and extreme rainfall events occur frequently, causing massive sediment-related disasters in Taiwan. These disasters seriously affect regional economic development and national infrastructure. For example, in August 2009 Typhoon Morakot brought massive rainfall, especially in the mountains of Chiayi County and Kaohsiung County, where the cumulative maximum rainfall reached 2900 mm; the cumulative maximum rainfall also exceeded 1500 mm in Nantou County, Tainan County and Pingtung County. The typhoon caused severe damage in southern Taiwan. This study investigates the influence of extreme rainfall and hydrological environmental changes on sediment hazards, focusing on southern Taiwan (Chiayi, Tainan, Kaohsiung and Pingtung). Instability-index and kriging methods are applied to analyze the landslide factors and determine susceptibility in southern Taiwan. We collected landslide records for the period 2007-2013 and analyzed the instability factors, including elevation, slope, aspect, soil, and geology. Among these factors, slope received the highest weight: the steeper the slope, the more landslides occur. For the aspect factor, the highest probability falls on the southwest, although this factor has the lowest weight of all the factors. Likewise, darkish colluvial soil holds the highest probability of collapse among the soils, and the Miocene middle Ruifang group and its equivalents have the highest probability of collapse among the geologies. In this study, kriging was used to establish the susceptibility map of southern Taiwan. Instability indices above 4.21 correspond to the landslide records. The potential landslide areas in southern Taiwan, where collapses are more likely to occur, fall in the high and medium-high levels, covering 5.12% and 17.81% of the area, respectively.

  13. Measurement of higher cumulants of net-charge multiplicity distributions in Au+Au collisions at √(s_NN) = 7.7-200 GeV

    NASA Astrophysics Data System (ADS)

    Adare, A.; Afanasiev, S.; Aidala, C.; Ajitanand, N. N.; Akiba, Y.; Akimoto, R.; Al-Bataineh, H.; Alexander, J.; Al-Ta'Ani, H.; Angerami, A.; Aoki, K.; Apadula, N.; Aramaki, Y.; Asano, H.; Aschenauer, E. C.; Atomssa, E. T.; Averbeck, R.; Awes, T. C.; Azmoun, B.; Babintsev, V.; Bai, M.; Baksay, G.; Baksay, L.; Bannier, B.; Barish, K. N.; Bassalleck, B.; Basye, A. T.; Bathe, S.; Baublis, V.; Baumann, C.; Baumgart, S.; Bazilevsky, A.; Belikov, S.; Belmont, R.; Bennett, R.; Berdnikov, A.; Berdnikov, Y.; Bickley, A. A.; Black, D.; Blau, D. S.; Bok, J. S.; Boyle, K.; Brooks, M. L.; Bryslawskyj, J.; Buesching, H.; Bumazhnov, V.; Bunce, G.; Butsyk, S.; Camacho, C. M.; Campbell, S.; Castera, P.; Chen, C.-H.; Chi, C. Y.; Chiu, M.; Choi, I. J.; Choi, J. B.; Choi, S.; Choudhury, R. K.; Christiansen, P.; Chujo, T.; Chung, P.; Chvala, O.; Cianciolo, V.; Citron, Z.; Cole, B. A.; Connors, M.; Constantin, P.; Cronin, N.; Crossette, N.; Csanád, M.; Csörgő, T.; Dahms, T.; Dairaku, S.; Danchev, I.; Das, K.; Datta, A.; Daugherity, M. S.; David, G.; Dehmelt, K.; Denisov, A.; Deshpande, A.; Desmond, E. J.; Dharmawardane, K. V.; Dietzsch, O.; Ding, L.; Dion, A.; Do, J. H.; Donadelli, M.; D'Orazio, L.; Drapier, O.; Drees, A.; Drees, K. A.; Durham, J. M.; Durum, A.; Dutta, D.; Edwards, S.; Efremenko, Y. V.; Ellinghaus, F.; Engelmore, T.; Enokizono, A.; En'yo, H.; Esumi, S.; Eyser, K. O.; Fadem, B.; Fields, D. E.; Finger, M.; Finger, M.; Fleuret, F.; Fokin, S. L.; Fraenkel, Z.; Frantz, J. E.; Franz, A.; Frawley, A. D.; Fujiwara, K.; Fukao, Y.; Fusayasu, T.; Gainey, K.; Gal, C.; Garg, P.; Garishvili, A.; Garishvili, I.; Giordano, F.; Glenn, A.; Gong, H.; Gong, X.; Gonin, M.; Goto, Y.; Granier de Cassagnac, R.; Grau, N.; Greene, S. V.; Grosse Perdekamp, M.; Gu, Y.; Gunji, T.; Guo, L.; Gustafsson, H.-Å.; Hachiya, T.; Haggerty, J. S.; Hahn, K. I.; Hamagaki, H.; Hamblen, J.; Han, R.; Hanks, J.; Hartouni, E. 
P.; Hashimoto, K.; Haslum, E.; Hayano, R.; Hayashi, S.; He, X.; Heffner, M.; Hemmick, T. K.; Hester, T.; Hill, J. C.; Hohlmann, M.; Hollis, R. S.; Holzmann, W.; Homma, K.; Hong, B.; Horaguchi, T.; Hori, Y.; Hornback, D.; Huang, S.; Ichihara, T.; Ichimiya, R.; Ide, J.; Iinuma, H.; Ikeda, Y.; Imai, K.; Imazu, Y.; Imrek, J.; Inaba, M.; Iordanova, A.; Isenhower, D.; Ishihara, M.; Isinhue, A.; Isobe, T.; Issah, M.; Isupov, A.; Ivanishchev, D.; Jacak, B. V.; Javani, M.; Jia, J.; Jiang, X.; Jin, J.; Johnson, B. M.; Joo, K. S.; Jouan, D.; Jumper, D. S.; Kajihara, F.; Kametani, S.; Kamihara, N.; Kamin, J.; Kaneti, S.; Kang, B. H.; Kang, J. H.; Kang, J. S.; Kapustinsky, J.; Karatsu, K.; Kasai, M.; Kawall, D.; Kawashima, M.; Kazantsev, A. V.; Kempel, T.; Key, J. A.; Khandai, P. K.; Khanzadeev, A.; Kijima, K. M.; Kim, B. I.; Kim, C.; Kim, D. H.; Kim, D. J.; Kim, E.; Kim, E.-J.; Kim, H. J.; Kim, K.-B.; Kim, S. H.; Kim, Y.-J.; Kim, Y. K.; Kinney, E.; Kiriluk, K.; Kiss, Á.; Kistenev, E.; Klatsky, J.; Kleinjan, D.; Kline, P.; Kochenda, L.; Komatsu, Y.; Komkov, B.; Konno, M.; Koster, J.; Kotchetkov, D.; Kotov, D.; Kozlov, A.; Král, A.; Kravitz, A.; Krizek, F.; Kunde, G. J.; Kurita, K.; Kurosawa, M.; Kwon, Y.; Kyle, G. S.; Lacey, R.; Lai, Y. S.; Lajoie, J. G.; Lebedev, A.; Lee, B.; Lee, D. M.; Lee, J.; Lee, K.; Lee, K. B.; Lee, K. S.; Lee, S. H.; Lee, S. R.; Leitch, M. J.; Leite, M. A. L.; Leitgab, M.; Leitner, E.; Lenzi, B.; Lewis, B.; Li, X.; Liebing, P.; Lim, S. H.; Linden Levy, L. A.; Liška, T.; Litvinenko, A.; Liu, H.; Liu, M. X.; Love, B.; Luechtenborg, R.; Lynch, D.; Maguire, C. F.; Makdisi, Y. I.; Makek, M.; Malakhov, A.; Malik, M. D.; Manion, A.; Manko, V. I.; Mannel, E.; Mao, Y.; Maruyama, T.; Masui, H.; Masumoto, S.; Matathias, F.; McCumber, M.; McGaughey, P. L.; McGlinchey, D.; McKinney, C.; Means, N.; Meles, A.; Mendoza, M.; Meredith, B.; Miake, Y.; Mibe, T.; Midori, J.; Mignerey, A. C.; Mikeš, P.; Miki, K.; Milov, A.; Mishra, D. K.; Mishra, M.; Mitchell, J. 
T.; Miyachi, Y.; Miyasaka, S.; Mohanty, A. K.; Mohapatra, S.; Moon, H. J.; Morino, Y.; Morreale, A.; Morrison, D. P.; Moskowitz, M.; Motschwiller, S.; Moukhanova, T. V.; Murakami, T.; Murata, J.; Mwai, A.; Nagae, T.; Nagamiya, S.; Nagle, J. L.; Naglis, M.; Nagy, M. I.; Nakagawa, I.; Nakamiya, Y.; Nakamura, K. R.; Nakamura, T.; Nakano, K.; Nattrass, C.; Nederlof, A.; Netrakanti, P. K.; Newby, J.; Nguyen, M.; Nihashi, M.; Niida, T.; Nouicer, R.; Novitzky, N.; Nukariya, A.; Nyanin, A. S.; Obayashi, H.; O'Brien, E.; Oda, S. X.; Ogilvie, C. A.; Oka, M.; Okada, K.; Onuki, Y.; Oskarsson, A.; Ouchida, M.; Ozawa, K.; Pak, R.; Pantuev, V.; Papavassiliou, V.; Park, B. H.; Park, I. H.; Park, J.; Park, S.; Park, S. K.; Park, W. J.; Pate, S. F.; Patel, L.; Pei, H.; Peng, J.-C.; Pereira, H.; Perepelitsa, D. V.; Peresedov, V.; Peressounko, D. Yu.; Petti, R.; Pinkenburg, C.; Pisani, R. P.; Proissl, M.; Purschke, M. L.; Purwar, A. K.; Qu, H.; Rak, J.; Rakotozafindrabe, A.; Ravinovich, I.; Read, K. F.; Reygers, K.; Reynolds, D.; Riabov, V.; Riabov, Y.; Richardson, E.; Riveli, N.; Roach, D.; Roche, G.; Rolnick, S. D.; Rosati, M.; Rosen, C. A.; Rosendahl, S. S. E.; Rosnet, P.; Rukoyatkin, P.; Ružička, P.; Ryu, M. S.; Sahlmueller, B.; Saito, N.; Sakaguchi, T.; Sakashita, K.; Sako, H.; Samsonov, V.; Sano, M.; Sano, S.; Sarsour, M.; Sato, S.; Sato, T.; Sawada, S.; Sedgwick, K.; Seele, J.; Seidl, R.; Semenov, A. Yu.; Sen, A.; Seto, R.; Sett, P.; Sharma, D.; Shein, I.; Shibata, T.-A.; Shigaki, K.; Shimomura, M.; Shoji, K.; Shukla, P.; Sickles, A.; Silva, C. L.; Silvermyr, D.; Silvestre, C.; Sim, K. S.; Singh, B. K.; Singh, C. P.; Singh, V.; Skolnik, M.; Slunečka, M.; Solano, S.; Soltz, R. A.; Sondheim, W. E.; Sorensen, S. P.; Sourikova, I. V.; Sparks, N. A.; Stankus, P. W.; Steinberg, P.; Stenlund, E.; Stepanov, M.; Ster, A.; Stoll, S. P.; Sugitate, T.; Sukhanov, A.; Sun, J.; Sziklai, J.; Takagui, E. 
M.; Takahara, A.; Taketani, A.; Tanabe, R.; Tanaka, Y.; Taneja, S.; Tanida, K.; Tannenbaum, M. J.; Tarafdar, S.; Taranenko, A.; Tarján, P.; Tennant, E.; Themann, H.; Thomas, T. L.; Todoroki, T.; Togawa, M.; Toia, A.; Tomášek, L.; Tomášek, M.; Torii, H.; Towell, R. S.; Tserruya, I.; Tsuchimoto, Y.; Tsuji, T.; Vale, C.; Valle, H.; van Hecke, H. W.; Vargyas, M.; Vazquez-Zambrano, E.; Veicht, A.; Velkovska, J.; Vértesi, R.; Vinogradov, A. A.; Virius, M.; Voas, B.; Vossen, A.; Vrba, V.; Vznuzdaev, E.; Wang, X. R.; Watanabe, D.; Watanabe, K.; Watanabe, Y.; Watanabe, Y. S.; Wei, F.; Wei, R.; Wessels, J.; Whitaker, S.; White, S. N.; Winter, D.; Wolin, S.; Wood, J. P.; Woody, C. L.; Wright, R. M.; Wysocki, M.; Xia, B.; Xie, W.; Yamaguchi, Y. L.; Yamaura, K.; Yang, R.; Yanovich, A.; Ying, J.; Yokkaichi, S.; You, Z.; Young, G. R.; Younus, I.; Yushmanov, I. E.; Zajc, W. A.; Zelenski, A.; Zhang, C.; Zhou, S.; Zolin, L.; Phenix Collaboration

    2016-01-01

    We report the measurement of cumulants (Cn, n = 1,...,4) of the net-charge distributions measured within pseudorapidity |η| < 0.35 in Au+Au collisions at √sNN = 7.7-200 GeV with the PHENIX experiment at the Relativistic Heavy Ion Collider. The ratios of cumulants (e.g., C1/C2, C3/C1) of the net-charge distributions, which can be related to volume-independent susceptibility ratios, are studied as a function of centrality and energy. These quantities are important for understanding the quantum-chromodynamics phase diagram and the possible existence of a critical end point. The measured values are very well described by expectations from negative binomial distributions. We do not observe any nonmonotonic behavior in the ratios of the cumulants as a function of collision energy. The measured values of C1/C2 and C3/C1 can be directly compared to lattice quantum-chromodynamics calculations and thus allow extraction of both the chemical freeze-out temperature and the baryon chemical potential at each center-of-mass energy. The extracted baryon chemical potentials are in excellent agreement with a thermal-statistical analysis model.
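
    The cumulant definitions behind these ratios can be sketched from samples (a minimal illustration using sample central moments; the published analysis additionally applies efficiency and centrality-bin-width corrections, which are omitted here):

```python
import numpy as np

def net_charge_cumulants(samples):
    """First four cumulants of a net-charge distribution:
    C1 = mean, C2 = variance, C3 = third central moment,
    C4 = fourth central moment - 3*C2^2."""
    x = np.asarray(samples, dtype=float)
    c1 = x.mean()
    d = x - c1
    c2 = np.mean(d ** 2)
    c3 = np.mean(d ** 3)
    c4 = np.mean(d ** 4) - 3.0 * c2 ** 2
    return c1, c2, c3, c4

def cumulant_ratios(samples):
    """Volume-independent ratios studied in the paper."""
    c1, c2, c3, c4 = net_charge_cumulants(samples)
    return {"C1/C2": c1 / c2, "C3/C1": c3 / c1}
```

    For a Poisson-distributed net charge all four cumulants are equal, so departures of these ratios from unity carry the physics.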

  14. Individual differences in the activity of the hypothalamic pituitary adrenal axis: Relations to age and cumulative risk in early childhood.

    PubMed

    Holochwost, Steven J; Gariépy, Jean-Louis; Mills-Koonce, W Roger; Propper, Cathi B; Kolacz, Jacek; Granger, Douglas A

    2017-07-01

    This study examined individual differences in the function of the hypothalamic-pituitary-adrenal (HPA) axis with regard to age and cumulative risk during challenging laboratory tasks administered at 6, 12, 24, and 36 months. Saliva samples were collected from a majority-minority sample of N=185 children (57% African American, 50% female) prior to and following these tasks and later assayed for cortisol. Cumulative distal risk was indexed via a composite of maternal marital status, maternal education, income-to-needs ratio, the number of children in the household, and maternal age at childbirth. Probing of hierarchical models in which cortisol levels and age were nested within child revealed significant differences in cortisol as a function of both age and cumulative risk, such that children exposed to high levels of risk exhibited higher levels of cortisol both within and across age. These results highlight the sensitivity of the HPA axis to environmental context at the level of the individual, even as that sensitivity is manifest against the background of species-typical biological development. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Non-Gaussian elliptic-flow fluctuations in PbPb collisions at √sNN = 5.02 TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirunyan, Albert M; et al.

    Event-by-event fluctuations in the elliptic-flow coefficient v2 are studied in PbPb collisions at √sNN = 5.02 TeV using the CMS detector at the CERN LHC. Elliptic-flow probability distributions p(v2) for charged particles with transverse momentum 0.3 < pT < 3.0 GeV and pseudorapidity |η| < 1.0 are determined for different collision centrality classes. The moments of the p(v2) distributions are used to calculate the v2 coefficients based on cumulant orders 2, 4, 6, and 8. A rank ordering of the higher-order cumulant results and nonzero standardized skewness values obtained for the p(v2) distributions indicate non-Gaussian initial-state fluctuation behavior. Bessel-Gaussian and elliptic power fits to the flow distributions are studied to characterize the initial-state spatial anisotropy.
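
    The order-2 and order-4 moment-to-cumulant relations are standard: v2{2}^2 = <v2^2> and v2{4}^4 = 2<v2^2>^2 - <v2^4>. A minimal sketch (ignoring the finite-multiplicity corrections a real analysis needs):

```python
import numpy as np

def v2_cumulant_estimates(v2_samples):
    """v2{2} and v2{4} from moments of an event-by-event p(v2) distribution,
    using v2{2}^2 = <v2^2> and v2{4}^4 = 2*<v2^2>^2 - <v2^4>."""
    v = np.asarray(v2_samples, dtype=float)
    m2 = np.mean(v ** 2)
    m4 = np.mean(v ** 4)
    v2_2 = np.sqrt(m2)
    # event-by-event fluctuations make v2{4} fall below v2{2}
    v2_4 = (2.0 * m2 ** 2 - m4) ** 0.25
    return v2_2, v2_4
```

    With no fluctuations the two estimates coincide; the ordering v2{2} > v2{4} (and the gap among orders 4, 6, 8) is what the skewness analysis above exploits.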

  16. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter), which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous, i.e., quantitative, interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  17. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To remedy this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter), which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous, i.e., quantitative, interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
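
    A statistical fatigue curve of the kind described turns a cycle count into a failure probability. The sketch below assumes, purely for illustration, lognormal scatter of cycles-to-failure about the mean curve; the function name and parameters are hypothetical:

```python
import math

def interconnect_failure_probability(n_cycles, median_cycles, sigma_log):
    """Cumulative probability that an interconnect has failed after n_cycles
    thermal cycles, assuming lognormal scatter of cycles-to-failure about the
    mean fatigue curve (the lognormal form is an illustrative assumption)."""
    z = (math.log(n_cycles) - math.log(median_cycles)) / sigma_log
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    Evaluating this at the design life gives the predicted cumulative fraction of failed interconnects, which is the quantity the cost-reliability optimization trades against interconnect cost.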

  18. Turtle Bayou - 1936 to 1983: case history of a major gas field in south Louisiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cronquist, C.

    1983-01-01

    Turtle Bayou field, located in the middle Miocene trend in S. Louisiana, is nearing the end of a productive life which spans over 30 yr. Discovered by Shell Oil Co. in 1949 after unsuccessful attempts by 2 other majors, the field is a typical, low-relief, moderately faulted Gulf Coast structure, probably associated with deep salt movement. The productive interval includes 22 separate gas-bearing sands in a regressive sequence of sands and shales from approx. 6500 to 12,000 ft. Now estimated to have contained ca 1.2 trillion scf of gas in place, the field had produced a cumulative 702 billion scf through 1982. The cumulative condensate-gas ratio has been 20 bbl/million. Recovery mechanisms in individual reservoirs include strong bottom-water drive, partial edgewater drive, and pressure depletion. Recovery efficiencies in major reservoirs range from 40 to 75% of original gas in place.

  19. Study of the respiratory health of employees in seven European plants that manufacture ceramic fibres.

    PubMed Central

    Trethowan, W N; Burge, P S; Rossiter, C E; Harrington, J M; Calvert, I A

    1995-01-01

    OBJECTIVES: To study the relation between occupational exposure to ceramic fibres during manufacture and respiratory health. METHODS: The respiratory health of 628 current employees in the manufacture of ceramic fibres in seven European plants in three countries was studied with a respiratory questionnaire, lung function tests, and chest radiography. Simultaneous plant hygiene surveys measured subjects' current exposure to airborne ceramic fibres from personal samples with optical microscopy fibre counts. The measured exposures were combined with occupational histories to derive estimates of each subject's cumulative exposure to respirable fibres. Symptoms were related to current and cumulative exposure to ceramic fibres, and lung function and findings from chest radiographs were related to cumulative exposure. RESULTS: The mean duration of employment was 10.2 years and the mean (range) cumulative exposure was 3.84 (0-22.94) (f.ml-1.y). Eye and skin symptoms were frequent in all plants and increased significantly, as did breathlessness and wheeze, with increasing current exposure. Dry cough and stuffy nose were less common in the least exposed group but did not increase with increasing exposure. After adjustment for the effects of age, sex, height, smoking, and past occupational exposures to respiratory hazards, there was a significant decrease in both forced expiratory volume in one second (FEV1) and forced midexpiratory flow related to cumulative exposure in current smokers (P < 0.05) and in FEV1 in ex-smokers (P < 0.05). Small opacities were found in 13% of the chest radiographs; their prevalence was not related to cumulative exposure to ceramic fibres. CONCLUSIONS: It is concluded that exposure to ceramic fibres is associated with irritant symptoms similar to those seen in other exposures to man-made mineral fibres (MMMFs) and that cumulative exposure to respirable ceramic fibres may cause airways obstruction by promoting the effects of cigarette smoke.
PMID:7757174

  20. Study of the respiratory health of employees in seven European plants that manufacture ceramic fibres.

    PubMed

    Trethowan, W N; Burge, P S; Rossiter, C E; Harrington, J M; Calvert, I A

    1995-02-01

    The aim was to study the relation between occupational exposure to ceramic fibres during manufacture and respiratory health. The respiratory health of 628 current employees in the manufacture of ceramic fibres in seven European plants in three countries was studied with a respiratory questionnaire, lung function tests, and chest radiography. Simultaneous plant hygiene surveys measured subjects' current exposure to airborne ceramic fibres from personal samples with optical microscopy fibre counts. The measured exposures were combined with occupational histories to derive estimates of each subject's cumulative exposure to respirable fibres. Symptoms were related to current and cumulative exposure to ceramic fibres, and lung function and findings from chest radiographs were related to cumulative exposure. The mean duration of employment was 10.2 years and the mean (range) cumulative exposure was 3.84 (0-22.94) (f.ml-1.y). Eye and skin symptoms were frequent in all plants and increased significantly, as did breathlessness and wheeze, with increasing current exposure. Dry cough and stuffy nose were less common in the least exposed group but did not increase with increasing exposure. After adjustment for the effects of age, sex, height, smoking, and past occupational exposures to respiratory hazards, there was a significant decrease in both forced expiratory volume in one second (FEV1) and forced midexpiratory flow related to cumulative exposure in current smokers (P < 0.05) and in FEV1 in ex-smokers (P < 0.05). Small opacities were found in 13% of the chest radiographs; their prevalence was not related to cumulative exposure to ceramic fibres. It is concluded that exposure to ceramic fibres is associated with irritant symptoms similar to those seen in other exposures to man-made mineral fibres (MMMFs) and that cumulative exposure to respirable ceramic fibres may cause airways obstruction by promoting the effects of cigarette smoke.
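
    Cumulative exposure in f.ml-1.y is, in essence, fibre concentration multiplied by duration, summed over the work history. A minimal sketch with hypothetical job-spell data:

```python
def cumulative_exposure(job_spells):
    """Cumulative respirable-fibre exposure in f.ml-1.y: the sum over job
    spells of (fibre concentration in f/ml) x (years worked at that level).
    The spell data passed in are hypothetical."""
    return sum(concentration * years for concentration, years in job_spells)
```

    For example, two years at 1.0 f/ml followed by four years at 0.5 f/ml gives 4.0 f.ml-1.y, within the 0-22.94 range reported above.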

  1. Launch pad lightning protection effectiveness

    NASA Technical Reports Server (NTRS)

    Stahmann, James R.

    1991-01-01

    Using the striking-distance theory, which holds that a lightning leader strikes the nearest grounded point on its last jump to earth (a jump equal in length to the striking distance), the probability of striking a point on a structure in the presence of other points can be estimated. The lightning strokes are divided into deciles, each having an average peak current and striking distance. The striking distances are used as radii from the points to generate windows of approach through which the leader must pass to reach a designated point. The projections of the windows on a horizontal plane, as they are rotated through all possible angles of approach, define an area that can be multiplied by the decile stroke density to arrive at the probability of strokes with the window-average striking distance. The sum of all decile probabilities gives the cumulative probability for all strokes. The techniques can be applied to NASA-Kennedy launch pad structures to estimate the lightning protection effectiveness for the crane, gaseous oxygen vent arm, and other points. Streamers from sharp points on the structure provide protection for surfaces having large radii of curvature. The effects of nearby structures can also be estimated.
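
    The decile summation described above can be sketched as follows (the inputs are hypothetical; in practice the window areas would come from the geometric rotation construction described in the abstract):

```python
def expected_annual_strikes(window_areas_km2, stroke_densities):
    """Sum over stroke-current deciles of (window exposure area, km^2) times
    (decile ground-stroke density, strokes/km^2/yr). For small values this
    expected annual count approximates the annual probability of a strike
    to the designated point."""
    return sum(a * d for a, d in zip(window_areas_km2, stroke_densities))
```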

  2. Quantum Dynamics Study of the Isotopic Effect on Capture Reactions: HD, D2 + CH3

    NASA Technical Reports Server (NTRS)

    Wang, Dunyou; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Time-dependent wave-packet-propagation calculations are reported for the isotopic reactions HD + CH3 and D2 + CH3, in six degrees of freedom and for zero total angular momentum. Initial-state-selected reaction probabilities for different initial rotational-vibrational states are presented in this study. This study shows that excitations of HD (D2) enhance the reactivity, whereas excitations of the CH3 umbrella mode have the opposite effect. This is consistent with the reaction of H2 + CH3. The comparison of these three isotopic reactions also shows the isotopic effects in the initial-state-selected reaction probabilities. The cumulative reaction probabilities (CRPs) are obtained by summing over initial-state-selected reaction probabilities. The energy-shift approximation, which accounts for the contribution of degrees of freedom missing in the six-dimensional calculation, is employed to obtain approximate full-dimensional CRPs. A comparison of rate constants shows that the H2 + CH3 reaction is the most reactive, followed by HD + CH3, with D2 + CH3 the least reactive.
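
    The CRP summation and the energy-shift idea can be sketched as follows (the energy grid, linear interpolation, and zero fill below threshold are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def crp(initial_state_probs):
    """Cumulative reaction probability N(E): the sum over initial
    rovibrational states of the state-selected probabilities P_i(E),
    each given on a common energy grid."""
    return np.sum(np.asarray(initial_state_probs, dtype=float), axis=0)

def energy_shifted(prob_on_grid, energies, shift):
    """Energy-shift approximation (sketch): approximate the probability
    including a missing mode by evaluating the reduced-dimensional
    probability at E - shift, where shift is that mode's energy."""
    return np.interp(energies - shift, energies, prob_on_grid, left=0.0)
```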

  3. Understanding the gendered patterns of substance use initiation among adolescents living in rural, central Mexico

    PubMed Central

    Ayers, Stephanie; Marsiglia, Flavio; Hoffman, Steven; Urbaeva, Zhyldyz

    2012-01-01

    Background: Little is known about the age of initiation and gender differences in substance use among adolescents in rural, central Mexico. Methods: The cross-sectional data were collected from students enrolled in the Videobachillerato (VIBA) (video high school) program in Guanajuato, Mexico. Questionnaires asked students about the age at which they had used alcohol, cigarettes, or marijuana for the first time. Kaplan-Meier survival functions were used to estimate whether males and females differed significantly in their cumulative probabilities of initiating substances over time. Results: On average, alcohol is initiated at 14.7 years of age, cigarettes at 15.1 years of age, and marijuana at 16.5 years of age. Over time, males had a significantly higher probability of initiating alcohol (Kaplan-Meier failure curve: Χ2=26.35, p<0.001), cigarettes (Kaplan-Meier failure curve: Χ2=41.90, p<0.001), and marijuana (Kaplan-Meier failure curve: Χ2=38.01, p<0.001) compared to females. Conclusions: These results highlight the gendered patterns of substance use initiation among adolescents in rural, central Mexico and underscore the need for gendered substance use prevention interventions with these adolescents. By putting forth efforts to understand the substance use initiation patterns of adolescents living in rural, central Mexico, culturally specific and efficacious prevention efforts can be tailor-made to create lasting differences. PMID:22421555
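
    A Kaplan-Meier failure curve of the kind used here can be sketched with the product-limit estimator (toy data; a censoring flag of 0 marks respondents who had not yet initiated at interview):

```python
import numpy as np

def km_initiation_curve(ages, initiated):
    """Kaplan-Meier estimate of the cumulative probability of initiation
    by age: 1 - S(t), with S(t) the product-limit survival estimate.
    `initiated` is 1 for an observed initiation at that age, 0 if censored."""
    ages = np.asarray(ages, dtype=float)
    initiated = np.asarray(initiated)
    s = 1.0
    curve = []
    for t in np.unique(ages[initiated == 1]):
        at_risk = np.sum(ages >= t)                      # still initiation-free just before t
        events = np.sum((ages == t) & (initiated == 1))  # initiations at t
        s *= 1.0 - events / at_risk
        curve.append((float(t), 1.0 - s))                # failure probability by age t
    return curve
```

    Comparing such curves between males and females is what the log-rank chi-square statistics above test.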

  4. Significance of the zero sum principle for circadian, homeostatic and allostatic regulation of sleep-wake state in the rat.

    PubMed

    Stephenson, Richard; Caron, Aimee M; Famina, Svetlana

    2016-12-01

    Sleep-wake behavior exhibits diurnal rhythmicity, rebound responses to acute total sleep deprivation (TSD), and attenuated rebounds following chronic sleep restriction (CSR). We investigated how these long-term patterns of behavior emerge from stochastic short-term dynamics of state transition. Male Sprague-Dawley rats were subjected to TSD (1 day × 24 h, N=9) or CSR (10 days × 18 h TSD, N=7) using a rodent walking-wheel apparatus. One baseline day and one recovery day following TSD and CSR were analyzed. The implications of the zero sum principle were evaluated using a Markov model of sleep-wake state transition. Wake bout duration (a combined function of the probability of wake maintenance and the proportional representations of brief and long wake) was a key variable mediating the baseline diurnal rhythms and post-TSD responses of all three states, and the attenuation of the post-CSR rebounds. The post-NREM state transition trajectory was an important factor in REM rebounds. The zero sum constraint ensures that a change in any transition probability always affects the bout frequency and cumulative time of at least two, and usually all three, of wakefulness, NREM and REM. Neural mechanisms controlling wake maintenance may play a pivotal role in the regulation and dysregulation of all three states. Copyright © 2016 Elsevier Inc. All rights reserved.
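
    The zero sum constraint on a Markov model of state transition can be illustrated with a toy three-state chain (the transition probabilities below are hypothetical, not the fitted values):

```python
import numpy as np

# Hypothetical per-epoch transition probabilities among Wake, NREM, REM.
# The zero sum principle appears as the constraint that each row sums to 1,
# so raising any one transition probability must lower others in its row.
P = np.array([[0.95, 0.05, 0.00],   # from Wake
              [0.10, 0.85, 0.05],   # from NREM
              [0.20, 0.05, 0.75]])  # from REM

def stationary_distribution(P):
    """Long-run fraction of time in each state: the left eigenvector of P
    for eigenvalue 1, normalized to sum to 1."""
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return v / v.sum()
```

    Perturbing a single entry of a row (and renormalizing the row) shifts the stationary fractions of all three states, which is the point the abstract makes about cumulative state times.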

  5. Turbulent flame spreading mechanisms after spark ignition

    NASA Astrophysics Data System (ADS)

    Subramanian, V.; Domingo, Pascale; Vervisch, Luc

    2009-12-01

    Numerical simulation of forced ignition is performed in the framework of Large-Eddy Simulation (LES) combined with a tabulated detailed chemistry approach. The objective is to reproduce the flame properties observed in a recent experimental work reporting the probability of ignition in a laboratory-scale burner operating with a non-premixed methane/air mixture [1]. The smallest scales of chemical phenomena, which are unresolved by the LES grid, are approximated with a flamelet model combined with presumed probability density functions, to account for the unresolved part of turbulent fluctuations of species and temperature. One-dimensional flamelets are simulated using GRI-3.0 [2] and tabulated under a set of parameters describing the local mixing and progress of reaction. A non-reacting case was simulated first, to study the unsteady velocity and mixture fields. The time-averaged velocity and mixture fraction, and their respective turbulent fluctuations, are compared against the experimental measurements in order to estimate the prediction capabilities of LES. The time history of the axial and radial components of velocity and of the mixture fraction is accumulated and analysed for different burner regimes. Based on this information, spark ignition is mimicked at selected ignition spots and the dynamics of kernel development is analysed and compared against the experimental observations. The possible link between the success or failure of the ignition and the flow conditions (in terms of velocity and composition) at the sparking time is then explored.
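
    The presumed-PDF averaging step can be sketched with a beta PDF of mixture fraction, a common choice for such models (the beta form, the grid, and the simple quadrature here are illustrative assumptions, not the authors' implementation):

```python
import math
import numpy as np

def beta_pdf(z, mean, var):
    """Presumed beta PDF of mixture fraction Z with the given subgrid
    mean and variance (requires var < mean*(1-mean))."""
    g = mean * (1.0 - mean) / var - 1.0
    a, b = mean * g, (1.0 - mean) * g
    lognorm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return np.exp(lognorm + (a - 1.0) * np.log(z) + (b - 1.0) * np.log(1.0 - z))

def presumed_pdf_average(phi, z, mean, var):
    """Subgrid mean of a tabulated flamelet quantity phi(Z), weighted by
    the presumed PDF on a uniform z grid (simple quadrature)."""
    w = beta_pdf(z, mean, var)
    return float(np.sum(phi * w) / np.sum(w))
```

    Averaging the identity phi(Z) = Z recovers the prescribed mean, a quick consistency check on the tabulation.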

  6. Separation of the low-frequency atmospheric variability into non-Gaussian multidimensional sources by Independent Subspace Analysis

    NASA Astrophysics Data System (ADS)

    Pires, Carlos; Ribeiro, Andreia

    2016-04-01

    An efficient nonlinear method of statistical source separation of space-distributed, non-Gaussian-distributed data is proposed. The method relies on the so-called Independent Subspace Analysis (ISA) and is tested on a long time series of the stream-function field of an atmospheric quasi-geostrophic 3-level model (QG3) simulating the winter monthly variability of the Northern Hemisphere. ISA generalizes Independent Component Analysis (ICA) by looking for multidimensional, minimally dependent, uncorrelated and non-Gaussian-distributed statistical sources among the rotated projections or subspaces of the multivariate probability distribution of the leading principal components of the working field, whereas ICA restricts itself to scalar sources. The rationale of the technique rests upon projection pursuit, which looks for data projections of enhanced interest. In order to accomplish the decomposition, we maximize measures of the sources' non-Gaussianity by contrast functions which are given by squares of nonlinear, cross-cumulant-based correlations involving the variables spanning the sources. Sources are therefore sought that match certain nonlinear data structures. The maximized contrast function is built in such a way that it provides the minimization of the mean square of the residuals of certain nonlinear regressions. The resulting residuals, followed by spherization, provide a new set of nonlinear variable changes that are at once uncorrelated, quasi-independent and quasi-Gaussian, representing an advantage with respect to the Independent Components (scalar sources) obtained by ICA, where the non-Gaussianity is concentrated into the non-Gaussian scalar sources. The new scalar sources obtained by the above process encompass the attractor's curvature, thus providing improved nonlinear model indices of the low-frequency atmospheric variability, which is useful since large circulation indices are nonlinearly correlated.
    The non-Gaussian tested sources (dyads and triads, of two and three dimensions respectively) lead to a dense data concentration along certain curves or surfaces, near which the clusters' centroids of the joint probability density function tend to be located. That favors a better splitting of the QG3 atmospheric model's weather regimes: the positive and negative phases of the Arctic Oscillation and the positive and negative phases of the North Atlantic Oscillation. The leading model's non-Gaussian dyad is associated with a positive correlation between (1) the squared anomaly of the extratropical jet stream and (2) the meridional jet-stream meandering. Triadic sources coming from maximized third-order cross-cumulants between pairwise-uncorrelated components reveal situations of triadic wave resonance and nonlinear triadic teleconnections, only possible thanks to joint non-Gaussianity. Such triadic synergies are accounted for by an information-theoretic measure: the interaction information. The dominant model's triad occurs between anomalies of (1) the North Pole pressure, (2) the jet-stream intensity at the eastern North American boundary and (3) the jet-stream intensity at the eastern Asian boundary. Publication supported by project FCT UID/GEO/50019/2013 - Instituto Dom Luiz.
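
    The kind of cross-cumulant contrast described above can be illustrated with a third-order cross-cumulant of standardized variables: a nonzero value despite vanishing pairwise correlations signals purely triadic (joint) non-Gaussian dependence. A minimal sketch:

```python
import numpy as np

def triadic_cross_cumulant(x, y, z):
    """Third-order cross-cumulant E[xyz] of standardized variables.
    For pairwise-uncorrelated variables a nonzero value indicates
    dependence that no pairwise statistic can see."""
    standardize = lambda a: (a - a.mean()) / a.std()
    return float(np.mean(standardize(x) * standardize(y) * standardize(z)))
```

    The toy triple x, y, z = x*y below is pairwise uncorrelated yet maximally triadic, the same phenomenon the interaction-information measure quantifies.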

  7. Evaluation of NOAA's High Resolution Rapid Refresh (HRRR), 12 km North America Model (NAM12) and 4 km North America Model (NAM4) hub-height wind speed forecasts

    NASA Astrophysics Data System (ADS)

    Pendergrass, W.; Vogel, C. A.

    2013-12-01

    As an outcome of discussions between Duke Energy Generation and NOAA/ARL following the 2009 AMS Summer Community Meeting in Norman, Oklahoma, ARL and Duke Energy Generation (Duke) signed a Cooperative Research and Development Agreement (CRADA) which allows NOAA to conduct atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of forecast hub-height winds from three NOAA atmospheric models. Forecasts of 10 m (surface) and 80 m (hub-height) wind speeds from (1) NOAA/GSD's High Resolution Rapid Refresh (HRRR) model, (2) NOAA/NCEP's 12 km North America Model (NAM12) and (3) NOAA/NCEP's 4 km high-resolution North America Model (NAM4) were evaluated against 18 months of surface-layer wind observations collected at the joint NOAA/Duke Energy research station located at Duke Energy's West Texas Ocotillo wind farm over the period April 2011 through October 2012. HRRR, NAM12 and NAM4 10 m wind speed forecasts were compared with 10 m level wind speed observations measured on the NOAA/ATDD flux tower. Hub-height (80 m) HRRR, NAM12 and NAM4 forecast wind speeds were evaluated against the 80 m operational PMM27-28 meteorological tower supporting the Ocotillo wind farm. For each HRRR update, eight forecast hours (hours 01, 02, 03, 05, 07, 10, 12 and 15) plus the initialization hour (hour 00) were evaluated. For the NAM12 and NAM4 models, forecast hours 00-24 from the 06z initialization were evaluated. Performance measures, or skill scores, based on the absolute error at the 50% cumulative probability level were calculated for each forecast hour. HRRR forecast hour 01 provided the best skill score, with an absolute wind speed error within 0.8 m/s of the observed 10 m wind speed and within 1.25 m/s of the hub-height wind speed at the designated 50% cumulative probability.
    For both the NAM4 and NAM12 models, skill scores were diurnal, with comparable best scores during the day of 0.7 m/s for the observed 10 m wind speed and 1.1 m/s for the hub-height wind speed at the designated 50% cumulative probability level.
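
    The skill score used here, the absolute error at the 50% cumulative probability, is the median absolute error of the forecast. A minimal sketch:

```python
import numpy as np

def error_at_cumulative_probability(forecast, observed, prob=0.5):
    """Absolute wind-speed error at a given cumulative probability of the
    error distribution; prob=0.5 gives the median absolute error used as
    the skill score here."""
    err = np.abs(np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float))
    return float(np.quantile(err, prob))
```

    Half of the forecasts in the verification sample have an absolute error at or below this value, which is what "within 0.8 m/s at the 50% cumulative probability" means above.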

  8. The long-term changes in total ozone, as derived from Dobson measurements at Arosa (1948-2001)

    NASA Astrophysics Data System (ADS)

    Krzyscin, J. W.

    2003-04-01

    The longest possible total ozone time series (Arosa, Switzerland) is examined for a detection of trends. A two-step procedure is proposed to estimate the long-term (decadal) variations in the ozone time series. The first step consists of a standard least-squares multiple regression applied to the total ozone monthly means to parameterize "natural" (related to the oscillations in the atmospheric dynamics) variations in the analyzed time series. The standard proxies for the dynamical ozone variations are used, including the 11-year solar activity cycle and indices of the QBO, ENSO and NAO. We use the detrended time series of temperature at 100 hPa and 500 hPa over Arosa to parameterize short-term variations (with time periods < 1 year) in total ozone related to local changes in the meteorological conditions over the station. The second step consists of a smooth-curve fit to the total ozone residuals (original minus modeled "natural" time series), time differentiation of this curve to obtain local trends, and bootstrapping of the residual time series to estimate the standard error of the local trends. Locally weighted regression and wavelet analysis are used to extract the smooth component of the residual time series. The time integral over the local trend values provides the cumulative long-term change since the beginning of the data. Examining the pattern of the cumulative change, we see periods of total ozone loss (the end of the 1950s up to the early 1960s, probably the effect of the nuclear bomb tests), recovery (the mid 1960s up to the beginning of the 1970s), apparent decrease (the beginning of the 1970s lasting to the mid 1990s, probably the effect of contamination of the atmosphere by anthropogenic substances containing chlorine), and a kind of stabilization or recovery (starting in the mid 1990s, probably the effect of the Montreal Protocol to eliminate substances depleting the ozone layer).
    We can also estimate that a full ozone recovery (a return to the undisturbed total ozone level of the beginning of the 1970s) is expected around 2050. We propose calculating both the time series of local trends and the cumulative long-term change instead of a single trend value derived as the slope of a straight-line fit to the data.
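
    The two derived quantities, local trends and their time integral, can be sketched as follows (a simple moving-average smoother stands in for the locally weighted regression and wavelet fit used in the paper):

```python
import numpy as np

def local_trends_and_cumulative_change(residuals, window=12):
    """Local trend as the time derivative of a smoothed residual series,
    and the cumulative long-term change as its running integral."""
    kernel = np.ones(window) / window
    smooth = np.convolve(residuals, kernel, mode="same")  # stand-in smoother
    trend = np.gradient(smooth)       # local trend per time step
    cumulative = np.cumsum(trend)     # time integral of the local trend
    return trend, cumulative
```

    A single straight-line fit would collapse this whole history into one slope; the local-trend series preserves the loss, recovery and stabilization episodes described above.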

  9. Exploratory analysis of the effect of intravitreal ranibizumab or triamcinolone on worsening of diabetic retinopathy in a randomized clinical trial.

    PubMed

    Bressler, Susan B; Qin, Haijing; Melia, Michele; Bressler, Neil M; Beck, Roy W; Chan, Clement K; Grover, Sandeep; Miller, David G

    2013-08-01

    The standard care for proliferative diabetic retinopathy (PDR) usually is panretinal photocoagulation, an inherently destructive treatment that can cause iatrogenic vision loss. Therefore, evaluating the effects of therapies for diabetic macular edema on development or worsening of PDR might lead to new therapies for PDR. To evaluate the effects of intravitreal ranibizumab or triamcinolone acetonide, administered to treat diabetic macular edema, on worsening of diabetic retinopathy. Exploratory analysis was performed on worsening of retinopathy, defined as 1 or more of the following: (1) worsening from no PDR to PDR, (2) worsening of 2 or more severity levels on reading center assessment of fundus photographs in eyes without PDR at baseline, (3) having panretinal photocoagulation, (4) experiencing vitreous hemorrhage, or (5) undergoing vitrectomy for the treatment of PDR. Community- and university-based ophthalmology practices. Individuals with central-involved diabetic macular edema causing visual acuity impairment. Eyes were assigned randomly to sham with prompt focal/grid laser, 0.5 mg of intravitreal ranibizumab with prompt or deferred (≥24 weeks) laser, or 4 mg of intravitreal triamcinolone acetonide with prompt laser. Three-year cumulative probabilities for retinopathy worsening. For eyes without PDR at baseline, the 3-year cumulative probabilities for retinopathy worsening (P value comparison with sham with prompt laser) were 23% using sham with prompt laser, 18% with ranibizumab with prompt laser (P = .25), 7% with ranibizumab with deferred laser (P = .001), and 37% with triamcinolone with prompt laser (P = .10). For eyes with PDR at baseline, the 3-year cumulative probabilities for retinopathy worsening were 40%, 21% (P = .05), 18% (P = .02), and 12% (P < .001), respectively. Intravitreal ranibizumab appears to be associated with a reduced risk of diabetic retinopathy worsening in eyes with or without PDR. 
Intravitreal triamcinolone also appears to be associated with a reduced risk of PDR worsening. These findings suggest that use of these drugs to prevent worsening of diabetic retinopathy may be feasible. Given the exploratory nature of these analyses, the risk of endophthalmitis following intravitreal injections, and the fact that intravitreal triamcinolone can cause cataract or glaucoma, use of these treatments to reduce the rates of worsening of retinopathy, with or without PDR, does not seem warranted at this time.

  10. Cumulative stress and autonomic dysregulation in a community sample.

    PubMed

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-05-01

    Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain and standard deviation of NN intervals (SDNN) calculated. Initial simple regression analyses revealed that total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra low-frequency (ULF), very low-frequency (VLF) and low-frequency (LF) power and SDNN (all p < 0.05). In hierarchical regression analyses, total cumulative stress and chronic stress each was significantly associated with SDNN and ULF even after the highly significant contributions of age and sex, with no other covariates accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress, and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial.
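
    SDNN, the time-domain HRV index reported here alongside the spectral (ULF/VLF/LF) bands, is simply the standard deviation of the normal-to-normal interbeat intervals:

```python
import numpy as np

def sdnn(nn_intervals_ms):
    """SDNN: sample standard deviation (ms) of normal-to-normal
    interbeat intervals from an ambulatory ECG recording."""
    return float(np.std(np.asarray(nn_intervals_ms, dtype=float), ddof=1))
```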

  11. Cumulative stress and autonomic dysregulation in a community sample

    PubMed Central

    Lampert, Rachel; Tuit, Keri; Hong, Kwang-ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita

    2016-01-01

Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-hour ambulatory ECG monitoring. HRV was analyzed in the frequency domain and the standard deviation of NN intervals (SDNN) calculated. Initial simple regression analyses revealed that total cumulative stress score, chronic stressors, and cumulative adverse life events (CALE) were all inversely associated with ultra low frequency (ULF), very low frequency (VLF), and low frequency (LF) power and SDNN (all p < 0.05). In hierarchical regression analyses, total cumulative stress and chronic stress were each significantly associated with SDNN and ULF even after the highly significant contributions of age and sex, with no other covariates accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone (p's < .05) but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial. PMID:27112063

  12. Late-career unemployment and all-cause mortality, functional disability and depression among the older adults in Taiwan: A 12-year population-based cohort study.

    PubMed

    Chu, Wei-Min; Liao, Wen-Chun; Li, Chi-Rong; Lee, Shu-Hsin; Tang, Yih-Jing; Ho, Hsin-En; Lee, Meng-Chih

    2016-01-01

To evaluate whether late-career unemployment is associated with increased all-cause mortality, functional disability, and depression among older adults in Taiwan. In this long-term prospective cohort study, data were retrieved from the Taiwan Longitudinal Study on Aging. This study was conducted from 1996 to 2007. Complete data from 716 men and 327 women aged 50-64 years were retrieved. Participants were categorized as normally employed or unemployed depending on their employment status in 1996. The cumulative number of unemployment episodes after age 50 was also calculated. Logistic regression analysis was used to examine the association of late-career unemployment, and of the cumulative number of late-career unemployment episodes, with all-cause mortality, functional disability, and depression in 2007. The average age of the participants in 1996 was 56.3 years [interquartile range (IQR) = 7.0]. A total of 871 participants were in the normally employed group, and 172 participants were in the unemployed group. After adjustment for gender, age, level of education, income, self-rated health, and major comorbidities, late-career unemployment was associated with increased all-cause mortality [odds ratio (OR) = 2.79; 95% confidence interval (CI) = 1.74-4.47] and functional disability [OR = 2.33; 95% CI = 1.54-3.55]. The cumulative number of late-career unemployment episodes was also associated with increased all-cause mortality [OR = 1.91; 95% CI = 1.35-2.70] and functional disability [OR = 2.35; 95% CI = 1.55-3.55]. Late-career unemployment and the cumulative number of late-career unemployment episodes are associated with increased all-cause mortality and functional disability. Older adults should be encouraged to maintain normal employment during the later stage of their career before retirement. Employers should routinely examine the fitness for work of older employees to prevent future unemployment. Copyright © 2016. Published by Elsevier Ireland Ltd.

  13. Stochastic static fault slip inversion from geodetic data with non-negativity and bounds constraints

    NASA Astrophysics Data System (ADS)

    Nocquet, J.-M.

    2018-04-01

Although surface displacements observed by geodesy are linear combinations of slip on faults in an elastic medium, determining the spatial distribution of fault slip remains an ill-posed inverse problem. A widely used approach to circumvent this ill-posedness is to add regularization constraints in terms of smoothing and/or damping so that the linear system becomes invertible. However, the choice of regularization parameters is often arbitrary, and sometimes leads to significantly different results. Furthermore, the resolution analysis is usually empirical and cannot be made independently of the regularization. The stochastic approach to inverse problems (Tarantola & Valette 1982; Tarantola 2005) provides a rigorous framework in which the a priori information about the searched parameters is combined with the observations in order to derive posterior probabilities of the unknown parameters. Here, I investigate an approach where the prior probability density function (pdf) is a multivariate Gaussian function, with single truncation to impose positivity of slip or double truncation to impose positivity and upper bounds on slip for interseismic modeling. I show that the joint posterior pdf is similar to the linear untruncated Gaussian case and can be expressed as a Truncated Multi-Variate Normal (TMVN) distribution. The TMVN form can then be used to obtain semi-analytical formulas for the single, two-dimensional or n-dimensional marginal pdfs. The semi-analytical formula involves the product of a Gaussian by an integral term that can be evaluated using recent developments in TMVN probability calculations (e.g. Genz & Bretz 2009). Posterior mean and covariance can also be efficiently derived. I show that the Maximum Posterior (MAP) can be obtained using a Non-Negative Least-Squares algorithm (Lawson & Hanson 1974) for the single truncated case or using the Bounded-Variable Least-Squares algorithm (Stark & Parker 1995) for the double truncated case.
I show that the case of independent uniform priors can be approximated using TMVN. The numerical equivalence to Bayesian inversions using Markov chain Monte Carlo (MCMC) sampling is shown for a synthetic example and a real case of interseismic modeling in Central Peru. The TMVN method overcomes several limitations of the Bayesian approach using MCMC sampling. First, the need for computing power is greatly reduced. Second, unlike Bayesian MCMC-based approaches, marginal pdfs, means, variances or covariances are obtained independently of one another. Third, the probability density and cumulative distribution functions can be obtained with any density of points. Finally, determining the Maximum Posterior (MAP) is extremely fast.
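The MAP estimate described above reduces to a least-squares problem with box constraints, for which SciPy provides a BVLS solver. A minimal sketch with a toy Green's-function matrix and a Gaussian prior stacked into one augmented system (the sizes, noise levels, bounds and prior here are illustrative assumptions, not the paper's setup):

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(1)
n_obs, n_patch = 30, 10
G = rng.normal(size=(n_obs, n_patch))            # toy Green's functions
m_true = np.clip(rng.normal(0.5, 0.3, n_patch), 0.0, 1.0)
d = G @ m_true + rng.normal(0.0, 0.01, n_obs)    # synthetic displacements

sigma_d, sigma_m = 0.01, 1.0                     # data / prior std devs
m0 = np.zeros(n_patch)                           # prior mean slip
# Gaussian data misfit and Gaussian prior stacked into one augmented
# least-squares system; box bounds give the doubly truncated case
# (positivity plus an upper bound on slip).
A = np.vstack([G / sigma_d, np.eye(n_patch) / sigma_m])
b = np.concatenate([d / sigma_d, m0 / sigma_m])
map_sol = lsq_linear(A, b, bounds=(0.0, 1.0), method='bvls')
print(map_sol.x.round(3))
```

Dropping the upper bound (`bounds=(0.0, np.inf)`) recovers the singly truncated, non-negative case.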

  14. Rate coefficients from quantum and quasi-classical cumulative reaction probabilities for the S(1D) + H2 reaction

    NASA Astrophysics Data System (ADS)

    Jambrina, P. G.; Lara, Manuel; Menéndez, M.; Launay, J.-M.; Aoiz, F. J.

    2012-10-01

Cumulative reaction probabilities (CRPs) at various total angular momenta have been calculated for the barrierless reaction S(1D) + H2 → SH + H at total energies up to 1.2 eV using three different theoretical approaches: time-independent quantum mechanics (QM), quasiclassical trajectories (QCT), and statistical quasiclassical trajectories (SQCT). The calculations have been carried out on the widely used potential energy surface (PES) by Ho et al. [J. Chem. Phys. 116, 4124 (2002), 10.1063/1.1431280] as well as on the recent PES developed by Song et al. [J. Phys. Chem. A 113, 9213 (2009), 10.1021/jp903790h]. The results show that the differences between these two PESs are relatively minor and mostly related to the different topologies of the well. In addition, the agreement between the three theoretical methodologies is good, even for the highest total angular momenta and energies. In particular, the good agreement between the CRPs obtained with dynamical methods (QM and QCT) and the statistical model (SQCT) indicates that the reaction can be considered statistical in the whole range of energies, in contrast with the findings for other prototypical barrierless reactions. In addition, total CRPs and rate coefficients in the range of 20-1000 K have been calculated using the QCT and SQCT methods and found to be somewhat smaller than the experimental total removal rates of S(1D).
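The step from a cumulative reaction probability N(E) to a thermal rate coefficient is a Boltzmann average over total energy, k(T) ∝ ∫ N(E) e^(−E/kBT) dE divided by the reactant partition function. The sketch below evaluates just the Boltzmann-weighted integral numerically for a toy, linearly rising N(E); the form of N(E), the grid, and the omitted prefactor are illustrative assumptions, not values from the paper:

```python
import numpy as np

kB = 8.617333e-5                        # Boltzmann constant, eV/K
E = np.linspace(0.0, 1.2, 2401)         # total-energy grid (eV)
N_E = 40.0 * E                          # toy CRP rising from zero (barrierless)

def boltzmann_weighted_crp(T):
    """Trapezoid-rule integral of N(E) * exp(-E / kB T) over the grid."""
    w = N_E * np.exp(-E / (kB * T))
    return float(np.sum((w[1:] + w[:-1]) * np.diff(E)) / 2.0)

print(boltzmann_weighted_crp(300.0) < boltzmann_weighted_crp(1000.0))  # rises with T
```

For this toy N(E) the integral is analytically ≈ 40 (kBT)² once the grid extends well past kBT, which makes the temperature dependence easy to check.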

  15. Phylogeographic Insights into a Peripheral Refugium: The Importance of Cumulative Effect of Glaciation on the Genetic Structure of Two Endemic Plants

    PubMed Central

    Zecca, Giovanni; Minuto, Luigi

    2016-01-01

Quaternary glaciations, and mostly the last glacial maximum, have shaped the contemporary distribution of many species in the Alps. However, in the Maritime and Ligurian Alps a more complex picture is suggested by the presence of many Tertiary paleoendemisms and by the divergence time between lineages in one endemic species predating the Late Pleistocene glaciation. The low number of endemic species studied limits our understanding of the processes that took place within this region. We used species distribution models and phylogeographical methods to infer glacial refugia and to reconstruct the phylogeographical pattern of Silene cordifolia All. and Viola argenteria Moraldo & Forneris. The predicted suitable area for the last glacial maximum roughly fitted the currently known distribution. Our results suggest that the separation of the major clades predates the last glacial maximum and that the subsequent repeated glacial and interglacial periods probably drove differentiation. The complex phylogeographical pattern observed in the study species suggests that extinction of both populations and genotypes was minimal during the last glacial maximum, probably due to the low impact of glaciations and to the topographic complexity of this area. This study underlines the importance of the cumulative effect of previous glacial cycles in shaping the genetic structure of plant species in the Maritime and Ligurian Alps, as expected for a Mediterranean mountain region rather than an Alpine region. PMID:27870888

  16. Inverse probability of treatment-weighted competing risks analysis: an application on long-term risk of urinary adverse events after prostate cancer treatments.

    PubMed

    Bolch, Charlotte A; Chu, Haitao; Jarosek, Stephanie; Cole, Stephen R; Elliott, Sean; Virnig, Beth

    2017-07-10

To illustrate the 10-year risks of urinary adverse events (UAEs) among men diagnosed with prostate cancer and treated with different types of therapy, accounting for the competing risk of death. Prostate cancer is the second most common malignancy among adult males in the United States. Few studies have reported the long-term post-treatment risk of UAEs, and those that have did not appropriately account for competing deaths. This paper conducts an inverse probability of treatment (IPT) weighted competing risks analysis to estimate the effects of different prostate cancer treatments on the risk of UAE, using a matched cohort of prostate cancer/non-cancer control patients from the Surveillance, Epidemiology and End Results (SEER) Medicare database. The study dataset included men aged 66 years or older (83% white) with a median follow-up time of 4.14 years. Patients who underwent combination radical prostatectomy and external beam radiotherapy experienced the highest risk of UAE (IPT-weighted competing risks: HR 3.65 with 95% CI (3.28, 4.07); 10-yr. cumulative incidence = 36.5%). Findings suggest that IPT-weighted competing risks analysis provides an accurate estimator of the cumulative incidence of UAE, taking into account competing deaths as well as measured confounding bias.
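The IPT weighting step can be sketched in a few lines: each subject is weighted by the inverse of the estimated probability of the treatment actually received. The propensity scores below are assumed to come from some fitted model, and stabilization by the marginal treatment probability is a common variant, not necessarily the authors' exact choice:

```python
import numpy as np

def ipt_weights(treated, propensity, stabilized=True):
    """Inverse-probability-of-treatment weights from propensity scores."""
    treated = np.asarray(treated, dtype=bool)
    p = np.asarray(propensity, dtype=float)
    w = np.where(treated, 1.0 / p, 1.0 / (1.0 - p))
    if stabilized:  # multiply by marginal treatment probability to tame extremes
        pt = treated.mean()
        w *= np.where(treated, pt, 1.0 - pt)
    return w

t = np.array([1, 1, 0, 0, 1])                 # hypothetical treatment indicators
ps = np.array([0.8, 0.6, 0.3, 0.2, 0.5])      # hypothetical propensity scores
print(ipt_weights(t, ps).round(2))
```

The resulting weights would then enter a weighted competing-risks estimator (e.g., a weighted cumulative incidence function), which is beyond this sketch.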

  17. Long-term consistency in spatial patterns of primate seed dispersal.

    PubMed

    Heymann, Eckhard W; Culot, Laurence; Knogge, Christoph; Noriega Piña, Tony Enrique; Tirado Herrera, Emérita R; Klapproth, Matthias; Zinner, Dietmar

    2017-03-01

Seed dispersal is a key ecological process in tropical forests, with effects on various levels ranging from plant reproductive success to the carbon storage potential of tropical rainforests. On a local and landscape scale, spatial patterns of seed dispersal create the template for the recruitment process and thus influence the population dynamics of plant species. The strength of this influence will depend on the long-term consistency of spatial patterns of seed dispersal. We examined the long-term consistency of spatial patterns of seed dispersal with spatially explicit data on seed dispersal by two neotropical primate species, Leontocebus nigrifrons and Saguinus mystax (Callitrichidae), collected during four independent studies between 1994 and 2013. Using distributions of dispersal probability over distances independent of plant species, cumulative dispersal distances, and kernel density estimates, we show that spatial patterns of seed dispersal are highly consistent over time. For a specific plant species, the legume Parkia panurensis, the convergence of cumulative distributions at a distance of 300 m, and the high probability of dispersal within 100 m from source trees coincide with the dimension of the spatial-genetic structure on the embryo/juvenile (300 m) and adult stage (100 m), respectively, of this plant species. Our results are the first demonstration of long-term consistency of spatial patterns of seed dispersal created by tropical frugivores. Such consistency may translate into idiosyncratic patterns of regeneration.

  18. Projecting cumulative benefits of multiple river restoration projects: an example from the Sacramento-San Joaquin River system in California

    USGS Publications Warehouse

    Kondolf, G. Mathias; Angermeier, Paul L.; Cummins, Kenneth; Dunne, Thomas; Healey, Michael; Kimmerer, Wim; Moyle, Peter B.; Murphy, Dennis; Patten, Duncan; Railsback, Steve F.; Reed, Denise J.; Spies, Robert B.; Twiss, Robert

    2008-01-01

Despite increasingly large investments, the potential ecological effects of river restoration programs are still small compared to the degree of human alterations to physical and ecological function. Thus, it is rarely possible to “restore” pre-disturbance conditions; rather, restoration programs (even large, well-funded ones) will nearly always involve multiple small projects, each of which can make some modest change to selected ecosystem processes and habitats. At present, such projects are typically selected based on their attributes as individual projects (e.g., consistency with programmatic goals of the funders, scientific soundness, acceptance by local communities, and ease of implementation). Projects are rarely prioritized (at least explicitly) based on how they will cumulatively affect ecosystem function over coming decades. Such projections require an understanding of the form of the restoration response curve, or at least that we assume some plausible relations and estimate cumulative effects based thereon. Drawing on our experience with the CALFED Bay-Delta Ecosystem Restoration Program in California, we consider potential cumulative system-wide benefits of a restoration activity extensively implemented in the region: isolating/filling abandoned floodplain gravel pits captured by rivers to reduce predation of outmigrating juvenile salmon by exotic warmwater species inhabiting the pits. We present a simple spreadsheet model to show how different assumptions about gravel pit bathymetry and predator behavior would affect the cumulative benefits of multiple pit-filling and isolation projects, and how these insights could help managers prioritize which pits to fill.

  19. The Solar Wind and Geomagnetic Activity as a Function of Time Relative to Corotating Interaction Regions

    NASA Technical Reports Server (NTRS)

    McPherron, Robert L.; Weygand, James

    2006-01-01

    Corotating interaction regions during the declining phase of the solar cycle are the cause of recurrent geomagnetic storms and are responsible for the generation of high fluxes of relativistic electrons. These regions are produced by the collision of a high-speed stream of solar wind with a slow-speed stream. The interface between the two streams is easily identified with plasma and field data from a solar wind monitor upstream of the Earth. The properties of the solar wind and interplanetary magnetic field are systematic functions of time relative to the stream interface. Consequently the coupling of the solar wind to the Earth's magnetosphere produces a predictable sequence of events. Because the streams persist for many solar rotations it should be possible to use terrestrial observations of past magnetic activity to predict future activity. Also the high-speed streams are produced by large unipolar magnetic regions on the Sun so that empirical models can be used to predict the velocity profile of a stream expected at the Earth. In either case knowledge of the statistical properties of the solar wind and geomagnetic activity as a function of time relative to a stream interface provides the basis for medium term forecasting of geomagnetic activity. In this report we use lists of stream interfaces identified in solar wind data during the years 1995 and 2004 to develop probability distribution functions for a variety of different variables as a function of time relative to the interface. The results are presented as temporal profiles of the quartiles of the cumulative probability distributions of these variables. We demonstrate that the storms produced by these interaction regions are generally very weak. Despite this the fluxes of relativistic electrons produced during those storms are the highest seen in the solar cycle. We attribute this to the specific sequence of events produced by the organization of the solar wind relative to the stream interfaces. 
We also show that there are large quantitative differences in various parameters between the two cycles.

  20. Probabilistic Forecasting of Life and Economic Losses due to Natural Disasters

    NASA Astrophysics Data System (ADS)

    Barton, C. C.; Tebbens, S. F.

    2014-12-01

The magnitude of natural hazard events such as hurricanes, tornadoes, earthquakes, and floods is traditionally measured by wind speed, energy release, or discharge. In this study we investigate the scaling of the magnitude of individual events of the 20th and 21st centuries in terms of economic and life losses in the United States and worldwide. Economic losses are subdivided into insured and total losses. Some data sets are inflation- or population-adjusted. Forecasts associated with these events are of interest to insurance, reinsurance, and emergency management agencies. Plots of cumulative size-frequency distributions of economic and life losses are well fit by power functions and thus exhibit self-similar scaling. This self-similar scaling property permits use of frequent small events to estimate the rate of occurrence of less frequent larger events. Examining the power scaling behavior of loss data for disasters permits: forecasting the probability of occurrence of a disaster over a wide range of years (1 to 10 to 1,000 years); comparing losses associated with one type of disaster to another; comparing disasters in one region to similar disasters in another region; and measuring the effectiveness of planning and mitigation strategies. In the United States, cumulative frequency distributions of life losses due to floods and tornadoes have steeper slopes, indicating that frequent smaller events contribute the majority of losses. In contrast, those for hurricanes and earthquakes have shallower slopes, indicating that the few larger events contribute the majority of losses. Disaster planning and mitigation strategies should incorporate these differences.
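The power-function fit to a cumulative size-frequency distribution can be sketched as a log-log regression of exceedance counts against event size; the Pareto-distributed "losses" below are synthetic, and the fitting method (ordinary least squares on log counts) is an illustrative choice, not necessarily the authors':

```python
import numpy as np

def fit_power_tail(losses):
    """Fit N(>=x) = C * x**(-b) to empirical exceedance counts in log-log space."""
    x = np.sort(np.asarray(losses, dtype=float))
    n_exceed = np.arange(len(x), 0, -1)          # number of events >= x
    slope, log_c = np.polyfit(np.log(x), np.log(n_exceed), 1)
    return -slope, np.exp(log_c)                 # exponent b, prefactor C

rng = np.random.default_rng(2)
losses = rng.pareto(1.5, 5000) + 1.0             # synthetic power-law-tailed losses
b, C = fit_power_tail(losses)
print(round(b, 1))                               # recovers the tail exponent ~1.5
```

Once b and C are in hand, the expected number of events exceeding some size X over the record period is C * X**(-b), which is how frequent small events inform the rate of rare large ones.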

  1. A new method for detecting, quantifying and monitoring diffuse contamination

    NASA Astrophysics Data System (ADS)

    Fabian, Karl; Reimann, Clemens; de Caritat, Patrice

    2017-04-01

A new method is presented for detecting and quantifying diffuse contamination at the regional to continental scale. It is based on the analysis of cumulative distribution functions (CDFs) in cumulative probability (CP) plots for spatially representative datasets, preferably containing >1000 samples. Simulations demonstrate how different types of contamination influence elemental CDFs of different sample media. Contrary to common belief, diffuse contamination does not result in exceedingly high element concentrations in regional- to continental-scale datasets. Instead it produces a distinctive shift of concentrations in the background distribution of the studied element, resulting in a steeper data distribution in the CP plot. Via either (1) comparing the distribution of an element in top soil samples to the distribution of the same element in bottom soil samples from the same area, taking soil forming processes into consideration, or (2) comparing the distribution of the contaminating element (e.g., Pb) to that of an element with a geochemically comparable behaviour but no contamination source (e.g., Rb or Ba in the case of Pb), the relative impact of diffuse contamination on the element concentration can be estimated either graphically in the CP plot via a best-fit estimate or quantitatively via a Kolmogorov-Smirnov or Cramér-von Mises test. This is demonstrated using continental-scale geochemical soil datasets from Europe, Australia, and the USA, and a regional-scale dataset from Norway. Several different datasets from Europe deliver comparable results at regional to continental scales. The method is also suitable for monitoring diffuse contamination based on the statistical distribution of repeat datasets at the continental scale in a cost-effective manner.
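The quantitative comparison of top- and bottom-soil distributions can be sketched with a two-sample Kolmogorov-Smirnov test: a diffuse contribution shifts the whole background distribution rather than adding a few extreme values. All concentrations below are synthetic lognormal draws under assumed parameters, purely for illustration:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)
# Hypothetical Pb concentrations (mg/kg): bottom soil = geogenic background;
# top soil = the same lognormal background shifted up by diffuse deposition.
bottom = rng.lognormal(mean=3.0, sigma=0.5, size=1500)
top = rng.lognormal(mean=3.15, sigma=0.5, size=1500)

stat, p = ks_2samp(top, bottom)
print(stat > 0.05, p < 0.001)  # whole-distribution shift is detected
```

In a CP plot the same effect appears as a steeper top-soil curve; the KS statistic simply formalizes the maximum vertical distance between the two empirical CDFs.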

  2. Four Theorems on the Psychometric Function

    PubMed Central

    May, Keith A.; Solomon, Joshua A.

    2013-01-01

In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull “slope” parameter, β, can be approximated by the product of two factors: the β of the Weibull function that fits best to the cumulative noise distribution, and a factor that depends on the transducer. We derive general expressions for both factors, from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding relating β to the exponent of a power-function transducer. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4–0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, the Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of β for contrast discrimination suggests that, if internal noise is stimulus-independent, it has lower kurtosis than a Gaussian. PMID:24124456
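The transducer-plus-noise observer can be sketched directly. With a transducer f, f(0) = 0, and independent zero-mean Gaussian noise of standard deviation σ added in each interval, a standard signal-detection result gives the 2AFC psychometric function P(correct) = Φ(f(Δx)/(σ√2)). The power-function transducer and its exponent below are assumptions for illustration, not the paper's fitted values:

```python
import numpy as np
from scipy.stats import norm

def p_correct(dx, transducer, sigma=1.0):
    """2AFC proportion correct for a transducer-plus-Gaussian-noise observer."""
    return norm.cdf(transducer(dx) / (sigma * np.sqrt(2.0)))

def power_transducer(x, p=0.5):
    """Hypothetical power-function transducer with exponent p."""
    return np.power(x, p)

print(float(p_correct(0.0, power_transducer)))  # chance performance: 0.5
```

Fitting a Weibull function to `p_correct` evaluated over a range of Δx values would then recover a β shaped jointly by the noise distribution and the transducer, as Theorem 2 describes.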

  3. An efficient distribution method for nonlinear transport problems in highly heterogeneous stochastic porous media

    NASA Astrophysics Data System (ADS)

    Ibrahima, Fayadhoi; Meyer, Daniel; Tchelepi, Hamdi

    2016-04-01

Because geophysical data are inexorably sparse and incomplete, stochastic treatments of simulated responses are crucial to explore possible scenarios and assess risks in subsurface problems. In particular, nonlinear two-phase flows in porous media are essential, yet challenging, in reservoir simulation and hydrology. Adding highly heterogeneous and uncertain input, such as the permeability and porosity fields, transforms the estimation of the flow response into a tough stochastic problem for which computationally expensive Monte Carlo (MC) simulations remain the preferred option. We propose an alternative approach to evaluate the probability distribution of the (water) saturation for the stochastic Buckley-Leverett problem when the probability distributions of the permeability and porosity fields are available. We give a computationally efficient and numerically accurate method to estimate the one-point probability density function (PDF) and cumulative distribution function (CDF) of the (water) saturation. The distribution method draws inspiration from a Lagrangian approach to the stochastic transport problem and expresses the saturation PDF and CDF essentially in terms of a deterministic mapping and the distribution and statistics of scalar random fields. In a large class of applications these random fields can be estimated at low computational cost (a few MC runs), thus making the distribution method attractive. Even though the method relies on a key assumption of fixed streamlines, we show that it performs well for high input variances, which is the case of interest. Once the saturation distribution is determined, any one-point statistics thereof can be obtained, especially the saturation average and standard deviation. Moreover, the probability of rare events and saturation quantiles (e.g. P10, P50 and P90) can be efficiently derived from the distribution method.
These statistics can then be used for risk assessment, as well as data assimilation and uncertainty reduction in the prior knowledge of input distributions. We provide various examples and comparisons with MC simulations to illustrate the performance of the method.

  4. Brain Vulnerability to Repeated Blast Overpressure and Polytrauma

    DTIC Science & Technology

    2015-10-01

characterization of the mouse model of repeated blast also found no cumulative effect of repeated blast on cortical levels of reactive oxygen species [39]. … overpressure in rats to investigate the cumulative effects of multiple blast exposures on neurologic status, neurobehavioral function, and brain … preclinical model of blast overpressure in rats to investigate the cumulative effects of multiple blast exposures using neurological, neurochemical

  5. Prokinetics for the treatment of functional dyspepsia: Bayesian network meta-analysis.

    PubMed

    Yang, Young Joo; Bang, Chang Seok; Baik, Gwang Ho; Park, Tae Young; Shin, Suk Pyo; Suk, Ki Tae; Kim, Dong Joon

    2017-06-26

    Controversies persist regarding the effect of prokinetics for the treatment of functional dyspepsia (FD). This study aimed to assess the comparative efficacy of prokinetic agents for the treatment of FD. Randomized controlled trials (RCTs) of prokinetics for the treatment of FD were identified from core databases. Symptom response rates were extracted and analyzed using odds ratios (ORs). A Bayesian network meta-analysis was performed using the Markov chain Monte Carlo method in WinBUGS and NetMetaXL. In total, 25 RCTs, which included 4473 patients with FD who were treated with 6 different prokinetics or placebo, were identified and analyzed. Metoclopramide showed the best surface under the cumulative ranking curve (SUCRA) probability (92.5%), followed by trimebutine (74.5%) and mosapride (63.3%). However, the therapeutic efficacy of metoclopramide was not significantly different from that of trimebutine (OR:1.32, 95% credible interval: 0.27-6.06), mosapride (OR: 1.99, 95% credible interval: 0.87-4.72), or domperidone (OR: 2.04, 95% credible interval: 0.92-4.60). Metoclopramide showed better efficacy than itopride (OR: 2.79, 95% credible interval: 1.29-6.21) and acotiamide (OR: 3.07, 95% credible interval: 1.43-6.75). Domperidone (SUCRA probability 62.9%) showed better efficacy than itopride (OR: 1.37, 95% credible interval: 1.07-1.77) and acotiamide (OR: 1.51, 95% credible interval: 1.04-2.18). Metoclopramide, trimebutine, mosapride, and domperidone showed better efficacy for the treatment of FD than itopride or acotiamide. Considering the adverse events related to metoclopramide or domperidone, the short-term use of these agents or the alternative use of trimebutine or mosapride could be recommended for the symptomatic relief of FD.
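The SUCRA values reported above have a simple closed form: for a treatments, SUCRA is the sum of the cumulative rank probabilities P(rank ≤ k) for k = 1..a−1, divided by a−1. A minimal sketch with hypothetical rank probabilities (not the meta-analysis' actual posterior ranks):

```python
import numpy as np

def sucra(rank_probs):
    """SUCRA from one treatment's rank probabilities (rank 1 = best)."""
    p = np.asarray(rank_probs, dtype=float)
    cum = np.cumsum(p)[:-1]              # P(rank <= k) for k = 1..a-1
    return float(cum.sum() / (len(p) - 1))

# hypothetical rank probabilities over 4 competing treatments
print(sucra([0.7, 0.2, 0.1, 0.0]))       # mostly ranked best -> high SUCRA
print(sucra([0.0, 0.1, 0.2, 0.7]))       # mostly ranked worst -> low SUCRA
```

A treatment certain to rank first scores 1, one certain to rank last scores 0, which is why SUCRA is convenient for ordering arms as in the abstract.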

  6. Cumulative exposure to dust and gases as determinants of lung function decline in tunnel construction workers

    PubMed Central

    Bakke, B; Ulvestad, B; Stewart, P; Eduard, W

    2004-01-01

Aims: To study the relation between lung function decrease and cumulative exposure to dust and gases in tunnel construction workers. Methods: A total of 651 male construction workers (drill and blast workers, tunnel concrete workers, shotcreting operators, and tunnel boring machine workers) were followed up by spirometric measurements in 1989–2002 for an average of six years. Outdoor concrete workers, foremen, and engineers served as a low-exposed referent population. Results: The between-worker component of variability was considerably reduced within the job groups compared to the whole population, suggesting that the workers within job groups had similar exposure levels. The annual decrease in FEV1 in low-exposed non-smoking workers was 21 ml and 24 ml in low-exposed ever smokers. The annual decrease in FEV1 in tunnel construction workers was 20–31 ml higher than in the low-exposed workers depending on job group, for both non-smokers and ever smokers. After adjustment for age and observation time, cumulative exposure to nitrogen dioxide showed the strongest association with a decrease in FEV1 in both non-smokers and ever smokers. Conclusion: Cumulative exposure to nitrogen dioxide appeared to be a major risk factor for lung function decreases in these tunnel construction workers, although other agents may have contributed to the observed effect. Contact with blasting fumes should be avoided, diesel exhaust emissions should be reduced, and respiratory devices should be used to protect workers against dust and nitrogen dioxide exposure. PMID:14985522

  7. Universal Off-Equilibrium Scaling of Critical Cumulants in the QCD Phase Diagram

    DOE PAGES

    Mukherjee, Swagato; Venugopalan, Raju; Yin, Yi

    2016-11-23

Exploiting the universality between the QCD critical point and the three-dimensional Ising model, closed-form expressions derived for nonequilibrium critical cumulants on the crossover side of the critical point reveal that they can differ in both magnitude and sign from equilibrium expectations. Here, we demonstrate that key elements of the Kibble-Zurek framework of nonequilibrium phase transitions can be employed to describe the dynamics of these critical cumulants. Lastly, our results suggest that observables sensitive to critical dynamics in heavy-ion collisions should be expressible as universal scaling functions, thereby providing powerful model-independent guidance in searches for the QCD critical point.

  8. Trajectory phase transitions and dynamical Lee-Yang zeros of the Glauber-Ising chain.

    PubMed

    Hickey, James M; Flindt, Christian; Garrahan, Juan P

    2013-07-01

    We examine the generating function of the time-integrated energy for the one-dimensional Glauber-Ising model. At long times, the generating function takes on a large-deviation form and the associated cumulant generating function has singularities corresponding to continuous trajectory (or "space-time") phase transitions between paramagnetic trajectories and ferromagnetically or antiferromagnetically ordered trajectories. In the thermodynamic limit, the singularities make up a whole curve of critical points in the complex plane of the counting field. We evaluate analytically the generating function by mapping the generator of the biased dynamics to a non-Hermitian Hamiltonian of an associated quantum spin chain. We relate the trajectory phase transitions to the high-order cumulants of the time-integrated energy which we use to extract the dynamical Lee-Yang zeros of the generating function. This approach offers the possibility to detect continuous trajectory phase transitions from the finite-time behavior of measurable quantities.

  9. Expert Elicitations of 2100 Emission of CO2

    NASA Astrophysics Data System (ADS)

    Ho, Emily; Bosetti, Valentina; Budescu, David; Keller, Klaus; van Vuuren, Detlef

    2017-04-01

    Emission scenarios such as Shared Socioeconomic Pathways (SSPs) and Representative Concentration Pathways (RCPs) are used intensively for climate research (e.g. climate change projections) and policy analysis. While the range of these scenarios provides an indication of uncertainty, these scenarios are typically not associated with probability values. Some studies (e.g. van Vuuren et al., 2007; Gillingham et al., 2015) took a different approach, associating baseline emission pathways (conditionally) with probability distributions. This paper summarizes three studies in which climate change experts were asked to conduct pair-wise comparisons of possible ranges of 2100 greenhouse gas emissions and rate the relative likelihood of the ranges. The elicitation was performed under two sets of assumptions: 1) a situation where no climate policies are introduced beyond the ones already in place (baseline scenario), and 2) a situation in which countries have ratified the voluntary policies in line with the long-term target embedded in the 2015 Paris Agreement. These indirect relative judgments were used to construct subjective cumulative distribution functions. We show that by using a ratio scaling method that invokes relative likelihoods of scenarios, a subjective probability distribution can be derived for each expert that expresses their beliefs about the projected greenhouse gas emissions range in 2100. This method is shown to elicit stable estimates that require minimal adjustment and to be relatively invariant to the partition of the domain of interest. Experts also rated the method as easy and intuitive to use. We also report results of a study that allowed participants to choose their own ranges of greenhouse gas emissions to remove potential anchoring bias. We discuss the implications of the use of this method for facilitating comparison and communication of beliefs among diverse users of climate science research.
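
    The ratio-scaling step can be illustrated with a minimal sketch: each emission range is compared with a reference range, the relative-likelihood ratios are normalized into a probability mass function, and its running sum gives the subjective CDF. The bin edges and ratio judgments below are invented for illustration, not data from the elicitation studies.

```python
# Sketch of ratio-scale elicitation: an expert rates each emissions bin's
# likelihood relative to a reference bin; normalizing those ratios yields a
# subjective probability mass function, and its running sum a subjective CDF.
# Bin edges (GtCO2 in 2100) and ratio judgments are purely illustrative.

bins = [(0, 20), (20, 40), (40, 60), (60, 80), (80, 100)]
# "How likely is this bin relative to the reference bin (40-60)?"
relative_likelihood = [0.5, 1.5, 1.0, 0.8, 0.2]

total = sum(relative_likelihood)
pmf = [r / total for r in relative_likelihood]

cdf = []
running = 0.0
for p in pmf:
    running += p
    cdf.append(running)

for (lo, hi), p, c in zip(bins, pmf, cdf):
    print(f"{lo:3d}-{hi:3d} GtCO2: P = {p:.3f}, cumulative = {c:.3f}")
```

    A point in favor of this design is that only the ratios matter: rescaling all judgments by a common factor leaves the derived distribution unchanged, which is one reason the method is insensitive to the partition of the domain.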

  10. Alternate methods for FAAT S-curve generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaufman, A.M.

    The FAAT (Foreign Asset Assessment Team) assessment methodology attempts to derive a probability of effect as a function of incident field strength. The probability of effect is the likelihood that the stress put on a system exceeds its strength. In the FAAT methodology, both the stress and strength are random variables whose statistical properties are estimated by experts. Each random variable has two components of uncertainty: systematic and random. The systematic uncertainty drives the confidence bounds in the FAAT assessment. Its variance can be reduced by improved information. The variance of the random uncertainty is not reducible. The FAAT methodology uses an assessment code called ARES to generate probability of effect curves (S-curves) at various confidence levels. ARES assumes log normal distributions for all random variables. The S-curves themselves are log normal cumulants associated with the random portion of the uncertainty. The placement of the S-curves depends on confidence bounds. The systematic uncertainty in both stress and strength is usually described by a mode and an upper and lower variance. Such a description is not consistent with the log normal assumption of ARES, and an unsatisfactory workaround is used to obtain the required placement of the S-curves at each confidence level. We have looked into this situation and have found that significant errors are introduced by this workaround. These errors are at least several dB-W/cm² at all confidence levels, but they are especially bad in the estimate of the median. In this paper, we suggest two alternate solutions for the placement of S-curves. To compare these calculational methods, we have tabulated the common combinations of upper and lower variances and generated the relevant S-curve offsets from the mode difference of stress and strength.
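
    The stress-strength setup under the lognormal assumption can be sketched directly: if stress and strength are both lognormal, the log of their ratio is normal, so the probability of effect has a closed form. The mapping of the incident field to the stress median, and all medians and logarithmic spreads below, are illustrative assumptions, not FAAT data.

```python
# Sketch of a lognormal stress-strength "S-curve": probability that stress
# exceeds strength when both are lognormal (the distributional assumption the
# abstract attributes to ARES). Taking the stress median equal to the incident
# field, and all parameter values, are illustrative assumptions.
from math import erf, log, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def prob_effect(field, strength_median, sigma_stress, sigma_strength):
    """P(stress > strength): ln(stress/strength) is normal, so use its CDF."""
    sigma = sqrt(sigma_stress ** 2 + sigma_strength ** 2)
    return normal_cdf((log(field) - log(strength_median)) / sigma)

# S-curve: probability of effect versus incident field strength
for field in (10.0, 50.0, 100.0, 200.0, 1000.0):
    print(field, round(prob_effect(field, 100.0, 0.5, 0.7), 3))
```

    The curve passes through 0.5 where the field equals the strength median, which is why the placement of the median is so sensitive to errors in the workaround the abstract criticizes.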

  11. An improved multilevel Monte Carlo method for estimating probability distribution functions in stochastic oil reservoir simulations

    DOE PAGES

    Lu, Dan; Zhang, Guannan; Webster, Clayton G.; ...

    2016-12-30

    In this paper, we develop an improved multilevel Monte Carlo (MLMC) method for estimating cumulative distribution functions (CDFs) of a quantity of interest coming from numerical approximation of large-scale stochastic subsurface simulations. Compared with Monte Carlo (MC) methods, which require a significantly large number of high-fidelity model executions to achieve a prescribed accuracy when computing statistical expectations, MLMC methods were originally proposed to significantly reduce the computational cost with the use of multifidelity approximations. The improved performance of the MLMC methods depends strongly on the decay of the variance of the integrand as the level increases. However, the main challenge in estimating CDFs is that the integrand is a discontinuous indicator function whose variance decays slowly. To address this difficult task, we approximate the integrand using a smoothing function that accelerates the decay of the variance. In addition, we design a novel a posteriori optimization strategy to calibrate the smoothing function, so as to balance the computational gain and the approximation error. The combined proposed techniques are integrated into a very general and practical algorithm that can be applied to a wide range of subsurface problems for high-dimensional uncertainty quantification, such as the fine-grid oil reservoir model considered in this effort. The numerical results reveal that with the use of the calibrated smoothing function, the improved MLMC technique significantly reduces the computational complexity compared to the standard MC approach. Finally, we discuss several factors that affect the performance of the MLMC method and provide guidance for effective and efficient usage in practice.
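
    The core smoothing idea can be sketched in a few lines: the CDF value F(q) is the expectation of the discontinuous indicator 1{Q ≤ q}, which is replaced by a smooth surrogate whose width trades approximation error against variance decay. The sigmoid form and width values below are illustrative, not the calibrated smoothing function of the paper.

```python
# Sketch of the smoothing idea behind the improved MLMC CDF estimator: the
# discontinuous indicator 1{Q <= q} is replaced by a smooth sigmoid whose
# width delta controls the trade-off between approximation error and variance
# decay. The sigmoid form and delta are illustrative assumptions.
from math import exp

def indicator(sample, q):
    return 1.0 if sample <= q else 0.0

def smoothed_indicator(sample, q, delta):
    """Sigmoid approximation of 1{sample <= q}; -> indicator as delta -> 0."""
    return 1.0 / (1.0 + exp((sample - q) / delta))

q = 0.0
print(smoothed_indicator(-5.0, q, 0.1))  # close to 1, deep inside the event
print(smoothed_indicator(5.0, q, 0.1))   # close to 0, deep outside
print(smoothed_indicator(0.0, q, 0.1))   # exactly 0.5 at the threshold
```

    Because the sigmoid varies continuously, fine- and coarse-level evaluations of nearly equal samples produce nearly equal values, which is what restores the level-difference variance decay MLMC relies on.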

  12. Productive Activities and Development of Frailty in Older Adults

    PubMed Central

    Jung, Yunkyung; Gruenewald, Tara L.; Seeman, Teresa E.

    2010-01-01

    Objective. Our aim was to examine whether engagement in productive activities, including volunteering, paid work, and childcare, protects older adults against the development of geriatric frailty. Methods. Data from the first (1988) and second (1991) waves of the MacArthur Study of Successful Aging, a prospective cohort study of high-functioning older adults aged 70–79 years (n = 1,072), was used to examine the hypothesis that engagement in productive activities is associated with lower levels of frailty 3 years later. Results. Engagement in productive activities at baseline was associated with a lower cumulative odds of frailty 3 years later in unadjusted models (odds ratio [OR] = 0.74, 95% confidence interval [CI] = 0.58–0.96) but not after adjusting for age, disability, and cognitive function (adjusted OR = 0.78, 95% CI = 0.60–1.01). Examination of productive activity domains showed that volunteering (but neither paid work nor childcare) was associated with a lower cumulative odds of frailty after adjusting for age, disability, and cognitive function. This relationship diminished and was no longer statistically significant after adjusting for personal mastery and religious service attendance. Discussion. Though high-functioning older adults who participate in productive activities are less likely to become frail, after adjusting for age, disability, and cognitive function, only volunteering is associated with a lower cumulative odds of frailty. PMID:20018794

  13. On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions.

    PubMed

    López, S; France, J; Odongo, N E; McBride, R A; Kebreab, E; AlZahal, O; McBride, B W; Dijkstra, J

    2015-04-01

    Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records corresponding to 122 first, 99 second, and 92 third parity individual lactation curves. The functions were fitted using nonlinear regression procedures, and their performance was assessed using goodness-of-fit statistics (coefficient of determination, residual mean squares, Akaike information criterion, and the correlation and concordance coefficients between observed and adjusted milk yields at several days in milk). Overall, all the growth functions evaluated showed an acceptable fit to the cumulative milk production curves, with the Richards equation ranking first (smallest Akaike information criterion) followed by the Morgan equation. Differences among the functions in their goodness-of-fit were enlarged when fitted to average curves by parity, where the sigmoidal functions with a variable point of inflection (Richards and Morgan) outperformed the other 4 equations. All the functions provided satisfactory predictions of milk yield (calculated from the first derivative of the functions) at different lactation stages, from early to late lactation. The Richards and Morgan equations provided the most accurate estimates of peak yield and total milk production per 305-d lactation, whereas the least accurate estimates were obtained with the logistic equation. In conclusion, classical growth functions (especially sigmoidal functions with a variable point of inflection) proved to be feasible alternatives to fit cumulative milk production curves of dairy cows, resulting in suitable statistical performance and accurate estimates of lactation traits. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
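
    The Gompertz case can be sketched without any fitting machinery: the growth function gives cumulative milk yield, its first derivative gives daily yield (as the abstract notes), and the inflection point gives the peak. The parameter values below are illustrative, not estimates from the Canadian Holstein data.

```python
# Sketch of the Gompertz growth function applied to a cumulative lactation
# curve: Y(t) = A * exp(-b * exp(-k*t)) is cumulative milk yield, and its
# first derivative is daily yield, peaking at the inflection point
# t* = ln(b)/k. Parameter values are illustrative assumptions.
from math import exp, log

A, b, k = 9000.0, 3.0, 0.02   # asymptotic yield (kg), shape, rate (1/day)

def cumulative_yield(t):
    return A * exp(-b * exp(-k * t))

def daily_yield(t):
    # first derivative of cumulative_yield with respect to days in milk
    return A * b * k * exp(-k * t) * exp(-b * exp(-k * t))

t_peak = log(b) / k           # inflection point of the cumulative curve
print("peak day:", round(t_peak, 1))
print("peak daily yield (kg):", round(daily_yield(t_peak), 2))  # equals A*k/e
```

    The fixed inflection point (always at Y = A/e) is exactly the rigidity that the Richards and Morgan functions, with their variable point of inflection, relax, which is consistent with their better fit in the study.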

  14. Cumulative Exposure to Systolic Blood Pressure During Young Adulthood Through Midlife and the Urine Albumin-to-Creatinine Ratio at Midlife.

    PubMed

    Kramer, Holly; Colangelo, Laura; Lewis, Cora E; Jacobs, David R; Pletcher, Mark; Bibbins-Domingo, Kirstin; Chang, Alex; Siscovick, David; Shlipak, Michael; Peralta, Carmen A; Bansal, Nisha; Muntner, Paul; Liu, Kiang

    2017-05-01

    Higher blood pressure during young adulthood may increase cardiovascular and kidney disease risk later in life. This study examined the association of cumulative systolic blood pressure (SBP) exposure during young adulthood through midlife with urine albumin-to-creatinine ratios (ACR) measured during midlife. We used data from the Coronary Artery Risk Development in Young Adults (CARDIA) study, a biracial cohort recruited in 4 urban areas during years 1985-1986. Cumulative SBP was calculated as the average SBP between 2 exams multiplied by the years between exams, summed over 20 years. ACR was measured 20 years after baseline when participants were age 43-50 years (midlife). A generalized additive model was used to examine the association of log ACR as a function of cumulative SBP with adjustment for covariates including SBP measured concurrently with ACR. Cumulative SBP ranged from a low of 1,671 to a high of 3,260 mm Hg. Participants in the highest cumulative SBP quartile were more likely to be male (61.4% vs. 20.7%; P < 0.001), Black (61.5% vs. 25.6%; P < 0.001), and have elevated ACR (18.7% vs. 4.8%; P < 0.001) vs. the lowest quartile. Spline regression curves of ACR vs. cumulative SBP demonstrated an inflection point in ACR at cumulative SBP levels >2,350 mm Hg with linear increases in ACR above this threshold. Adjusted geometric mean ACR values were significantly higher with cumulative SBP ≥2,500 vs. <2,500 (9.18 [1.06] vs. 6.92 [1.02]; P < 0.0001). Higher SBP during young adulthood through midlife is associated with higher ACR during midlife. © American Journal of Hypertension, Ltd 2017. All rights reserved.
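
    The cumulative exposure measure described above is simple to compute: average the SBP of each pair of consecutive exams, multiply by the years between them, and sum over follow-up. The exam schedule and readings below are invented for illustration.

```python
# Sketch of the cumulative SBP measure: the average SBP of each pair of
# consecutive exams, multiplied by the years between them, summed over
# follow-up (a trapezoid rule over the exam schedule; units: mm Hg x years).
# Exam times and readings are invented.

exams = [(0, 112.0), (2, 118.0), (5, 121.0), (10, 126.0), (20, 131.0)]
# (years since baseline, SBP in mm Hg)

def cumulative_sbp(exams):
    total = 0.0
    for (t0, sbp0), (t1, sbp1) in zip(exams, exams[1:]):
        total += (sbp0 + sbp1) / 2.0 * (t1 - t0)
    return total

print(cumulative_sbp(exams))
```

    A constant 120 mm Hg over 20 years gives 2,400, which helps read the study's quoted range of 1,671 to 3,260: it reflects both the level of SBP and how long it was sustained.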

  15. A two-parameter design storm for Mediterranean convective rainfall

    NASA Astrophysics Data System (ADS)

    García-Bartual, Rafael; Andrés-Doménech, Ignacio

    2017-05-01

    This research explores the feasibility of building effective design storms for extreme hydrological regimes, such as the one characterizing the rainfall regime of the east and south-east of the Iberian Peninsula, without employing intensity-duration-frequency (IDF) curves as a starting point. Nowadays, after decades of operation of automatic hydrological networks, there is an abundance of high-resolution rainfall data with reasonable statistical representation, which enables direct study of the temporal patterns and inner structures of rainfall events at a given geographic location, with the aim of establishing a statistical synthesis directly based on those observed patterns. The authors propose a temporal design storm defined in analytical terms through a two-parameter gamma-type function. The two parameters are directly estimated from 73 independent storms identified from rainfall records of high temporal resolution in Valencia (Spain). All the relevant analytical properties derived from that function are developed in order to use this storm in real applications. In particular, in order to assign a probability (return period) to the design storm, an auxiliary variable combining maximum intensity and total cumulated rainfall is introduced. As a result, for a given return period, a set of three storms with different duration, depth, and peak intensity is defined. The consistency of the results is verified by comparison with the classic method of alternating blocks based on an IDF curve for the above-mentioned study case.
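
    A generic two-parameter gamma-type hyetograph can be sketched as follows. The specific form i(t) = a·t·exp(−b·t) is an assumed illustration of a gamma-type intensity function, not necessarily the exact expression of the paper, and the parameter values are invented; its appeal is that peak time, peak intensity, and total depth all have closed forms in the two parameters.

```python
# Sketch of a two-parameter gamma-type design hyetograph. The form
# i(t) = a * t * exp(-b * t) is an assumed gamma-type shape (not necessarily
# the paper's exact function); a and b are illustrative. Closed forms:
# peak at t = 1/b, and total depth = integral over (0, inf) = a / b^2.
from math import exp

a, b = 4.0, 0.1   # intensity scale and decay rate (illustrative units, t in min)

def intensity(t):
    """Rainfall intensity at time t since the start of the storm."""
    return a * t * exp(-b * t)

t_peak = 1.0 / b              # intensity peaks at t = 1/b
total_depth = a / b ** 2      # analytic integral of i(t) over (0, inf)
print("peak time:", t_peak)
print("peak intensity:", round(intensity(t_peak), 3))
print("total depth:", total_depth)
```

    With only two parameters, fixing any two of (duration scale, depth, peak intensity) determines the curve, which is how a family of storms with different duration, depth, and peak can share a single return period.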

  16. A novel risk score for mortality in renal transplant recipients beyond the first posttransplant year.

    PubMed

    Hernández, Domingo; Sánchez-Fructuoso, Ana; González-Posada, José Manuel; Arias, Manuel; Campistol, Josep María; Rufino, Margarita; Morales, José María; Moreso, Francesc; Pérez, Germán; Torres, Armando; Serón, Daniel

    2009-09-27

    All-cause mortality is high after kidney transplantation (KT), but no prognostic index has focused on predicting mortality in KT using baseline and emergent comorbidity after KT. A total of 4928 KT recipients were used to derive a risk score predicting mortality. Patients were randomly assigned to two groups: a modeling population (n=2452), used to create a new index, and a testing population (n=2476), used to test this index. Multivariate Cox regression model coefficients of baseline (age, weight, time on dialysis, diabetes, hepatitis C, and delayed graft function) and emergent comorbidity within the first posttransplant year (diabetes, proteinuria, renal function, and immunosuppressants) were used to weigh each variable in the calculation of the score and allocated into risk quartiles. The probability of death at 3 years, estimated from the baseline cumulative hazard function of the Cox model [P(death) = 1 - 0.993592764^exp(score/100)], increased from 0.9% in the lowest-risk quartile (score=40) to 4.7% in the highest-risk quartile (score=200). The observed incidence of death increased with increasing risk quartiles in the testing population (log-rank analysis, P<0.0001). The overall C-index was 0.75 (95% confidence interval: 0.72-0.78) and 0.74 (95% confidence interval: 0.70-0.77) in the two populations, respectively. This new index is an accurate tool to identify patients at high risk of mortality after KT.
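
    The published formula is easy to evaluate directly: 0.993592764 is the baseline 3-year survival from the Cox model, raised to exp(score/100). The score values below follow the quartile bounds quoted in the abstract.

```python
# Sketch of the published 3-year mortality estimate from the abstract:
# P(death) = 1 - 0.993592764 ** exp(score/100), where 0.993592764 is the
# baseline survival from the Cox model.
from math import exp

def prob_death_3yr(score):
    return 1.0 - 0.993592764 ** exp(score / 100.0)

for score in (40, 100, 150, 200):
    print(score, round(prob_death_3yr(score) * 100, 2), "%")
```

    Evaluating at the quartile bounds reproduces the figures the abstract quotes: roughly 0.9% at score 40 and close to 4.7% at score 200, with risk rising monotonically in between.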

  17. Patients with Revision Modern Megaprostheses of the Distal Femur Have Improved Disease-Specific and Health-Related Outcomes Compared to Those with Primary Replacements.

    PubMed

    Heyberger, Clémence; Auberger, Guillaume; Babinet, Antoine; Anract, Philippe; Biau, David J

    2017-12-21

    We asked whether there would be any difference between primary and revision modern cemented fixed-hinge megaprostheses of the distal femur in function and activity-related outcomes following treatment of a bone tumor. An identical custom-made fixed-hinge cemented megaprosthesis with a hydroxyapatite collar was used in all cases. The main outcomes were joint-specific function, disease-specific activity, and health-related quality of life. Implant survival was also evaluated. Patients in the revision group performed slightly better than patients in the primary group on disease-specific (Toronto Extremity Salvage Score, p = 0.033; Musculoskeletal Tumor Society, p = 0.072) and health-related outcomes (Short Form 36 [SF-36] physical component, p = 0.085; SF-36 mental component, p = 0.069) but not on joint-specific outcomes (Knee Society Score, p = 0.94). The cumulative probability of revision for any reason was 14.5% (7-25%) at 5 years, with no statistically significant difference between primary and revision procedures (p = 0.77). In conclusion, patients undergoing a revision have similar joint-specific functional outcomes but improved disease-specific and health-related outcomes. Implant survival is similar between groups.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Larry K.; Gustafson, William I.; Kassianov, Evgueni I.

    A new treatment for shallow clouds has been introduced into the Weather Research and Forecasting (WRF) model. The new scheme, called the cumulus potential (CuP) scheme, replaces the ad hoc trigger function used in the Kain-Fritsch cumulus parameterization with a trigger function related to the distribution of temperature and humidity in the convective boundary layer via probability density functions (PDFs). An additional modification to the default version of WRF is the computation of a cumulus cloud fraction based on the time scales relevant for shallow cumuli. Results from three case studies over the U.S. Department of Energy’s Atmospheric Radiation Measurement (ARM) site in north central Oklahoma are presented. These days were selected because of the presence of shallow cumuli over the ARM site. The modified version of WRF does a much better job predicting the cloud fraction and the downwelling shortwave irradiance than control simulations utilizing the default Kain-Fritsch scheme. The modified scheme includes a number of additional free parameters, including the number and size of bins used to define the PDF, the minimum frequency of a bin within the PDF before that bin is considered for shallow clouds to form, and the critical cumulative frequency of bins required to trigger deep convection. A series of tests were undertaken to evaluate the sensitivity of the simulations to these parameters. Overall, the scheme was found to be relatively insensitive to each of the parameters.

  19. Design and development of a ceramic radial turbine for the AGT101

    NASA Technical Reports Server (NTRS)

    Finger, D. G.; Gupta, S. K.

    1982-01-01

    An acceptable and feasible ceramic turbine wheel design has been achieved, and the relevant temperature, stress, and success probability analyses are discussed. The design is described, the materials selection presented, and the engine cycle conditions analysis parameters shown. Measured MOR four-point strengths are indicated for room and elevated temperatures, and engine conditions are analyzed for various cycle states, materials, power states, turbine inlet temperatures, and speeds. An advanced gas turbine ceramic turbine rotor thermal and stress model is developed, and cumulative probability of survival is shown for first and third-year properties of SiC and Si3N4 rotors under different operating conditions, computed for both blade and hub regions. Temperature and stress distributions for steady-state and worst-case shutdown transients are depicted.
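
    The "cumulative probability of survival" for ceramic components is commonly computed from a Weibull weakest-link model; the abstract does not specify the model used here, and the form and all parameter values below are illustrative assumptions.

```python
# Sketch of a Weibull weakest-link model of the kind commonly used for the
# cumulative probability of survival of ceramic components. The abstract does
# not state the model; the form P_s = exp(-(V/V0) * (stress/sigma0)^m) and
# all parameters are illustrative assumptions.
from math import exp

def survival_probability(stress, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull survival with an effective volume scaling."""
    return exp(-volume_ratio * (stress / sigma0) ** m)

for stress in (100.0, 300.0, 400.0, 500.0):   # illustrative stress levels, MPa
    print(stress, round(survival_probability(stress, 400.0, 10.0), 4))
```

    The Weibull modulus m is estimated from MOR strength data like the four-point measurements mentioned above; a larger m means a tighter strength distribution and a steeper drop in survival near the characteristic strength sigma0.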

  20. Changes in risk of immediate adverse reactions to iodinated contrast media by repeated administrations in patients with hepatocellular carcinoma.

    PubMed

    Fujiwara, Naoto; Tateishi, Ryosuke; Akahane, Masaaki; Taguri, Masataka; Minami, Tatsuya; Mikami, Shintaro; Sato, Masaya; Uchino, Koji; Uchino, Kouji; Enooku, Kenichiro; Kondo, Yuji; Asaoka, Yoshinari; Yamashiki, Noriyo; Goto, Tadashi; Shiina, Shuichiro; Yoshida, Haruhiko; Ohtomo, Kuni; Koike, Kazuhiko

    2013-01-01

    To elucidate whether repeated exposures to iodinated contrast media increase the risk of adverse reaction. We retrospectively reviewed 1,861 patients with hepatocellular carcinoma who visited the authors' institution, a tertiary referral center, between 2004 and 2008. We analyzed the cumulative probability of adverse reactions and risk factors. We categorized all symptoms into hypersensitivity reactions, physiologic reactions, and other reactions, according to the American College of Radiology guidelines, and evaluated each category as an event. We estimated the association between the hazard for adverse reactions and the number of cumulative exposures to contrast media. We also evaluated subsequent contrast media injections and adverse reactions. There were 23,684 contrast media injections in 1,729 patients. One hundred and thirty-two patients were excluded because they were given no contrast media during the study period. Adverse reactions occurred in 196 (0.83%) patients. The cumulative incidence at the 10th, 20th, and 30th examinations was 7.9%, 15.2%, and 24.1%, respectively. Renal impairment was found to be one of the risk factors for adverse reactions. The estimated hazard of overall adverse reaction gradually decreased until around the 10th exposure and rose with subsequent exposures. The estimated hazard of hypersensitivity showed a V-shaped change with the cumulative number of exposures. The estimated hazard of physiologic reactions tended to decrease, and that of other reactions tended to increase. A second adverse reaction was more severe than the initial one in only one of 130 patients receiving subsequent injections. Repeated exposures to iodinated contrast media increase the risk of adverse reaction.

  1. Derivation and validation of a discharge disposition predicting model after acute stroke.

    PubMed

    Tseng, Hung-Pin; Lin, Feng-Jenq; Chen, Pi-Tzu; Mou, Chih-Hsin; Lee, Siu-Pak; Chang, Chun-Yuan; Chen, An-Chih; Liu, Chung-Hsiang; Yeh, Chung-Hsin; Tsai, Song-Yen; Hsiao, Yu-Jen; Lin, Ching-Huang; Hsu, Shih-Pin; Yu, Shih-Chieh; Hsu, Chung-Y; Sung, Fung-Chang

    2015-06-01

    Discharge disposition planning is vital for poststroke patients. We investigated clinical factors associated with discharging patients to nursing homes, using the Taiwan Stroke Registry data collected from 39 major hospitals. We randomly assigned 21,575 stroke inpatients registered from 2006 to 2008 into derivation and validation groups at a 3-to-1 ratio. We used the derivation group to develop a prediction model by measuring cumulative risk scores associated with potential predictors: age, sex, hypertension, diabetes mellitus, heart diseases, stroke history, snoring, main caregivers, stroke types, and National Institutes of Health Stroke Scale (NIHSS). Probability of nursing home care and odds ratio (OR) of nursing home care relative to home care by cumulative risk scores were measured for the prediction. The area under the receiver operating characteristic curve (AUROC) was used to assess the model discrimination against the validation group. Except for hypertension, all remaining potential predictors were significant independent predictors associated with stroke patient disposition to nursing home care after discharge from hospitals. The risk sharply increased with age and NIHSS. Patients with a cumulative risk score of 15 or more had an OR of 86.4 for the nursing home disposition. The AUROC plots showed similar areas under curves for the derivation group (.86, 95% confidence interval [CI], .85-.87) and for the validation group (.84, 95% CI, .83-.86). The cumulative risk score is an easy-to-estimate tool for preparing stroke patients and their family for disposition on discharge. Copyright © 2015 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  2. Understanding the Relation of Low Income to HPA-Axis Functioning in Preschool Children: Cumulative Family Risk and Parenting as Pathways to Disruptions in Cortisol

    ERIC Educational Resources Information Center

    Zalewski, Maureen; Lengua, Liliana J.; Kiff, Cara J.; Fisher, Philip A.

    2012-01-01

    This study examined the relation of low income and poverty to cortisol levels, and tested potential pathways from low income to disruptions in cortisol through cumulative family risk and parenting. The sample of 306 mothers and their preschool children included 29 % families at or near poverty, 27 % families below the median income, and the…

  3. The Effects of Small Sample Size on Identifying Polytomous DIF Using the Liu-Agresti Estimator of the Cumulative Common Odds Ratio

    ERIC Educational Resources Information Center

    Carvajal, Jorge; Skorupski, William P.

    2010-01-01

    This study is an evaluation of the behavior of the Liu-Agresti estimator of the cumulative common odds ratio when identifying differential item functioning (DIF) with polytomously scored test items using small samples. The Liu-Agresti estimator has been proposed by Penfield and Algina as a promising approach for the study of polytomous DIF but no…

  4. Measurement of higher cumulants of net-charge multiplicity distributions in Au + Au collisions at √s_NN = 7.7–200 GeV

    DOE PAGES

    Adare, A.; Afanasiev, S.; Aidala, C.; ...

    2016-01-19

    Our report presents the measurement of cumulants (C_n, n=1,...,4) of the net-charge distributions measured within pseudorapidity (|η|<0.35) in Au+Au collisions at √s_NN = 7.7–200 GeV with the PHENIX experiment at the Relativistic Heavy Ion Collider. The ratios of cumulants (e.g., C_1/C_2, C_3/C_1) of the net-charge distributions, which can be related to volume-independent susceptibility ratios, are studied as a function of centrality and energy. These quantities are important to understand the quantum-chromodynamics phase diagram and the possible existence of a critical end point. The measured values are very well described by expectation from negative binomial distributions. We do not observe any nonmonotonic behavior in the ratios of the cumulants as a function of collision energy. These measured values of C_1/C_2 and C_3/C_1 can be directly compared to lattice quantum-chromodynamics calculations and thus allow extraction of both the chemical freeze-out temperature and the baryon chemical potential at each center-of-mass energy. Moreover, the extracted baryon chemical potentials are in excellent agreement with a thermal-statistical analysis model.
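
    The first four cumulants and the quoted ratios can be sketched from a sample of net-charge values: C_1 is the mean, C_2 the variance, C_3 the third central moment, and C_4 the fourth central moment minus 3·C_2². The data list below is invented for illustration.

```python
# Sketch of the cumulants and volume-independent ratios discussed above,
# computed from a sample of per-event net-charge values (invented data).
# C1 = mean, C2 = variance, C3 = third central moment,
# C4 = fourth central moment - 3 * C2^2.

def cumulants(data):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    return mean, m2, m3, m4 - 3.0 * m2 ** 2   # C1, C2, C3, C4

net_charge = [0, 1, 1, 2, -1, 0, 3, 1, 0, 2]
c1, c2, c3, c4 = cumulants(net_charge)
print("C1/C2:", c1 / c2)   # volume-independent susceptibility ratio
print("C3/C1:", c3 / c1)
```

    Taking ratios cancels the leading volume dependence (each cumulant scales with the collision volume), which is what makes C_1/C_2 and C_3/C_1 directly comparable to lattice susceptibility ratios.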

  5. A statistical physics view of pitch fluctuations in the classical music from Bach to Chopin: evidence for scaling.

    PubMed

    Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping

    2013-01-01

    Because classical music has greatly affected our life and culture over its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn/Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
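
    The basic quantity of the analysis can be sketched directly: pitch fluctuations are successive pitch differences in a composition, and the positive and negative tails of their cumulative distribution are examined separately. The pitch sequence below (MIDI note numbers) is invented for illustration.

```python
# Sketch of the pitch-fluctuation analysis: fluctuations are successive pitch
# differences in a composition, and the positive and negative tails of their
# cumulative distribution are treated separately. Pitches (MIDI numbers) are
# invented for illustration.
pitches = [60, 64, 62, 67, 65, 60, 72, 69, 65, 64]
fluctuations = [b - a for a, b in zip(pitches, pitches[1:])]

pos = sorted(f for f in fluctuations if f > 0)   # treble-ward jumps
neg = sorted(-f for f in fluctuations if f < 0)  # bass-ward jumps (magnitudes)

def ccdf(values):
    """Empirical P(X >= x) for each distinct x, in ascending order of x."""
    n = len(values)
    return [(x, sum(1 for v in values if v >= x) / n) for x in sorted(set(values))]

print("positive tail:", ccdf(pos))
print("negative tail:", ccdf(neg))
```

    On a real corpus, plotting these tail probabilities against fluctuation size on log-log axes is the standard way to check the power-law (scale-free) behavior and the positive/negative symmetry the abstract describes.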

  6. Atmospheric Profiles, Clouds and the Evolution of Sea Ice Cover in the Beaufort and Chukchi Seas: Atmospheric Observations and Modeling as Part of the Seasonal Ice Zone Reconnaissance Surveys

    DTIC Science & Technology

    2017-06-04

    This record describes the atmospheric component of the Seasonal Ice Zone Reconnaissance Surveys (SIZRS) project, combined with its oceanographic and sea ice components. The remaining extracted text consists of figure-caption fragments only (curves indicating cumulative probabilities; vertical lines showing median errors for forecast and climatology; a correlation coefficient figure).

  7. Calculation of Cumulative Distributions and Detection Probabilities in Communications and Optics.

    DTIC Science & Technology

    1984-10-01

    … the CMLD. As an example of a particular result, Figure 8.1 shows the additional SNR required (often called the CFAR loss) for the MLD, CMLD, and OSD in … the background noise level is known. Notice that although the CFAR loss increases with INR for the MLD, the CMLD and OSD have a bounded loss as the INR … Radar Detectors (J. A. Ritcey). Mean-level detectors (MLD) are commonly used in radar to maintain a constant false-alarm rate (CFAR) when the …

  8. Calculation of Cumulative Distributions and Detection Probabilities in Communications and Optics.

    DTIC Science & Technology

    1986-03-31

    … result, Figure 3.1 shows the additional SNR required (often called the CFAR loss) for the MLD, CMLD, and OSD in a multiple target environment to … Notice that although the CFAR loss increases with INR for the MLD, the CMLD and OSD have a bounded loss as the INR → ∞. These results have been more … false-alarm rate (CFAR) when the background noise level is unknown. In Section 2 we described the application of saddlepoint integration techniques to …

  9. Allele doses of apolipoprotein E type ε4 in sporadic late-onset Alzheimer's disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucotte, G.; Aouizerate, A.; Gerard, N.

    1995-12-18

    Apolipoprotein E type ε4 allele (ApoE-ε4) is associated with late-onset sporadic Alzheimer's disease (AD). We have found that the cumulative probability of remaining unaffected over time decreases for each dose of ApoE-ε4 in sporadic, late-onset French AD. The effect of genotypes on age at onset of AD was analyzed using the product limit method, to compare unaffected groups during aging. 26 refs., 2 figs., 1 tab.
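
    The product-limit (Kaplan-Meier) method referred to above estimates the cumulative probability of remaining unaffected from possibly censored onset ages. A minimal sketch follows; the ages and censoring flags are illustrative, not data from the study.

    ```python
    # Minimal product-limit (Kaplan-Meier) sketch: cumulative probability
    # of remaining event-free over time. Data below are illustrative only.

    def kaplan_meier(times, events):
        """times: observation times; events: 1 = onset observed, 0 = censored.
        Returns (distinct event times, survival probabilities)."""
        pairs = sorted(zip(times, events))
        n_at_risk = len(pairs)
        surv, out_t, out_s = 1.0, [], []
        i = 0
        while i < len(pairs):
            t = pairs[i][0]
            d = sum(e for tt, e in pairs if tt == t)   # events at time t
            m = sum(1 for tt, _ in pairs if tt == t)   # all leaving risk set at t
            if d > 0:
                surv *= 1.0 - d / n_at_risk
                out_t.append(t)
                out_s.append(surv)
            n_at_risk -= m
            i += m
        return out_t, out_s

    # Hypothetical onset ages; 0 marks a participant censored while unaffected.
    ages = [62, 65, 65, 70, 71, 74, 78]
    onset = [1, 1, 0, 1, 0, 1, 1]
    ts, ss = kaplan_meier(ages, onset)
    ```

    Stratifying such curves by ApoE-ε4 dose (0, 1, or 2 alleles) and comparing them is the kind of analysis the abstract describes.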

  10. Modeling cumulative dose and exposure duration provided insights regarding the associations between benzodiazepines and injuries.

    PubMed

    Abrahamowicz, Michal; Bartlett, Gillian; Tamblyn, Robyn; du Berger, Roxane

    2006-04-01

    Accurate assessment of medication impact requires modeling cumulative effects of exposure duration and dose; however, postmarketing studies usually represent medication exposure by baseline or current use only. We propose new methods for modeling various aspects of medication use history and employ them to assess the adverse effects of selected benzodiazepines. Time-dependent measures of cumulative dose or duration of use, with weighting of past exposures by recency, were proposed. These measures were then included in alternative versions of the multivariable Cox model to analyze the risk of fall-related injuries among elderly new users of three benzodiazepines (nitrazepam, temazepam, and flurazepam) in Quebec. Akaike's information criterion (AIC) was used to select the most predictive model for a given benzodiazepine. The best-fitting model included a combination of cumulative duration and current dose for temazepam, and cumulative dose for flurazepam and nitrazepam, with different weighting functions. The window of clinically relevant exposure was shorter for flurazepam than for the two other products. Careful modeling of the medication exposure history may enhance our understanding of the mechanisms underlying their adverse effects.
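
    A recency-weighted cumulative dose measure of the kind described can be sketched as follows. The exponential weight and its half-life are illustrative assumptions, not the weighting functions estimated in the study.

    ```python
    # Sketch of a recency-weighted cumulative dose: past daily doses are
    # down-weighted the further they lie before the index time t. The
    # half-life value here is an illustrative assumption only.

    def weighted_cumulative_dose(dose_history, t, half_life=30.0):
        """dose_history: list of (day, dose) pairs; t: index day.
        Weight w(u) = 2 ** (-(t - u) / half_life) for days u <= t."""
        return sum(dose * 2.0 ** (-(t - day) / half_life)
                   for day, dose in dose_history if day <= t)

    # Three equal doses taken 0, 30, and 60 days into follow-up,
    # evaluated at day 60: weights 0.25, 0.5, and 1.0.
    history = [(0, 10.0), (30, 10.0), (60, 10.0)]
    wcd = weighted_cumulative_dose(history, t=60)
    # 0.25 * 10 + 0.5 * 10 + 1.0 * 10 = 17.5
    ```

    Recomputing such a measure at each event time yields the time-dependent covariate that enters the Cox model.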

  11. The Role of Cumulative Trauma, Betrayal, and Appraisals in Understanding Trauma Symptomatology.

    PubMed

    Martin, Christina Gamache; Cromer, Lisa DeMarni; DePrince, Anne P; Freyd, Jennifer J

    2013-03-01

    Poor psychological outcomes are common among trauma survivors, yet not all survivors experience adverse sequelae. The current study examined links between cumulative trauma exposure as a function of the level of betrayal (measured by the relational closeness of the survivor and the perpetrator), trauma appraisals, gender, and trauma symptoms. Participants were 273 college students who reported experiencing at least one traumatic event on a trauma checklist. Three cumulative indices were constructed to assess the number of different types of traumas experienced that were low (LBTs), moderate (MBTs), or high in betrayal (HBTs). Greater trauma exposure was related to more symptoms of depression, dissociation, and PTSD, with exposure to HBTs contributing the most. Women were more likely to experience HBTs than men, but there were no gender differences in trauma-related symptoms. Appraisals of trauma were predictive of trauma-related symptoms over and above the effects explained by cumulative trauma at each level of betrayal. The survivor's relationship with the perpetrator, the effect of cumulative trauma, and their combined impact on trauma symptomatology are discussed.

  12. The Role of Cumulative Trauma, Betrayal, and Appraisals in Understanding Trauma Symptomatology

    PubMed Central

    Martin, Christina Gamache; Cromer, Lisa DeMarni; DePrince, Anne P.; Freyd, Jennifer J.

    2012-01-01

    Poor psychological outcomes are common among trauma survivors, yet not all survivors experience adverse sequelae. The current study examined links between cumulative trauma exposure as a function of the level of betrayal (measured by the relational closeness of the survivor and the perpetrator), trauma appraisals, gender, and trauma symptoms. Participants were 273 college students who reported experiencing at least one traumatic event on a trauma checklist. Three cumulative indices were constructed to assess the number of different types of traumas experienced that were low (LBTs), moderate (MBTs), or high in betrayal (HBTs). Greater trauma exposure was related to more symptoms of depression, dissociation, and PTSD, with exposure to HBTs contributing the most. Women were more likely to experience HBTs than men, but there were no gender differences in trauma-related symptoms. Appraisals of trauma were predictive of trauma-related symptoms over and above the effects explained by cumulative trauma at each level of betrayal. The survivor’s relationship with the perpetrator, the effect of cumulative trauma, and their combined impact on trauma symptomatology are discussed. PMID:23542882

  13. An adiabatic linearized path integral approach for quantum time-correlation functions II: a cumulant expansion method for improving convergence.

    PubMed

    Causo, Maria Serena; Ciccotti, Giovanni; Bonella, Sara; Vuilleumier, Rodolphe

    2006-08-17

    Linearized mixed quantum-classical simulations are a promising approach for calculating time-correlation functions. At the moment, however, they suffer from some numerical problems that may compromise their efficiency and reliability in applications to realistic condensed-phase systems. In this paper, we present a method that improves upon the convergence properties of the standard algorithm for linearized calculations by implementing a cumulant expansion of the relevant averages. The effectiveness of the new approach is tested by applying it to the challenging computation of the diffusion of an excess electron in a metal-molten salt solution.
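
    The generic idea behind a cumulant expansion of an average can be illustrated with the second-order identity ⟨exp(x)⟩ ≈ exp(⟨x⟩ + var(x)/2), which is exact when x is Gaussian and typically converges faster than the direct sample average of exp(x). This is a generic illustration, not the paper's linearized path-integral algorithm.

    ```python
    # Second-order cumulant expansion of an exponential average:
    # <exp(x)> ≈ exp(<x> + var(x)/2), exact for Gaussian x.
    import math
    import random

    random.seed(0)
    xs = [random.gauss(0.1, 0.3) for _ in range(200_000)]

    # Direct estimate: sample mean of exp(x).
    direct = sum(math.exp(x) for x in xs) / len(xs)

    # Cumulant estimate: built from the first two sample cumulants only.
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    cumulant = math.exp(mean + 0.5 * var)
    ```

    Because the cumulant estimate depends only on low-order moments, its statistical noise is much smaller than that of the direct exponential average, which is the kind of convergence improvement the abstract describes.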

  14. A Poisson process approximation for generalized K-S confidence regions

    NASA Technical Reports Server (NTRS)

    Arsham, H.; Miller, D. R.

    1982-01-01

    One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The band width of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault tolerant systems.
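
    The building blocks of such regions, the empirical CDF and the classical one-sided Kolmogorov-Smirnov statistic, can be sketched as follows; the tail-weighted generalization of the paper is not reproduced here, and the sample is illustrative.

    ```python
    # Empirical CDF and the classical one-sided Kolmogorov-Smirnov
    # statistic D+ = sup_x (F_n(x) - F(x)) against a hypothesized CDF F.

    def ecdf(sample):
        """Return the empirical CDF F_n of the sample as a callable."""
        xs = sorted(sample)
        n = len(xs)
        return lambda x: sum(1 for v in xs if v <= x) / n

    def ks_plus(sample, F):
        """One-sided K-S statistic D+ = max_i (i/n - F(x_(i)))."""
        xs = sorted(sample)
        n = len(xs)
        return max((i + 1) / n - F(x) for i, x in enumerate(xs))

    # Illustrative sample tested against a Uniform(0, 1) hypothesis:
    sample = [0.1, 0.2, 0.4, 0.8]
    d_plus = ks_plus(sample, lambda x: x)
    ```

    A one-sided confidence region is obtained by inverting this statistic: F is retained whenever D+ falls below the critical value at the chosen confidence level.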

  15. The 3-Year Incidence of Gout in Elderly Patients with CKD.

    PubMed

    Tan, Vivian S; Garg, Amit X; McArthur, Eric; Lam, Ngan N; Sood, Manish M; Naylor, Kyla L

    2017-04-03

    The risk of gout across CKD stages is not well described. We performed a retrospective cohort study using linked health care databases from Ontario, Canada from 2002 to 2010. The primary outcome was the 3-year cumulative incidence of gout, on the basis of diagnostic codes. We presented our results by level of kidney function (eGFR≥90 ml/min per 1.73 m2, 60-89, 45-59, 30-44, 15-29, and chronic dialysis) and by sex. Additional analyses examined the risk of gout adjusting for clinical characteristics, incidence of gout defined by the receipt of allopurinol or colchicine, and gout risk in a subpopulation stratified by the level of eGFR and albuminuria. Of the 282,925 adults aged ≥66 years, the mean age was 75 years and 57.9% were women. The 3-year cumulative incidence of gout was higher in older adults with a lower level of eGFR. In women, the 3-year cumulative incidence of gout was 0.6%, 0.7%, 1.3%, 2.2%, and 3.4%, and in men the values were 0.8%, 1.2%, 2.5%, 3.7%, and 4.6%, respectively. However, patients on chronic dialysis had a lower 3-year cumulative incidence of gout (women 2.0%, men 2.9%) than those with more moderate reductions in kidney function (i.e., eGFR 15-44 ml/min per 1.73 m2). The association between a greater loss of kidney function and a higher risk of diagnosed gout was also evident after adjustment for clinical characteristics and in all additional analyses. Patients with a lower level of eGFR had a higher 3-year cumulative incidence of gout, with the exception of patients receiving dialysis. Results can be used for risk stratification. Copyright © 2017 by the American Society of Nephrology.

  16. The 3-Year Incidence of Gout in Elderly Patients with CKD

    PubMed Central

    Tan, Vivian S.; Garg, Amit X.; McArthur, Eric; Lam, Ngan N.; Sood, Manish M.

    2017-01-01

    Background and objectives The risk of gout across CKD stages is not well described. Design, setting, participants, & measurements We performed a retrospective cohort study using linked health care databases from Ontario, Canada from 2002 to 2010. The primary outcome was the 3-year cumulative incidence of gout, on the basis of diagnostic codes. We presented our results by level of kidney function (eGFR≥90 ml/min per 1.73 m2, 60–89, 45–59, 30–44, 15–29, and chronic dialysis) and by sex. Additional analyses examined the risk of gout adjusting for clinical characteristics, incidence of gout defined by the receipt of allopurinol or colchicine, and gout risk in a subpopulation stratified by the level of eGFR and albuminuria. Results Of the 282,925 adults aged ≥66 years, the mean age was 75 years and 57.9% were women. The 3-year cumulative incidence of gout was higher in older adults with a lower level of eGFR. In women, the 3-year cumulative incidence of gout was 0.6%, 0.7%, 1.3%, 2.2%, and 3.4%, and in men the values were 0.8%, 1.2%, 2.5%, 3.7%, and 4.6%, respectively. However, patients on chronic dialysis had a lower 3-year cumulative incidence of gout (women 2.0%, men 2.9%) than those with more moderate reductions in kidney function (i.e., eGFR 15–44 ml/min per 1.73 m2). The association between a greater loss of kidney function and a higher risk of diagnosed gout was also evident after adjustment for clinical characteristics and in all additional analyses. Conclusions Patients with a lower level of eGFR had a higher 3-year cumulative incidence of gout, with the exception of patients receiving dialysis. Results can be used for risk stratification. PMID:28153936

  17. Lung function not affected by asbestos exposure in workers with normal Computed Tomography scan.

    PubMed

    Schikowsky, Christian; Felten, Michael K; Eisenhawer, Christian; Das, Marco; Kraus, Thomas

    2017-05-01

    It has been suggested that asbestos exposure affects lung function, even in the absence of asbestos-related pulmonary interstitial or pleural changes or emphysema. We analyzed associations between well-known asbestos-related risk factors, such as individual cumulative asbestos exposure, and key lung function parameters in formerly asbestos-exposed power industry workers (N = 207) with normal CT scans. For this, we excluded participants with emphysema, fibrosis, pleural changes, or any combination of these. The lung function parameters of FVC, FEV1, DLCO/VA, and airway resistance were significantly associated with the burden of smoking, BMI and years since end of exposure (only DLCO/VA). However, they were not affected by factors directly related to amount (e.g., cumulative exposure) or duration of asbestos exposure. Our results confirm the well-known correlation between lung function, smoking habits, and BMI. However, we found no significant association between lung function and asbestos exposure. © 2017 Wiley Periodicals, Inc.

  18. The limits to cumulative causation: international migration from Mexican urban areas.

    PubMed

    Fussell, Elizabeth; Massey, Douglas S

    2004-02-01

    We present theoretical arguments and empirical research to suggest that the principal mechanisms of cumulative causation do not function in large urban settings. Using data from the Mexican Migration Project, we found evidence of cumulative causation in small cities, rural towns and villages, but not in large urban areas. With event-history models, we found little positive effect of community-level social capital and a strong deterrent effect of urban labor markets on the likelihood of first and later U.S. trips for residents of urban areas in Mexico, suggesting that the social process of migration from urban areas is distinct from that in the more widely studied rural migrant-sending communities of Mexico.

  19. Toward a cumulative ecological risk model for the etiology of child maltreatment

    PubMed Central

    MacKenzie, Michael J.; Kotch, Jonathan B.; Lee, Li-Ching

    2011-01-01

    The purpose of the current study was to further the integration of cumulative risk models with empirical research on the etiology of child maltreatment. Despite the well-established literature supporting the importance of the accumulation of ecological risk, this perspective has had difficulty infiltrating empirical maltreatment research, which has tended to focus on more limited sets of risk factors. Utilizing a sample of 842 mother-infant dyads, we compared the capacity of individual risk factors and a cumulative index to predict maltreatment reports in a prospective longitudinal investigation over the first sixteen years of life. The total load of risk in early infancy was found to be related to maternal cognitions surrounding her new role, measures of social support and well-being, and indicators of child cognitive functioning. After controlling for total level of cumulative risk, most single factors failed to predict later maltreatment reports, and no single variable provided odds ratios as powerful as the predictive power of a cumulative index. Continuing the shift away from simplistic causal models toward an appreciation for the cumulative nature of risk would be an important step forward in the way we conceptualize intervention and support programs, concentrating them squarely on alleviating the substantial risk facing so many of society’s families. PMID:24817777

  20. Dispersion and line shape of plasmon satellites in one, two, and three dimensions

    DOE PAGES

    Vigil-Fowler, Derek; Louie, Steven G.; Lischner, Johannes

    2016-06-27

    Using state-of-the-art many-body Green's function calculations based on the GW plus cumulant approach, we analyze the properties of plasmon satellites in the electron spectral function resulting from electron-plasmon interactions in one-, two-, and three-dimensional systems. Specifically, we show how their dispersion relation, line shape, and linewidth are related to the properties of the constituent electrons and plasmons. In addition, to gain insight into the many-body processes giving rise to the formation of plasmon satellites, we connect the GW plus cumulant approach to a many-body wave-function picture of electron-plasmon interactions and introduce the coupling-strength-weighted electron-plasmon joint density of states as a powerful concept for understanding plasmon satellites.
