Science.gov

Sample records for risk probability estimating

  1. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    ERIC Educational Resources Information Center

    ZINTER, JUDITH R.

This note describes the procedures used in determining DYNAMOD II age transition matrices. A separate matrix for each sex-race group is developed. These matrices will be used as an aid in estimating the transition probabilities in the larger DYNAMOD II matrix relating age to occupational categories. Three steps were used in the procedure--(1)…

  3. Estimating risk.

    PubMed

    2016-07-01

    A free mobile phone app has been launched providing nurses and other hospital clinicians with a simple way to identify high-risk surgical patients. The app is a phone version of the Surgical Outcome Risk Tool (SORT), originally developed for online use with computers by researchers from the National Confidential Enquiry into Patient Outcome and Death and the University College London Hospital Surgical Outcomes Research Centre. SORT uses information about patients' health and planned surgical procedures to estimate the risk of death within 30 days of an operation. The percentages are only estimates, taking into account the general risks of the procedures and some information about patients, and should not be confused with patient-specific estimates in individual cases. PMID:27369709

  4. Dynamic probability estimator for machine learning.

    PubMed

    Starzyk, Janusz A; Wang, Feng

    2004-03-01

An efficient algorithm for dynamic estimation of probabilities without division on an unlimited number of input data is presented. The method estimates probabilities of the sampled data from the raw sample count, while keeping the total count value constant. Accuracy of the estimate depends on the counter size, rather than on the total number of data points. The estimator follows variations of the incoming data probability within a fixed window size, without explicit implementation of the windowing technique. Total design area is very small and all probabilities are estimated concurrently. The dynamic probability estimator was implemented using a programmable gate array from Xilinx. The performance of this implementation is evaluated in terms of area efficiency and execution time. This method is suitable for the highly integrated design of artificial neural networks where a large number of dynamic probability estimators can work concurrently. PMID:15384523
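A minimal sketch of a division-free, constant-total-count estimator in the spirit described above (the shift-based update, counter width, and adaptation rate below are assumptions for illustration, not the paper's hardware design):

```python
class ShiftProbabilityEstimator:
    """Division-free adaptive probability estimator (a sketch, not the
    paper's exact design): per-symbol counters always sum to a fixed
    total 2**total_bits, so each probability is a counter value scaled
    by a constant power of two (a shift in hardware)."""

    def __init__(self, n_symbols, total_bits=12, rate_bits=5):
        self.total = 1 << total_bits      # constant total count
        self.rate = rate_bits             # adaptation speed (~window 2**rate)
        self.counts = [self.total // n_symbols] * n_symbols

    def update(self, symbol):
        # Decay every counter by a 2**-rate fraction (a right shift),
        # then hand the freed mass to the observed symbol, so the sum
        # of all counters stays exactly constant.
        freed = 0
        for i in range(len(self.counts)):
            dec = self.counts[i] >> self.rate
            self.counts[i] -= dec
            freed += dec
        self.counts[symbol] += freed

    def probability(self, symbol):
        return self.counts[symbol] / self.total
```

Because the freed mass is reassigned rather than recomputed, no division is needed during adaptation, and old observations fade at a rate set by `rate_bits`, mimicking a sliding window without storing one.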

  5. Point estimates for probability moments

    PubMed Central

    Rosenblueth, Emilio

    1975-01-01

    Given a well-behaved real function Y of a real random variable X and the first two or three moments of X, expressions are derived for the moments of Y as linear combinations of powers of the point estimates y(x+) and y(x-), where x+ and x- are specific values of X. Higher-order approximations and approximations for discontinuous Y using more point estimates are also given. Second-moment approximations are generalized to the case when Y is a function of several variables. PMID:16578731
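The simplest symmetric case of Rosenblueth's method can be sketched directly: with only the mean and standard deviation of X, the moments of Y = y(X) are approximated from the two point estimates y(mu + sigma) and y(mu - sigma) with weight 1/2 each (the paper also covers skewed X, discontinuous Y, and several variables, which this sketch does not):

```python
def rosenblueth_two_point(y, mu, sigma):
    """Rosenblueth two-point estimate, symmetric case: approximate the
    first two moments of Y = y(X) from the mean and std of X by
    evaluating y at x+ = mu + sigma and x- = mu - sigma."""
    yp, ym = y(mu + sigma), y(mu - sigma)
    mean_y = 0.5 * (yp + ym)              # E[Y] approximation
    mean_y2 = 0.5 * (yp**2 + ym**2)       # E[Y^2] approximation
    var_y = mean_y2 - mean_y**2
    return mean_y, var_y
```

For a linear function the method is exact: y(x) = 2x + 1 with mu = 3, sigma = 2 gives mean 7 and variance 16, matching 2*mu + 1 and (2*sigma)**2.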

  6. Analyzing country risk: estimating the probability of external debt repudiation in the post-oil-embargo decade

    SciTech Connect

    Webster, T.J.

    1985-01-01

    The discussion is divided into two parts. Part one is devoted to a review of the topic of assessing the likelihood of debt servicing difficulties by borrower nations by first tracing the growth of international bank lending activities by US commercial banks, followed by a general discussion of the international debt crisis and a brief survey of some of the conventional approaches employed by many international institutions to assess overseas lending risk. Part one continues with a survey of a variety of social, economic, and political considerations incorporated into the risk evaluation process and concludes with a discussion of how these factors are integrated into the microeconomics of international bank lending. Part two of this study discusses specifically the use of logit analysis as a tool for evaluating country risk under alternative subset data specifications. The paper concludes with a discussion of the possible presence of dynamic elements in the rescheduling process which may ultimately help to improve upon the predictive performance of the logit model.

  7. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia) and the Nambucca and Hawkesbury-Nepean Rivers (New South Wales).

  8. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. PMID:26010201

  9. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
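The covariance-combination step lends itself to a short Monte Carlo check of the kind the paper uses for validation (this sketch is illustrative only; the paper's contribution is the analytical solution, which is not reproduced here):

```python
import math
import random

def conflict_probability(rel_pos, cov1, cov2, sep_radius, n=100_000, seed=1):
    """Monte Carlo estimate of conflict probability in 2-D. The two
    aircraft's independent normal prediction errors add, giving a single
    combined covariance for the relative position; a conflict is a
    separation smaller than sep_radius."""
    rng = random.Random(seed)
    # Independent errors add: covariance of the relative position.
    cxx = cov1[0][0] + cov2[0][0]
    cyy = cov1[1][1] + cov2[1][1]
    cxy = cov1[0][1] + cov2[0][1]
    # Cholesky factor of the 2x2 combined covariance.
    a = math.sqrt(cxx)
    b = cxy / a
    c = math.sqrt(cyy - b * b)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        dx = rel_pos[0] + a * z1
        dy = rel_pos[1] + b * z1 + c * z2
        if dx * dx + dy * dy < sep_radius ** 2:
            hits += 1
    return hits / n
```

With both error covariances equal to the identity and zero predicted separation, the combined covariance is 2I and the exact answer is 1 - exp(-r^2/4), which the sampler should reproduce closely.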

  10. My lived experiences are more important than your probabilities: The role of individualized risk estimates for decision making about participation in the Study of Tamoxifen and Raloxifene (STAR)

    PubMed Central

    Holmberg, Christine; Waters, Erika A.; Whitehouse, Katie; Daly, Mary; McCaskill-Stevens, Worta

    2015-01-01

Background Decision making experts emphasize that understanding and using probabilistic information is important for making informed decisions about medical treatments involving complex risk-benefit tradeoffs. Yet empirical research demonstrates that individuals may not use probabilities when making decisions. Objectives To explore decision making and the use of probabilities for decision making from the perspective of women who were risk-eligible to enroll in the Study of Tamoxifen and Raloxifene (STAR). Methods We conducted narrative interviews with 20 women who agreed to participate in STAR and 20 women who declined. The project was based on a narrative approach. Analysis included developing a summary of each narrative and a thematic analysis, with a coding scheme developed inductively to code all transcripts and identify emerging themes. Results Interviewees explained and embedded their STAR decisions within experiences encountered throughout their lives. Such lived experiences included but were not limited to breast cancer family history, personal history of breast biopsies, and experiences or assumptions about taking tamoxifen or medicines more generally. Conclusions Women’s explanations of their decisions about participating in a breast cancer chemoprevention trial were more complex than decision strategies that rely solely on a quantitative risk-benefit analysis of probabilities derived from populations. In addition to precise risk information, clinicians and risk communicators should recognize the importance and legitimacy of lived experience in individual decision making. PMID:26183166

  11. Model estimates hurricane wind speed probabilities

    NASA Astrophysics Data System (ADS)

Murnane, Richard J.; Barton, Chris; Collins, Eric; Donnelly, Jeffrey; Elsner, James; Emanuel, Kerry; Ginis, Isaac; Howard, Susan; Landsea, Chris; Liu, Kam-biu; Malmquist, David; McKay, Megan; Michaels, Anthony; Nelson, Norm; O'Brien, James; Scott, David; Webb, Thompson, III

In the United States, intense hurricanes (category 3, 4, and 5 on the Saffir/Simpson scale) with winds greater than 50 m s⁻¹ have caused more damage than any other natural disaster [Pielke and Pielke, 1997]. Accurate estimates of wind speed exceedance probabilities (WSEP) due to intense hurricanes are therefore of great interest to (re)insurers, emergency planners, government officials, and populations in vulnerable coastal areas. The historical record of U.S. hurricane landfall is relatively complete only from about 1900, and most model estimates of WSEP are derived from this record. During the 1899-1998 period, only two category-5 and 16 category-4 hurricanes made landfall in the United States. The historical record therefore provides only a limited sample of the most intense hurricanes.

  12. Distributed estimation and joint probabilities estimation by entropy model

    NASA Astrophysics Data System (ADS)

    Fassinut-Mombot, B.; Zribi, M.; Choquel, J. B.

    2001-05-01

This paper proposes the use of the Entropy Model for a distributed estimation system. The Entropy Model is an entropic technique based on the minimization of conditional entropy, developed for the Multi-Source/Sensor Information Fusion (MSIF) problem. We address the problem of distributed estimation from independent observations involving multiple sources, i.e., the problem of estimating or selecting one of several identity declarations, or hypotheses, concerning an observed object. Two problems are considered in the Entropy Model. First, in order to fuse observations using the Entropy Model, it is necessary to know or estimate the conditional probabilities and, equivalently, the joint probabilities. A common practice for estimating probability distributions from data when nothing is known (no a priori knowledge) is to prefer distributions that are as uniform as possible, that is, have maximal entropy. Next, the problem of combining (or "fusing") observations relating to identity hypotheses and selecting the most appropriate hypothesis about the object's identity is addressed. Much future work remains, but the results indicate that the Entropy Model is a promising technique for distributed estimation.

  13. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  14. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  15. Radiation risk estimation models

    SciTech Connect

    Hoel, D.G.

    1987-11-01

    Cancer risk models and their relationship to ionizing radiation are discussed. There are many model assumptions and risk factors that have a large quantitative impact on the cancer risk estimates. Other health end points such as mental retardation may be an even more serious risk than cancer for those with in utero exposures. 8 references.

  16. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  17. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector mi = (mi1, mi2, …, mi10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector mi+1 in the next quarter is modelled to be dependent on the vector mi via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability Pkl(i, j) of getting mi+1,j = l given that mi,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability Pkl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
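For contrast with the paper's power-normal mixture model, the plain frequency-count baseline for a rating transition matrix (a standard sketch, not the paper's method) is short enough to write out:

```python
from collections import Counter, defaultdict

def empirical_transition_matrix(rating_sequence, states):
    """Frequency-count estimate of transition probabilities for one
    company's rating history: P(k -> l) = #(transitions k -> l) /
    #(quarters spent in state k, excluding the last observation)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(rating_sequence, rating_sequence[1:]):
        counts[prev][nxt] += 1
    matrix = {}
    for k in states:
        total = sum(counts[k].values())
        matrix[k] = {l: (counts[k][l] / total if total else 0.0)
                     for l in states}
    return matrix
```

For the toy history A, A, B, A, B, B this gives P(A -> B) = 2/3 and P(B -> A) = 1/2; unlike the mixture model above, the estimate is constant in i and cannot signal an upcoming rating change.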

  18. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  19. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
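A brute-force version of the leave-one-out error estimate (refitting the classifier once per held-out sample; the paper's point is precisely that cheaper closed-form expressions exist, which this sketch does not reproduce) might look like:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's discriminant direction w = Sw^{-1} (m1 - m2),
    with Sw the pooled within-class scatter matrix."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = np.cov(X1.T, bias=True) * len(X1) + np.cov(X2.T, bias=True) * len(X2)
    return np.linalg.solve(Sw, m1 - m2)

def loo_error(X1, X2):
    """Leave-one-out estimate of the probability of error: hold out one
    sample, refit Fisher's direction, classify the held-out sample by
    thresholding its projection at the midpoint of the projected means."""
    errors, total = 0, len(X1) + len(X2)
    for cls, X_own, X_other in ((0, X1, X2), (1, X2, X1)):
        for i in range(len(X_own)):
            Xtr = np.delete(X_own, i, axis=0)
            A, B = (Xtr, X_other) if cls == 0 else (X_other, Xtr)
            w = fisher_direction(A, B)   # w points from B's mean toward A's
            t = 0.5 * (A.mean(axis=0) + B.mean(axis=0)) @ w
            pred = 0 if (X_own[i] @ w > t) == (cls == 0) else 1 - 0
            pred = 0 if X_own[i] @ w > t else 1
            pred = pred if cls == 0 else 1 - (1 - pred)
            errors += ((0 if (A is Xtr or cls == 1) and X_own[i] @ w > t else 1) != 0) if False else 0
    return errors / total
```

The midpoint threshold here is a simplification; the paper derives the optimal threshold for the projected patterns.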

  20. Estimating the probability for major gene Alzheimer disease

    SciTech Connect

Farrer, L.A. (Boston Univ. School of Public Health, Boston, MA); Cupples, L.A.

    1994-02-01

Alzheimer disease (AD) is a neuropsychiatric illness caused by multiple etiologies. Prediction of whether AD is genetically based in a given family is problematic because of censoring bias among unaffected relatives as a consequence of the late onset of the disorder, diagnostic uncertainties, heterogeneity, and limited information in a single family. The authors have developed a method based on Bayesian probability to compute values for a continuous variable that ranks AD families as having a major gene form of AD (MGAD). In addition, they have compared the Bayesian method with a maximum-likelihood approach. These methods incorporate sex- and age-adjusted risk estimates and allow for phenocopies and familial clustering of age of onset. Agreement is high between the two approaches for ranking families as MGAD (Spearman rank r = .92). When either method is used, the numerical outcomes are sensitive to assumptions about the gene frequency and cumulative incidence of the disease in the population. Consequently, risk estimates should be used cautiously for counseling purposes; however, there are numerous valid applications of these procedures in genetic and epidemiological studies. 41 refs., 4 figs., 3 tabs.

  1. Classification criteria and probability risk maps: limitations and perspectives.

    PubMed

    Saisana, Michaela; Dubois, Gregoire; Chaloulakou, Archontoula; Spyrellis, Nikolas

    2004-03-01

    Delineation of polluted zones with respect to regulatory standards, accounting at the same time for the uncertainty of the estimated concentrations, relies on classification criteria that can lead to significantly different pollution risk maps, which, in turn, can depend on the regulatory standard itself. This paper reviews four popular classification criteria related to the violation of a probability threshold or a physical threshold, using annual (1996-2000) nitrogen dioxide concentrations from 40 air monitoring stations in Milan. The relative advantages and practical limitations of each criterion are discussed, and it is shown that some of the criteria are more appropriate for the problem at hand and that the choice of the criterion can be supported by the statistical distribution of the data and/or the regulatory standard. Finally, the polluted area is estimated over the different years and concentration thresholds using the appropriate risk maps as an additional source of uncertainty. PMID:15046326

  2. Radiations in space: risk estimates.

    PubMed

    Fry, R J M

    2002-01-01

The complexity of radiation environments in space makes estimation of risks more difficult than for the protection of terrestrial populations. In deep space the duration of the mission, position in the solar cycle, number and size of solar particle events (SPE) and the spacecraft shielding are the major determinants of risk. In low-earth orbit missions there are the added factors of altitude and orbital inclination. Different radiation qualities such as protons and heavy ions, and secondary radiations inside the spacecraft such as neutrons of various energies, have to be considered. Radiation dose rates in space are low except for short periods during very large SPEs. Risk estimation for space activities is based on the human experience of exposure to gamma rays and, to a lesser extent, X rays. The doses of protons, heavy ions and neutrons are adjusted to take into account the relative biological effectiveness (RBE) of the different radiation types and thus derive equivalent doses. RBE values and factors to adjust for the effect of dose rate have to be obtained from experimental data. The influence of age and gender on the cancer risk is estimated from the data from atomic bomb survivors. Because of the large number of variables, the uncertainties in the probability of the effects are large. Information needed to improve the risk estimates includes: (1) risk of cancer induction by protons, heavy ions and neutrons; (2) influence of dose rate and protraction, particularly on potential tissue effects such as reduced fertility and cataracts; and (3) possible effects of heavy ions on the central nervous system. Risk cannot be eliminated and thus there must be a consensus on what level of risk is acceptable. PMID:12382925

  3. A Bayesian Estimator of Protein-Protein Association Probabilities

    SciTech Connect

    Gilmore, Jason M.; Auberry, Deanna L.; Sharp, Julia L.; White, Amanda M.; Anderson, Kevin K.; Daly, Don S.

    2008-07-01

    The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein pull-down LC-MS assay experiments. BEPro3 is open source software that runs on both Windows XP and Mac OS 10.4 or newer versions, and is freely available from http://www.pnl.gov/statistics/BEPro3.

  4. Injury Risk Estimation Expertise

    PubMed Central

    Petushek, Erich J.; Ward, Paul; Cokely, Edward T.; Myer, Gregory D.

    2015-01-01

    Background: Simple observational assessment of movement is a potentially low-cost method for anterior cruciate ligament (ACL) injury screening and prevention. Although many individuals utilize some form of observational assessment of movement, there are currently no substantial data on group skill differences in observational screening of ACL injury risk. Purpose/Hypothesis: The purpose of this study was to compare various groups’ abilities to visually assess ACL injury risk as well as the associated strategies and ACL knowledge levels. The hypothesis was that sports medicine professionals would perform better than coaches and exercise science academics/students and that these subgroups would all perform better than parents and other general population members. Study Design: Cross-sectional study; Level of evidence, 3. Methods: A total of 428 individuals, including physicians, physical therapists, athletic trainers, strength and conditioning coaches, exercise science researchers/students, athletes, parents, and members of the general public participated in the study. Participants completed the ACL Injury Risk Estimation Quiz (ACL-IQ) and answered questions related to assessment strategy and ACL knowledge. Results: Strength and conditioning coaches, athletic trainers, physical therapists, and exercise science students exhibited consistently superior ACL injury risk estimation ability (+2 SD) as compared with sport coaches, parents of athletes, and members of the general public. The performance of a substantial number of individuals in the exercise sciences/sports medicines (approximately 40%) was similar to or exceeded clinical instrument-based biomechanical assessment methods (eg, ACL nomogram). Parents, sport coaches, and the general public had lower ACL-IQ, likely due to their lower ACL knowledge and to rating the importance of knee/thigh motion lower and weight and jump height higher. Conclusion: Substantial cross-professional/group differences in visual ACL

  5. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  6. Probability estimation in arithmetic and adaptive-Huffman entropy coders.

    PubMed

    Duttweiler, D L; Chamzas, C

    1995-01-01

    Entropy coders, such as Huffman and arithmetic coders, achieve compression by exploiting nonuniformity in the probabilities under which a random variable to be coded takes on its possible values. Practical realizations generally require running adaptive estimates of these probabilities. An analysis of the relationship between estimation quality and the resulting coding efficiency suggests a particular scheme, dubbed scaled-count, for obtaining such estimates. It can optimally balance estimation accuracy against a need for rapid response to changing underlying statistics. When the symbols being coded are from a binary alphabet, simple hardware and software implementations requiring almost no computation are possible. A scaled-count adaptive probability estimator of the type described in this paper is used in the arithmetic coder of the JBIG and JPEG image coding standards. PMID:18289975
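The scaled-count idea can be sketched for a binary alphabet: keep two counts and rescale them whenever their total reaches a cap, trading estimation accuracy against responsiveness to changing statistics (the cap value and Laplace-style initial counts below are assumptions for illustration, not the JBIG/JPEG tables):

```python
class ScaledCountEstimator:
    """Scaled-count adaptive probability estimate for a binary source
    (a sketch of the general idea, not the standardized estimator):
    halving both counts at a cap progressively down-weights old
    observations, so the estimate tracks nonstationary statistics."""

    def __init__(self, cap=64):
        self.c = [1, 1]   # start from a uniform (Laplace) estimate
        self.cap = cap

    def p_one(self):
        return self.c[1] / (self.c[0] + self.c[1])

    def observe(self, bit):
        self.c[bit] += 1
        if self.c[0] + self.c[1] >= self.cap:
            # Rescale: halve both counts, keeping each at least 1 so
            # neither symbol's probability ever collapses to zero.
            self.c = [max(1, x >> 1) for x in self.c]
```

A small cap adapts quickly but estimates coarsely; a large cap estimates finely but reacts slowly, which is the accuracy-versus-response balance the paper analyzes.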

  7. Low-probability flood risk modeling for New York City.

    PubMed

    Aerts, Jeroen C J H; Lin, Ning; Botzen, Wouter; Emanuel, Kerry; de Moel, Hans

    2013-05-01

The devastating impact of Hurricane Sandy (2012) again showed New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low-lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative for the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 million and US$129 million per year. The damage caused by a 1/100-year storm surge is within a range of US$2 bn-5 bn, while this is between US$5 bn and 11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which resulted in variations in flood damage estimations. These uncertainties include: the interpolation of flood depths; the use of different flood damage curves; and the influence of the spectra of characteristics of the simulated hurricanes. PMID:23383711
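Exceedance probability-loss curves of the kind constructed in this study are typically integrated into an expected annual damage figure; a sketch with made-up numbers (not the paper's NYC estimates):

```python
def expected_annual_damage(curve):
    """Integrate an exceedance probability-loss curve by the trapezoid
    rule to get expected annual damage. `curve` is a list of
    (annual exceedance probability, loss) pairs; losses are assumed to
    decrease as exceedance probability increases."""
    pts = sorted(curve)  # ascending exceedance probability
    ead = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        ead += (p1 - p0) * 0.5 * (l0 + l1)
    return ead
```

For the illustrative curve [(0.01, 100.0), (0.1, 10.0), (1.0, 0.0)] the trapezoid sum is 0.09 * 55 + 0.9 * 5 = 9.45 per year; real curves need many points near the low-probability tail, which is exactly where this study concentrates its simulated storms.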

  8. Interval estimation of small tail probabilities - applications in food safety.

    PubMed

    Kedem, Benjamin; Pan, Lemeng; Zhou, Wen; Coelho, Carlos A

    2016-08-15

    Often in food safety and bio-surveillance it is desirable to estimate the probability that a contaminant or a function thereof exceeds an unsafe high threshold. The probability or chance in question is very small. To estimate such a probability, we need information about large values. In many cases, the data do not contain information about exceedingly large contamination levels, which ostensibly renders the problem insolvable. A solution is suggested whereby more information about small tail probabilities are obtained by combining the real data with computer-generated data repeatedly. This method provides short yet reliable interval estimates based on moderately large samples. An illustration is provided in terms of lead exposure data. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26891189

  9. 27% Probable: Estimating Whether or Not Large Numbers Are Prime.

    ERIC Educational Resources Information Center

    Bosse, Michael J.

    2001-01-01

    This brief investigation exemplifies such considerations by relating concepts from number theory, set theory, probability, logic, and calculus. Satisfying the call for students to acquire skills in estimation, the following technique allows one to "immediately estimate" whether or not a number is prime. (MM)

  10. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
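The learning rule described (a row-stochastic transition matrix driven by the state prediction error) can be sketched as a delta rule. The two-state chain, learning rate, and sequence length below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def learn_transitions(seq, n_states, alpha=0.1):
    """Estimate a transition matrix from an observed state sequence using
    a delta rule driven by the state prediction error."""
    # Row-stochastic estimate, initialized uniform; the delta update
    # preserves each row sum, so rows stay normalized.
    T = np.full((n_states, n_states), 1.0 / n_states)
    for s, s_next in zip(seq[:-1], seq[1:]):
        target = np.zeros(n_states)
        target[s_next] = 1.0
        T[s] += alpha * (target - T[s])  # prediction-error update
    return T

# Toy two-state chain to exercise the rule.
rng = np.random.default_rng(0)
true_T = np.array([[0.9, 0.1], [0.3, 0.7]])
seq = [0]
for _ in range(5000):
    seq.append(rng.choice(2, p=true_T[seq[-1]]))
T_hat = learn_transitions(seq, 2, alpha=0.05)
```

With a small learning rate the estimate hovers near the true matrix; a larger rate tracks nonstationary probabilities faster at the cost of more variance.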

  11. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
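The Poisson-Gamma conjugate model makes the posterior for the rate λ available in closed form: with a Gamma(a, b) prior and n events observed in a window of length t, the posterior is Gamma(a + n, b + t). A minimal sketch, with assumed hyperparameters and counts (not the Santa Barbara or Port Valdez data):

```python
import numpy as np

def landslide_rate_posterior(n_events, t_window, a=1.0, b=1.0,
                             n_draws=100_000, seed=0):
    """Posterior mean and 95% credible interval for a Poisson rate λ,
    given n_events in t_window, under an assumed Gamma(a, b) prior."""
    rng = np.random.default_rng(seed)
    post_shape, post_rate = a + n_events, b + t_window
    # Sample the Gamma posterior (numpy parameterizes by shape, scale).
    draws = rng.gamma(post_shape, 1.0 / post_rate, n_draws)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    return post_shape / post_rate, (lo, hi)

# Hypothetical record: 4 dated landslides in a 10 kyr window.
mean, (lo, hi) = landslide_rate_posterior(n_events=4, t_window=10.0)
```

The interval width is the "range of uncertainty" the abstract refers to; it shrinks as the observation window and event count grow.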

  12. Bayesian Estimator of Protein-Protein Association Probabilities

    Energy Science and Technology Software Center (ESTSC)

    2008-05-28

    The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein LC-MS/MS affinity isolation experiments. BEPro3 is public domain software, has been tested on Windows XP and on Mac OS X version 10.4 or newer, and is freely available. A user guide, an example dataset with analysis, and additional documentation are included with the BEPro3 download.

  13. An application of recurrent nets to phone probability estimation.

    PubMed

    Robinson, A J

    1994-01-01

    This paper presents an application of recurrent networks for phone probability estimation in large vocabulary speech recognition. The need for efficient exploitation of context information is discussed; a role for which the recurrent net appears suitable. An overview of early developments of recurrent nets for phone recognition is given along with the more recent improvements that include their integration with Markov models. Recognition results are presented for the DARPA TIMIT and Resource Management tasks, and it is concluded that recurrent nets are competitive with traditional means for performing phone probability estimation. PMID:18267798

  14. Probability Estimation of CO2 Leakage Through Faults at Geologic Carbon Sequestration Sites

    SciTech Connect

    Zhang, Yingqi; Oldenburg, Curt; Finsterle, Stefan; Jordan, Preston; Zhang, Keni

    2008-11-01

    Leakage of CO2 and brine along faults at geologic carbon sequestration (GCS) sites is a primary concern for storage integrity. The focus of this study is on the estimation of the probability of leakage along faults or fractures. This leakage probability is controlled by the probability of a connected network of conduits existing at a given site, the probability of this network encountering the CO2 plume, and the probability of this network intersecting environmental resources that may be impacted by leakage. This work is designed to fit into a risk assessment and certification framework that uses compartments to represent vulnerable resources such as potable groundwater, health and safety, and the near-surface environment. The method we propose includes using percolation theory to estimate the connectivity of the faults, and generating fuzzy rules from discrete fracture network simulations to estimate leakage probability. By this approach, the probability of CO2 escaping into a compartment for a given system can be inferred from the fuzzy rules. The proposed method provides a quick way of estimating the probability of CO2 or brine leaking into a compartment. In addition, it provides the uncertainty range of the estimated probability.

  15. Estimating Second Order Probability Beliefs from Subjective Survival Data

    PubMed Central

    Hudomiet, Péter; Willis, Robert J.

    2013-01-01

    Based on subjective survival probability questions in the Health and Retirement Study (HRS), we use an econometric model to estimate the determinants of individual-level uncertainty about personal longevity. This model is built around the modal response hypothesis (MRH), a mathematical expression of the idea that survey responses of 0%, 50%, or 100% to probability questions indicate a high level of uncertainty about the relevant probability. We show that subjective survival expectations in 2002 line up very well with realized mortality of the HRS respondents between 2002 and 2010. We show that the MRH model performs better than the models typically used in the literature on subjective probabilities. Our model gives more accurate estimates of low-probability events and is able to predict the unusually high fraction of focal 0%, 50%, and 100% answers observed in many data sets on subjective probabilities. We show that subjects place too much weight on parents’ age at death when forming expectations about their own longevity, whereas other covariates such as demographics, cognition, personality, subjective health, and health behavior are underweighted. We also find that less educated people, smokers, and women have less certain beliefs, and that recent health shocks increase uncertainty about survival. PMID:24403866

  16. Simulation and Estimation of Extreme Quantiles and Extreme Probabilities

    SciTech Connect

    Guyader, Arnaud; Hengartner, Nicolas; Matzner-Lober, Eric

    2011-10-15

    Let X be a random vector with distribution μ on ℝ^d and Φ be a mapping from ℝ^d to ℝ. That mapping acts as a black box, e.g., the result from some computer experiments for which no analytical expression is available. This paper presents an efficient algorithm to estimate a tail probability given a quantile, or a quantile given a tail probability. The algorithm improves upon existing multilevel splitting methods and can be analyzed using Poisson process tools that lead to an exact description of the distribution of the estimated probabilities and quantiles. The performance of the algorithm is demonstrated in a problem related to digital watermarking.
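A toy version of the splitting idea, in the "last particle" variant, can be sketched for a case where the answer is known: standard normal input and identity Φ, so the true tail probability P(X > 3) is about 1.35e-3. The particle count, Metropolis kernel, and step size are illustrative choices, not the paper's implementation:

```python
import numpy as np

def last_particle_tail(phi, sample_prior, q, n=500, mcmc_steps=20,
                       step=0.5, seed=0):
    """Estimate p = P(phi(X) > q) by 'last particle' splitting: at each
    iteration the worst particle is discarded and regrown from a random
    survivor with a Metropolis kernel conditioned to exceed the current
    level; each discard multiplies the estimate by (1 - 1/n)."""
    rng = np.random.default_rng(seed)
    x = sample_prior(rng, n)
    scores = np.array(phi(x), dtype=float)
    iters = 0
    while scores.min() < q:
        i = scores.argmin()
        level = scores[i]
        j = rng.integers(n)
        while j == i:
            j = rng.integers(n)
        y = x[j]
        # Random-walk Metropolis targeting N(0,1) truncated above `level`.
        for _ in range(mcmc_steps):
            prop = y + step * rng.normal()
            if phi(prop) > level and rng.random() < np.exp(0.5 * (y * y - prop * prop)):
                y = prop
        x[i], scores[i] = y, phi(y)
        iters += 1
    return (1.0 - 1.0 / n) ** iters

p_hat = last_particle_tail(lambda v: v,
                           lambda rng, n: rng.normal(size=n), q=3.0)
```

For fixed n the relative error grows only like sqrt(log(1/p)), which is what makes splitting practical for very small p where naive Monte Carlo fails.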

  17. Using Correlation to Compute Better Probability Estimates in Plan Graphs

    NASA Technical Reports Server (NTRS)

    Bryce, Daniel; Smith, David E.

    2006-01-01

    Plan graphs are commonly used in planning to help compute heuristic "distance" estimates between states and goals. A few authors have also attempted to use plan graphs in probabilistic planning to compute estimates of the probability that propositions can be achieved and actions can be performed. This is done by propagating probability information forward through the plan graph from the initial conditions through each possible action to the action effects, and hence to the propositions at the next layer of the plan graph. The problem with these calculations is that they make very strong independence assumptions - in particular, they usually assume that the preconditions for each action are independent of each other. This can lead to gross overestimates in probability when the plans for those preconditions interfere with each other. It can also lead to gross underestimates of probability when there is synergy between the plans for two or more preconditions. In this paper we introduce a notion of the binary correlation between two propositions and actions within a plan graph, show how to propagate this information within a plan graph, and show how this improves probability estimates for planning. This notion of correlation can be thought of as a continuous generalization of the notion of mutual exclusion (mutex) often used in plan graphs. At one extreme (correlation = 0) two propositions or actions are completely mutex. With correlation = 1, two propositions or actions are independent, and with correlation > 1, two propositions or actions are synergistic. Intermediate values can and do occur, indicating different degrees to which propositions and actions interfere or are synergistic. We compare this approach with another recent approach by Bryce that computes probability estimates using Monte Carlo simulation of possible worlds in plan graphs.
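The role of the correlation factor can be shown with a toy calculation; `joint_prob` is a hypothetical helper illustrating the idea, not the paper's propagation procedure:

```python
def joint_prob(p_a, p_b, corr):
    """Joint achievement probability of two preconditions, scaled by a
    pairwise correlation factor: corr = 0 means mutex, corr = 1 means
    independent, corr > 1 means synergistic.  Hypothetical helper."""
    return min(1.0, corr * p_a * p_b)

# An action with two preconditions, each achievable with probability 0.6:
print(f"mutex:       {joint_prob(0.6, 0.6, 0.0):.3f}")  # action impossible
print(f"independent: {joint_prob(0.6, 0.6, 1.0):.3f}")  # naive product
print(f"synergistic: {joint_prob(0.6, 0.6, 1.4):.3f}")  # plans overlap
```

The independence assumption corresponds to always using corr = 1, which is exactly what over- or underestimates the probability when the precondition plans interfere or share structure.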

  18. Failure probability estimate of Type 304 stainless steel piping

    SciTech Connect

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S. (General Electric Co., San Jose, CA)

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC, (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination, (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage, and (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break.
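The four-factor decomposition is a simple product of conditional probabilities; the values below are placeholders chosen to show the shape of the calculation, not the paper's estimates:

```python
# Hypothetical factor values, one per condition in the abstract;
# the paper's actual estimates are not reproduced here.
p_igscc    = 1e-2   # (1) weld HAZ contains IGSCC
p_miss_ut  = 1e-1   # (2) crack escapes detection during UT examination
p_no_leak  = 1e-1   # (3) crack grows without a detected through-wall leak
p_unstable = 1e-2   # (4) crack reaches instability before the next UT exam

break_freq = p_igscc * p_miss_ut * p_no_leak * p_unstable
print(f"large-break frequency ~ {break_freq:.0e} per weld")
```

Because the conditions must all coexist, the product is driven down by every factor; this multiplicative structure is why the evaluated break frequency comes out extremely low.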

  19. Revising probability estimates: Why increasing likelihood means increasing impact.

    PubMed

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision: as they move upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events, despite the revised event probabilities being the same. Our research sheds light on how revising the probabilities of future events changes how people manage those uncertain events. (PsycINFO Database Record) PMID:27281350

  20. Estimating transition probabilities in unmarked populations --entropy revisited

    USGS Publications Warehouse

    Cooch, E.G.; Link, W.A.

    1999-01-01

    The probability of surviving and moving between 'states' is of great interest to biologists. Robust estimation of these transitions using multiple observations of individually identifiable marked individuals has received considerable attention in recent years. However, in some situations, individuals are not identifiable (or have a very low recapture rate), although all individuals in a sample can be assigned to a particular state (e.g. breeding or non-breeding) without error. In such cases, only aggregate data (number of individuals in a given state at each occasion) are available. If the underlying matrix of transition probabilities does not vary through time and aggregate data are available for several time periods, then it is possible to estimate these parameters using least-squares methods. Even when such data are available, this assumption of stationarity will usually be deemed overly restrictive and, frequently, data will only be available for two time periods. In these cases, the problem reduces to estimating the most likely matrix (or matrices) leading to the observed frequency distribution of individuals in each state. An entropy maximization approach has been previously suggested. In this paper, we show that the entropy approach rests on a particular limiting assumption, and does not provide estimates of latent population parameters (the transition probabilities), but rather predictions of realized rates.
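Under the stationarity assumption, the least-squares step reduces to solving X P = Y, where the rows of X and Y are state-fraction vectors at successive occasions. A minimal sketch with a synthetic two-state matrix (not the entropy approach the paper critiques):

```python
import numpy as np

# Aggregate state fractions over several occasions, generated here from
# an assumed stationary transition matrix purely for illustration.
P_true = np.array([[0.8, 0.2],
                   [0.4, 0.6]])
x = [np.array([0.5, 0.5])]
for _ in range(5):
    x.append(x[-1] @ P_true)

# Stack consecutive occasions:  X P = Y  under stationarity.
X, Y = np.vstack(x[:-1]), np.vstack(x[1:])
P_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

With noise-free aggregate data the solve recovers the matrix exactly; with only two occasions the system is underdetermined, which is the situation where the entropy-style predictions (rather than estimates of the latent transition probabilities) come into play.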

  1. On estimating the fracture probability of nuclear graphite components

    NASA Astrophysics Data System (ADS)

    Srinivasan, Makuteswara

    2008-10-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
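A two-parameter Weibull risk-of-rupture calculation can be sketched as follows; the characteristic strength and Weibull modulus are assumed illustrative values, not graphite specification data:

```python
import math

def fracture_probability(stress, sigma0, m):
    """Two-parameter Weibull risk of rupture at a given tensile stress.
    sigma0 is the characteristic strength and m the Weibull modulus
    (both assumed values here); reliability is the complement."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Illustrative service stresses against an assumed sigma0 = 20, m = 10.
for s in (5.0, 10.0, 15.0):
    pf = fracture_probability(s, sigma0=20.0, m=10.0)
    print(f"stress {s:4.1f}  P_f = {pf:.2e}  reliability = {1 - pf:.6f}")
```

A high Weibull modulus makes the fracture probability extremely sensitive to stress near sigma0, which is why block-to-block property variability matters so much in the risk estimate.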

  2. Risks and probabilities of breast cancer: short-term versus lifetime probabilities.

    PubMed Central

    Bryant, H E; Brasher, P M

    1994-01-01

    OBJECTIVE: To calculate age-specific short-term and lifetime probabilities of breast cancer among a cohort of Canadian women. DESIGN: Double decrement life table. SETTING: Alberta. SUBJECTS: Women with first invasive breast cancers registered with the Alberta Cancer Registry between 1985 and 1987. MAIN OUTCOME MEASURES: Lifetime probability of breast cancer from birth and for women at various ages; short-term (up to 10 years) probability of breast cancer for women at various ages. RESULTS: The lifetime probability of breast cancer is 10.17% at birth and peaks at 10.34% at age 25 years, after which it decreases owing to a decline in the number of years over which breast cancer risk will be experienced. However, the probability of manifesting breast cancer in the next year increases steadily from the age of 30 onward, reaching 0.36% at 85 years. The probability of manifesting the disease within the next 10 years peaks at 2.97% at age 70 and decreases thereafter, again owing to declining probabilities of surviving the interval. CONCLUSIONS: Given that the incidence of breast cancer among Albertan women during the study period was similar to the national average, we conclude that currently more than 1 in 10 women in Canada can expect to have breast cancer at some point during their life. However, risk varies considerably over a woman's lifetime, with most risk concentrated after age 49. On the basis of the shorter-term age-specific risks that we present, the clinician can put breast cancer risk into perspective for younger women and heighten awareness among women aged 50 years or more. PMID:8287343
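A double decrement life table accumulates the probability of a first event while decrementing the at-risk population by both causes. A toy sketch with assumed flat rates, not the Alberta registry data:

```python
def k_year_probability(age, k, incidence, mortality):
    """Probability of a first diagnosis within k years of `age`, with
    death from other causes as a competing decrement.  Rates are assumed
    per-year probabilities indexed by age (toy values, not registry data)."""
    alive_free, prob = 1.0, 0.0
    for a in range(age, age + k):
        prob += alive_free * incidence[a]                 # diagnosed this year
        alive_free *= 1.0 - incidence[a] - mortality[a]   # survives both decrements
    return prob

# Flat toy rates: 0.3% incidence and 1% other-cause mortality per year.
inc = {a: 0.003 for a in range(120)}
mort = {a: 0.010 for a in range(120)}
p10 = k_year_probability(70, 10, inc, mort)
```

With age-increasing mortality, the ten-year probability eventually declines at older ages even as the one-year rate rises, which is exactly the pattern the abstract reports.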

  3. Estimating probable flaw distributions in PWR steam generator tubes

    SciTech Connect

    Gorman, J.A.; Turner, A.P.L.

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  4. Estimating the exceedance probability of rain rate by logistic regression

    NASA Technical Reports Server (NTRS)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
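A logistic exceedance model can be sketched with a hand-rolled gradient fit; the covariate and threshold indicator below are synthetic stand-ins, not the radiometer data, and partial likelihood for dependent observations is not reproduced here:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent fit of a logistic model; X must include an
    intercept column."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Toy covariate (think: area-average rain rate) and exceedance indicator
# generated from an assumed true model with slope 2, intercept -1.
rng = np.random.default_rng(1)
z = rng.normal(size=400)
y = (rng.random(400) < 1 / (1 + np.exp(-(2 * z - 1)))).astype(float)
X = np.column_stack([np.ones_like(z), z])
w = fit_logistic(X, y)

# Conditional exceedance probability at covariate value z = 0.5.
p_exceed = 1 / (1 + np.exp(-(w @ [1.0, 0.5])))
```

The fitted model answers exactly the question in the abstract: the conditional probability that rain rate exceeds the threshold given the covariates, rather than a regression on rain rate itself.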

  5. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    PubMed

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. PMID:26269258
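Nonlinear probability weighting of the kind the study measures is commonly modeled with a one-parameter Prelec function; this is an assumption for illustration, not the paper's estimated functional form:

```python
import math

def prelec_w(p, gamma=0.65):
    """One-parameter Prelec probability-weighting function,
    w(p) = exp(-(-ln p)^gamma).  gamma < 1 gives the inverse-S shape
    (small probabilities overweighted, large ones underweighted);
    gamma = 0.65 is an assumed, typical value."""
    return math.exp(-((-math.log(p)) ** gamma))

for p in (0.01, 0.5, 0.99):
    print(f"p = {p:4.2f}  w(p) = {prelec_w(p):.3f}")
```

The fixed point at p = 1/e and the inverse-S shape are what make low-probability "bad" outcomes loom larger than their stated chances, consistent with the choices the managers exhibit.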

  6. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    SciTech Connect

    Clark, G A

    2004-09-21

    The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector to apply Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or probability of false alarm P_FA, of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P_D. This is summarized by the Receiver Operating Characteristic (ROC) curve [10, 11], which is actually a family of curves depicting P_D vs. P_FA, parameterized by varying levels of signal-to-noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P_FA and develop a ROC curve (P_D vs. decision threshold r_0) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB
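The CFAR threshold step, fitting a background density and inverting it at the desired false-alarm rate, can be sketched for an assumed Gaussian background; the actual work estimates the density from hyperspectral data rather than assuming its form:

```python
from statistics import NormalDist
import random

# Simulated matched-filter outputs for background clutter
# (Gaussian here by assumption, for illustration only).
random.seed(0)
background = [random.gauss(0.0, 1.0) for _ in range(10_000)]

def cfar_threshold(samples, p_fa):
    """Fit a normal density to the background samples and return the
    decision threshold r0 that yields the requested false-alarm rate."""
    fit = NormalDist.from_samples(samples)
    return fit.inv_cdf(1.0 - p_fa)

r0 = cfar_threshold(background, p_fa=1e-3)
rate = sum(x > r0 for x in background) / len(background)
```

Any scores above r0 are declared detections; by construction the background exceeds the threshold at roughly the specified P_FA, which is the "constant false alarm rate" property.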

  7. Estimation of the probability of success in petroleum exploration

    USGS Publications Warehouse

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum
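The empirical conditional probability at the core of the model can be sketched directly from a training-area well history; the counts below are invented for illustration:

```python
def success_given_geology(wells):
    """Empirical P(discovery | geologic condition present/absent) from a
    training area.  wells: list of (has_closure, discovery) booleans, a
    toy stand-in for the historical control data the abstract describes."""
    hits = [d for c, d in wells if c]
    misses = [d for c, d in wells if not c]
    return sum(hits) / len(hits), sum(misses) / len(misses)

# Hypothetical history: 30 wells on structural closure, 30 off it.
wells = ([(True, True)] * 12 + [(True, False)] * 18 +
         [(False, True)] * 3 + [(False, False)] * 27)
p_with, p_without = success_given_geology(wells)
print(p_with, p_without)   # → 0.4 0.1
```

Transferring these conditional rates to the exploration area then requires the second component of the model, the uncertainty in perceiving the geologic condition away from well control, which this sketch omits.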

  8. Estimating transition probabilities among everglades wetland communities using multistate models

    USGS Publications Warehouse

    Hotaling, A.S.; Martin, J.; Kitchens, W.M.

    2009-01-01

    In this study we were able to provide the first estimates of transition probabilities of wet prairie and slough vegetative communities in Water Conservation Area 3A (WCA3A) of the Florida Everglades and to identify the hydrologic variables that determine these transitions. These estimates can be used in management models aimed at restoring proportions of wet prairie and slough habitats to historical levels in the Everglades. To determine what was driving the transitions between wet prairie and slough communities we evaluated three hypotheses: seasonality, impoundment, and wet and dry year cycles, using likelihood-based multistate models to determine the main driver of wet prairie conversion in WCA3A. The most parsimonious model included the effect of wet and dry year cycles on vegetative community conversions. Several ecologists have noted wet prairie conversion in southern WCA3A, but these are the first estimates of transition probabilities among these community types. In addition to being useful for management of the Everglades, we believe that our framework can be used to address management questions in other ecosystems. © 2009 The Society of Wetland Scientists.

  9. Image-based camera motion estimation using prior probabilities

    NASA Astrophysics Data System (ADS)

    Sargent, Dusty; Park, Sun Young; Spofford, Inbar; Vosburgh, Kirby

    2011-03-01

    Image-based camera motion estimation from video or still images is a difficult problem in the field of computer vision. Many algorithms have been proposed for estimating intrinsic camera parameters, detecting and matching features between images, calculating extrinsic camera parameters based on those features, and optimizing the recovered parameters with nonlinear methods. These steps in the camera motion inference process all face challenges in practical applications: locating distinctive features can be difficult in many types of scenes given the limited capabilities of current feature detectors, camera motion inference can easily fail in the presence of noise and outliers in the matched features, and the error surfaces in optimization typically contain many suboptimal local minima. The problems faced by these techniques are compounded when they are applied to medical video captured by an endoscope, which presents further challenges such as non-rigid scenery and severe barrel distortion of the images. In this paper, we study these problems and propose the use of prior probabilities to stabilize camera motion estimation for the application of computing endoscope motion sequences in colonoscopy. Colonoscopy presents a special case for camera motion estimation in which it is possible to characterize typical motion sequences of the endoscope. As the endoscope is restricted to move within a roughly tube-shaped structure, forward/backward motion is expected, with only small amounts of rotation and horizontal movement. We formulate a probabilistic model of endoscope motion by maneuvering an endoscope and attached magnetic tracker through a synthetic colon model and fitting a distribution to the observed motion of the magnetic tracker. This model enables us to estimate the probability of the current endoscope motion given previously observed motion in the sequence. We add these prior probabilities into the camera motion calculation as an additional penalty term in RANSAC

  10. Probabilities and statistics for backscatter estimates obtained by a scatterometer

    NASA Technical Reports Server (NTRS)

    Pierson, Willard J., Jr.

    1989-01-01

    Methods for the recovery of winds near the surface of the ocean from measurements of the normalized radar backscattering cross section must recognize and make use of the statistics (i.e., the sampling variability) of the backscatter measurements. Radar backscatter values from a scatterometer are random variables with expected values given by a model. A model relates backscatter to properties of the waves on the ocean, which are in turn generated by the winds in the atmospheric marine boundary layer. The effective wind speed and direction at a known height for a neutrally stratified atmosphere are the values to be recovered from the model. The probability density function for the backscatter values is a normal probability distribution with the notable feature that the variance is a known function of the expected value. The sources of signal variability, the effects of this variability on the wind speed estimation, and criteria for the acceptance or rejection of models are discussed. A modified maximum likelihood method for estimating wind vectors is described. Ways to make corrections for the kinds of errors found for the Seasat SASS model function are described, and applications to a new scatterometer are given.

  12. Thinking Concretely Increases the Perceived Likelihood of Risks: The Effect of Construal Level on Risk Estimation.

    PubMed

    Lermer, Eva; Streicher, Bernhard; Sachs, Rainer; Raue, Martina; Frey, Dieter

    2016-03-01

    Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking. We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2). We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking, and whether the often-found underestimation of large probabilities could be reduced with concrete thinking. The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities. This suggests that CL manipulation can indeed be used for improving the accuracy of lay people's estimates of small and large probabilities. Moreover, the results suggest that professional risk managers' risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset. However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates' accuracy between lay people and risk managers are discussed. PMID:26111548

  13. Hitchhikers on trade routes: A phenology model estimates the probabilities of gypsy moth introduction and establishment.

    PubMed

    Gray, David R

    2010-12-01

    As global trade increases so too does the probability of introduction of alien species to new locations. Estimating the probability of an alien species introduction and establishment following introduction is a necessary step in risk estimation (probability of an event times the consequences, in the currency of choice, of the event should it occur); risk estimation is a valuable tool for reducing the risk of biological invasion with limited resources. The Asian gypsy moth, Lymantria dispar (L.), is a pest species whose consequence of introduction and establishment in North America and New Zealand warrants over US$2 million per year in surveillance expenditure. This work describes the development of a two-dimensional phenology model (GLS-2d) that simulates insect development from source to destination and estimates: (1) the probability of introduction from the proportion of the source population that would achieve the next developmental stage at the destination and (2) the probability of establishment from the proportion of the introduced population that survives until a stable life cycle is reached at the destination. The effect of shipping schedule on the probabilities of introduction and establishment was examined by varying the departure date from 1 January to 25 December by weekly increments. The effect of port efficiency was examined by varying the length of time that invasion vectors (shipping containers and ship) were available for infection. The application of GLS-2d is demonstrated using three common marine trade routes (to Auckland, New Zealand, from Kobe, Japan, and to Vancouver, Canada, from Kobe and from Vladivostok, Russia). PMID:21265459
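    A toy version of the phenology idea behind GLS-2d: take the introduction probability as the fraction of the arriving population that can complete its next developmental stage under destination temperatures. The degree-day requirements and climate below are invented, not gypsy moth biology or the GLS-2d model.

```python
import numpy as np

# Invented individual variation in degree-days needed for the next stage:
rng = np.random.default_rng(5)
required_dd = rng.normal(750.0, 100.0, size=10_000)

# Destination climate: 90 days of temperatures, degree-days above a 10 C base.
daily_temp = rng.normal(18.0, 4.0, size=90)
daily_dd = np.clip(daily_temp - 10.0, 0.0, None)
available_dd = daily_dd.sum()

# Fraction of the population that accumulates enough degree-days:
p_introduction = float(np.mean(required_dd <= available_dd))
```

    Shifting the arrival date shifts which temperatures the insects experience, which is why the departure-date sweep in the study changes the estimated probabilities.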

  14. Site Specific Probable Maximum Precipitation Estimates and Professional Judgement

    NASA Astrophysics Data System (ADS)

    Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.

    2015-12-01

    State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized for their limitations on basin size, questionable applicability in regions affected by orographic effects, lack of consistent methods, and general age. HMR-51, which provides generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on commercially developed site-specific PMP estimates. As such, NRC has recently investigated key areas of expert judgement, via a generic audit and one in-depth site-specific review, as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of the HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment, developed for the Electric Power Research Institute by North American Weather Consultants in 1993, in order to harmonize historic storms for which only 12-hour dew point data were available with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially

  15. Exaggerated Risk: Prospect Theory and Probability Weighting in Risky Choice

    ERIC Educational Resources Information Center

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-01-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and…

  16. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system, taking into account uncertainties of design variables; common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both are two of the 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
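    The MC-versus-LHS comparison can be sketched outside any FEA context. The nonlinear response function and sample sizes below are invented; NESSUS wraps the same idea around structural models.

```python
import numpy as np
from scipy.stats import norm, qmc

# Estimate density parameters of a nonlinear response of two N(0,1) inputs.
def response(x):
    return x[:, 0] ** 2 + 3.0 * x[:, 1]   # invented response function

n = 2_000
rng = np.random.default_rng(1)

# Plain Monte Carlo: independent standard normal draws.
y_mc = response(rng.standard_normal((n, 2)))

# LHS: stratified uniform samples pushed through the normal inverse CDF.
u = qmc.LatinHypercube(d=2, seed=1).random(n)
y_lhs = response(norm.ppf(u))

# Density parameters of interest: mean, standard deviation, 95th percentile.
mean_mc, mean_lhs = y_mc.mean(), y_lhs.mean()
std_mc, std_lhs = y_mc.std(), y_lhs.std()
p95_mc, p95_lhs = np.percentile(y_mc, 95), np.percentile(y_lhs, 95)
```

    For the same sample count, the stratification in LHS typically yields lower-variance estimates of the mean and percentiles, which is the motivation for adding it alongside MC.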

  17. Development of an integrated system for estimating human error probabilities

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  18. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin? PMID:12804255

  19. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    NASA Astrophysics Data System (ADS)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best fit distribution models for the annual, seasonal and monthly time series, based on maximum rank with minimum value of the test statistics, three statistical goodness of fit tests, viz. the Kolmogorov-Smirnov test (K-S), Anderson-Darling test (A²) and Chi-square test (χ²), were employed. The best fit probability distribution was identified from the highest overall score obtained from the three goodness of fit tests. Results revealed that the normal probability distribution was best fitted for the annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3 % levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85 %) for MDR of >100 mm and moderate probabilities (37 to 46 %) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
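    The fit / goodness-of-fit / return-period chain used here can be sketched on a synthetic 29-year series. The rainfall values below are invented, not the Sagar Island record, and only the normal candidate and the K-S test are shown.

```python
import numpy as np
from scipy import stats

# Synthetic annual maximum daily rainfall (mm), one value per year 1982-2010:
rng = np.random.default_rng(2)
mdr = rng.normal(loc=80.0, scale=25.0, size=29)

# Fit a candidate distribution and test it (K-S here; A-D is analogous):
loc, scale = stats.norm.fit(mdr)
ks_stat, p_value = stats.kstest(mdr, "norm", args=(loc, scale))

# T-year return level = the (1 - 1/T) quantile of the fitted distribution:
def t_year_mdr(T):
    return stats.norm.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

mdr_25 = t_year_mdr(25)   # 25-year return level

# Exceedance probability of a given depth, e.g. P(MDR > 100 mm):
p_exceed_100 = stats.norm.sf(100.0, loc=loc, scale=scale)
```

    One caveat worth noting: K-S p-values computed with parameters fitted from the same data are optimistic, which is one reason the study cross-checks several goodness-of-fit tests.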

  20. Estimating Terrorist Risk with Possibility Theory

    SciTech Connect

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice it to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology] Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as

  1. Semi-supervised dimensionality reduction using estimated class membership probabilities

    NASA Astrophysics Data System (ADS)

    Li, Wei; Ruan, Qiuqi; Wan, Jun

    2012-10-01

    In solving pattern-recognition tasks with partially labeled training data, the semi-supervised dimensionality reduction method, which considers both labeled and unlabeled data, is preferable for improving the classification and generalization capability of the testing data. Among such techniques, graph-based semi-supervised learning methods have attracted a lot of attention due to their appealing properties in discovering discriminative structure and geometric structure of data points. Although they have achieved remarkable success, they cannot promise good performance when the size of the labeled data set is small, as a result of inaccurate class matrix variance approximated by insufficient labeled training data. In this paper, we tackle this problem by combining class membership probabilities estimated from unlabeled data and ground-truth class information associated with labeled data to more precisely characterize the class distribution. Therefore, it is expected to enhance performance in classification tasks. We refer to this approach as probabilistic semi-supervised discriminant analysis (PSDA). The proposed PSDA is applied to face and facial expression recognition tasks and is evaluated using the ORL, Extended Yale B, and CMU PIE face databases and the Cohn-Kanade facial expression database. The promising experimental results demonstrate the effectiveness of our proposed method.

  2. Structural health monitoring and probability of detection estimation

    NASA Astrophysics Data System (ADS)

    Forsyth, David S.

    2016-02-01

    Structural health monitoring (SHM) methods are often based on nondestructive testing (NDT) sensors and are often proposed as replacements for NDT to lower cost and/or improve reliability. In order to take advantage of SHM for life cycle management, it is necessary to determine the Probability of Detection (POD) of the SHM system just as for traditional NDT to ensure that the required level of safety is maintained. Many different possibilities exist for SHM systems, but one of the attractive features of SHM versus NDT is the ability to take measurements very simply after the SHM system is installed. Using a simple statistical model of POD, some authors have proposed that very high rates of SHM system data sampling can result in high effective POD even in situations where an individual test has low POD. In this paper, we discuss the theoretical basis for determining the effect of repeated inspections, and examine data from SHM experiments against this framework to show how the effective POD from multiple tests can be estimated.
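    The repeated-inspection model the paper examines reduces to one line: if each of n inspections independently detects a flaw with probability p, the effective POD is 1 - (1 - p)^n. The numbers below are illustrative; the independence assumption is exactly what the paper tests against experimental SHM data, since correlated noise or a systematic miss makes this formula optimistic.

```python
# Effective POD from n statistically independent inspections.
def effective_pod(p, n):
    return 1.0 - (1.0 - p) ** n

# A single low-POD SHM measurement, repeated daily for a month:
pod_single = 0.30
pod_month = effective_pod(pod_single, 30)
```

    Under independence even a 30% single-test POD exceeds 0.9999 after 30 readings, which is why the claim deserves the scrutiny the paper gives it.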

  3. Estimation of capture probabilities using generalized estimating equations and mixed effects approaches

    PubMed Central

    Akanda, Md Abdus Salam; Alpizar-Jara, Russell

    2014-01-01

    Modeling individual heterogeneity in capture probabilities has been one of the most challenging tasks in capture–recapture studies. Heterogeneity in capture probabilities can be modeled as a function of individual covariates, but the correlation structure among capture occasions should be taken into account. Proposed generalized estimating equations (GEE) and generalized linear mixed modeling (GLMM) approaches can be used to estimate capture probabilities and population size for capture–recapture closed population models. An example is used for an illustrative application and for comparison with currently used methodology. A simulation study is also conducted to show the performance of the estimation procedures. Our simulation results show that the proposed quasi-likelihood based GEE approach provides lower SE than partial likelihood based on either generalized linear models (GLM) or GLMM approaches for estimating population size in a closed capture–recapture experiment. Estimator performance is good if a large proportion of individuals are captured. For cases where only a small proportion of individuals are captured, the estimates become unstable, but the GEE approach outperforms the other methods. PMID:24772290
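    The step from estimated capture probabilities to a population size can be sketched with a Horvitz-Thompson-style estimator. Here the heterogeneous capture probabilities are treated as known; in the paper they would be estimated from covariates via GEE or GLMM. All numbers are simulated.

```python
import numpy as np

# Closed-population study: N_true individuals, k capture occasions.
rng = np.random.default_rng(3)
N_true, k = 500, 5
p_i = rng.uniform(0.2, 0.6, size=N_true)    # heterogeneous capture probs.

captures = rng.random((N_true, k)) < p_i[:, None]   # k-occasion histories
seen = captures.any(axis=1)                          # observed individuals

# Inclusion probability: captured on at least one of the k occasions.
pi_seen = 1.0 - (1.0 - p_i[seen]) ** k

# Horvitz-Thompson-style population size estimate:
N_hat = float(np.sum(1.0 / pi_seen))
```

    The estimator is accurate when most individuals are captured at least once, and degrades as inclusion probabilities shrink, mirroring the simulation finding quoted above.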

  4. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values are usually yielded for clustered events, such as events with foreshocks or sequences of events occurring within a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.

  5. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization

    ERIC Educational Resources Information Center

    Quillian, Lincoln; Pager, Devah

    2010-01-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic…

  6. Estimating earthquake-induced failure probability and downtime of critical facilities.

    PubMed

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities - a primary and backup - would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, seismic risk of both the failure probability and probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results validated in several ways. PMID:22576139
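    The primary-and-backup question reduces to simple top-of-the-tree arithmetic once the conditional failure probabilities are in hand. All values below are hypothetical placeholders, not numbers from the California data-centre study, and the beta-factor treatment is a standard common-cause device rather than the paper's specific method.

```python
# Hypothetical annual scenario-earthquake probability and conditional
# failure probabilities for the two facilities:
p_quake = 0.02
p_fail_primary = 0.10
p_fail_backup = 0.08

# If the facilities fail conditionally independently given the shaking:
p_both = p_quake * p_fail_primary * p_fail_backup   # per year

# A beta-factor common-cause adjustment (shared components, shared siting)
# raises the joint probability above the independent product:
beta = 0.3   # assumed common-cause fraction
p_both_cc = p_quake * (beta * min(p_fail_primary, p_fail_backup)
                       + (1.0 - beta) * p_fail_primary * p_fail_backup)
```

    The single shared earthquake is itself a common cause, which is why assessing the two facilities jointly, rather than multiplying two standalone risk numbers, matters.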

  7. Estimating the Probability of Elevated Nitrate Concentrations in Ground Water in Washington State

    USGS Publications Warehouse

    Frans, Lonna M.

    2008-01-01

    Logistic regression was used to relate anthropogenic (manmade) and natural variables to the occurrence of elevated nitrate concentrations in ground water in Washington State. Variables that were analyzed included well depth, ground-water recharge rate, precipitation, population density, fertilizer application amounts, soil characteristics, hydrogeomorphic regions, and land-use types. Two models were developed: one with and one without the hydrogeomorphic regions variable. The variables in both models that best explained the occurrence of elevated nitrate concentrations (defined as concentrations of nitrite plus nitrate as nitrogen greater than 2 milligrams per liter) were the percentage of agricultural land use in a 4-kilometer radius of a well, population density, precipitation, soil drainage class, and well depth. Based on the relations between these variables and measured nitrate concentrations, logistic regression models were developed to estimate the probability of nitrate concentrations in ground water exceeding 2 milligrams per liter. Maps of Washington State were produced that illustrate these estimated probabilities for wells drilled to 145 feet below land surface (median well depth) and the estimated depth to which wells would need to be drilled to have a 90-percent probability of drawing water with a nitrate concentration less than 2 milligrams per liter. Maps showing the estimated probability of elevated nitrate concentrations indicated that the agricultural regions are most at risk followed by urban areas. The estimated depths to which wells would need to be drilled to have a 90-percent probability of obtaining water with nitrate concentrations less than 2 milligrams per liter exceeded 1,000 feet in the agricultural regions; whereas, wells in urban areas generally would need to be drilled to depths in excess of 400 feet.
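    The mechanics of turning such a fitted model into a probability map can be sketched in a few lines. The coefficients below are hypothetical stand-ins chosen only to show the signs the abstract implies (risk rises with agriculture and population, falls with depth); they are not the published Washington State model.

```python
import math

# Hypothetical logistic model for P(nitrate > 2 mg/L) at a well.
def p_elevated_nitrate(pct_agriculture, pop_density, precip_in, well_depth_ft):
    logit = (-2.0
             + 0.03 * pct_agriculture    # % agricultural land within 4 km
             + 0.001 * pop_density       # persons per square kilometre
             - 0.02 * precip_in          # annual precipitation, inches
             - 0.004 * well_depth_ft)    # well depth below land surface, feet
    return 1.0 / (1.0 + math.exp(-logit))

# Median-depth well (145 ft) vs a deep well in the same agricultural setting:
p_shallow = p_elevated_nitrate(80, 100, 20, 145)
p_deep = p_elevated_nitrate(80, 100, 20, 1000)
```

    Inverting the same equation for the depth at which the probability drops below 10% is how the "depth to drill for a 90-percent chance of clean water" maps are produced.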

  8. The conditional risk probability-based seawall height design method

    NASA Astrophysics Data System (ADS)

    Yang, Xing; Hu, Xiaodong; Li, Zhiqing

    2015-11-01

    The determination of the required seawall height is usually based on the combination of wind speed (or wave height) and still water level according to a specified return period, e.g., 50-year return period wind speed and 50-year return period still water level. In reality, the two variables may be only partially correlated. This may lead to over-design (and excess cost) of seawall structures. The above-mentioned return period for the design of a seawall depends on the economy, society and natural environment of the region. This means a specified risk level of overtopping or damage of a seawall structure is usually allowed. The aim of this paper is to present a conditional risk probability-based seawall height design method which incorporates the correlation of the two variables. For purposes of demonstration, wind speeds and water levels collected from Jiangsu, China are analyzed. The results show this method can improve seawall height design accuracy.
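    A Monte Carlo sketch shows why the correlation matters. The marginals, the correlation value, and the Gaussian-copula dependence below are all assumptions for illustration, not the Jiangsu analysis.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
rho = 0.6   # assumed correlation between the two drivers

# Correlated standard normals (Gaussian copula via a Cholesky-style mix):
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)

wind = 20.0 + 5.0 * z1     # annual-maximum wind speed proxy, m/s
level = 3.0 + 0.5 * z2     # still water level proxy, m

# 50-year marginal levels, i.e. the 98th percentile of the annual maxima:
wind_50 = np.quantile(wind, 0.98)
level_50 = np.quantile(level, 0.98)

# Probability that both 50-year levels are exceeded in the same year:
p_joint = float(np.mean((wind > wind_50) & (level > level_50)))
p_indep = 0.02 * 0.02   # what full independence would predict
```

    With partial positive correlation, p_joint lies between the independent product and 0.02, so pairing the 50-year wind with the 50-year level as if they always co-occur overstates the joint hazard and over-sizes the wall.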

  9. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    PubMed

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration exceeding regulatory thresholds. However, these maps are obtained from only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order relations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) there are modelling advantages, because the variograms and cross-variograms are those of real variables, which do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. PMID:26070026
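    Advantage (i) can be seen in miniature: estimates made in centred log-ratio (clr) space always back-transform to class probabilities in (0, 1) that sum to 1. The equal-weight blend below stands in for a cokriging weight set, and the class probabilities are invented.

```python
import numpy as np

# Centred log-ratio transform and its inverse for a composition p.
def clr(p):
    g = np.exp(np.mean(np.log(p)))   # geometric mean of the parts
    return np.log(p / g)

def clr_inv(y):
    e = np.exp(y)
    return e / e.sum()               # closes the result back to sum 1

# Discrete densities over three classes (below / near / above the nitrate
# threshold) at two neighbouring monitoring wells (hypothetical values):
p_a = np.array([0.70, 0.20, 0.10])
p_b = np.array([0.40, 0.30, 0.30])

# Equal-weight linear interpolation in clr space, back-transformed:
p_interp = clr_inv(0.5 * clr(p_a) + 0.5 * clr(p_b))
```

    Any linear combination in clr space back-transforms to a valid probability vector, which is what indicator cokriging cannot guarantee.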

  10. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes

    NASA Astrophysics Data System (ADS)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.

    2014-12-01

    Using a revised statistical model built on past seismic probability models, the probability of different magnitude earthquakes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and includes the use of best-estimate values of the probability distribution of different magnitude earthquakes recurring from a fault, drawn from literature sources. Our study aims to apply this model to the Taipei metropolitan area, with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77% and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake; the chance of one occurring within the next 20 years is estimated to be 3.61%, within 79 years at 13.54% and within 300 years at 42.45%. The 79-year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probabilities of Taiwan residents experiencing heart disease or malignant neoplasm are 11.5% and 29%, respectively. The inference of this study is that the calculated risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as the risk of suffering from a heart attack or other health ailments.
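    The bare Poisson part of such a model is one line: the probability of at least one event in t years on a fault with mean return period T is 1 - exp(-t/T). The study's revised model applies further magnitude-dependent adjustments, so the values below, computed directly with the quoted 543-year return period, are only the unadjusted Poisson figures.

```python
import math

# Homogeneous Poisson occurrence model for a fault with return period T.
def p_at_least_one(t_years, return_period):
    return 1.0 - math.exp(-t_years / return_period)

# 20-, 79- (average lifespan) and 300-year windows with T = 543 years:
p20 = p_at_least_one(20, 543)
p79 = p_at_least_one(79, 543)
p300 = p_at_least_one(300, 543)
```

    The memoryless Poisson assumption means these probabilities do not depend on how long ago the last event occurred, which is a common criticism of this class of model.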

  11. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  12. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations assume that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event with an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  13. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations assume that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event with an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage.

  14. Application of Monte Carlo simulation to estimate probabilities for the best and health conservative estimates of receptor well concentrations

    SciTech Connect

    Not Available

    1989-08-01

    This report presents a Monte Carlo simulation analysis of the fate and transport of contaminants in groundwater at the Lawrence Livermore National Laboratory Livermore Site. The results of this analysis are the cumulative distribution functions (CDFs) of the maximum 70-year average and peak concentrations of the four chemicals of concern (TCE, PCE, chloroform, and "other VOCs") at the near-field and three far-field wells. These concentration CDFs can be used to estimate the probability of occurrence of the concentrations previously predicted using the deterministic model, and to conduct an enhanced exposure and risk assessment for the Remedial Investigation and Feasibility Study (RI/FS). This report provides a description of the deterministic fate and transport model (PLUME), which was linked to the Monte Carlo Shell to estimate the CDFs of the receptor-well chemical concentrations. 6 refs., 21 figs., 12 tabs.
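The workflow this record describes — propagate uncertain inputs through a deterministic model and summarize the outputs as an empirical CDF — can be sketched generically. The toy `transport_model` and its input distributions below are hypothetical placeholders, not the PLUME model:

```python
import bisect
import random

def empirical_cdf(samples):
    """Return a function giving P(X <= x) from Monte Carlo samples."""
    xs = sorted(samples)
    n = len(xs)
    def cdf(x):
        # fraction of samples <= x, via binary search on the sorted list
        return bisect.bisect_right(xs, x) / n
    return cdf

# Hypothetical stand-in for a deterministic fate-and-transport model:
# peak concentration from uncertain source strength and retardation.
def transport_model(source_mg_l, retardation):
    return source_mg_l / retardation

random.seed(0)
draws = [transport_model(random.lognormvariate(0.0, 0.5),
                         random.uniform(1.0, 5.0))
         for _ in range(10_000)]
cdf = empirical_cdf(draws)
print(f"P(peak <= 1.0 mg/L) = {cdf(1.0):.2f}")
```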

  15. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  16. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically of the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at these statistical parameters. In the literature, there are publications that try to fit the histogram of TEC estimates to some well-known pdfs such as the Gaussian, the Exponential, etc. However, constraining a histogram to fit a function with a fixed shape increases estimation error, and all information extracted from such a pdf carries this error; with such techniques it is highly likely that artificial characteristics not present in the original data will appear in the estimated pdf. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC distribution. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained compared with the techniques mentioned above. KDE is particularly good at representing tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values.
The technique is applied to the ionosphere over Turkey where the TEC values are estimated from the GNSS measurement from the TNPGN-Active (Turkish National Permanent
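A Gaussian KDE of the kind described can be sketched in a few lines: the estimated density is the average of a Gaussian bump centred on each sample. The TEC-like values and bandwidth below are illustrative, not TNPGN data (in practice a library routine such as `scipy.stats.gaussian_kde` would be used):

```python
import math

def gaussian_kde(data, bandwidth):
    """Kernel density estimate: one Gaussian kernel per sample,
    averaged and normalized so the result integrates to 1."""
    n = len(data)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    def pdf(x):
        return norm * sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
                          for xi in data)
    return pdf

# Toy TEC-like values (TECU), including one outlier; bandwidth chosen by eye.
tec = [12.1, 13.4, 12.8, 14.0, 13.1, 25.2, 12.5, 13.7]
pdf = gaussian_kde(tec, bandwidth=1.0)
print(pdf(13.0))   # high density near the main cluster
print(pdf(25.0))   # small but nonzero density at the outlier (tail behaviour)
```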

  17. Estimating background and threshold nitrate concentrations using probability graphs

    USGS Publications Warehouse

    Panno, S.V.; Kelly, W.R.; Martinsek, A.T.; Hackley, Keith C.

    2006-01-01

    Because of the ubiquitous nature of anthropogenic nitrate (NO3-) in many parts of the world, determining background concentrations of NO3- in shallow ground water from natural sources is probably impossible in most environments. Present-day background must now include diffuse sources of NO3- such as disruption of soils and oxidation of organic matter, and atmospheric inputs from products of combustion and evaporation of ammonia from fertilizer and livestock waste. Anomalies can be defined as NO3- derived from nitrogen (N) inputs to the environment from anthropogenic activities, including synthetic fertilizers, livestock waste, and septic effluent. Cumulative probability graphs were used to identify threshold concentrations separating background and anomalous NO3-N concentrations and to assist in the determination of sources of N contamination for 232 spring water samples and 200 well water samples from karst aquifers. Thresholds were 0.4, 2.5, and 6.7 mg/L for spring water samples, and 0.1, 2.1, and 17 mg/L for well water samples. The 0.4 and 0.1 mg/L values are assumed to represent thresholds for present-day precipitation. Thresholds at 2.5 and 2.1 mg/L are interpreted to represent present-day background concentrations of NO3-N. The population of spring water samples with concentrations between 2.5 and 6.7 mg/L represents an amalgam of all sources of NO3- in the ground water basins that feed each spring; concentrations >6.7 mg/L were typically samples collected soon after springtime application of synthetic fertilizer. The 17 mg/L threshold (adjusted to 15 mg/L) for well water samples is interpreted as the level above which livestock wastes dominate the N sources. Copyright © 2006 The Author(s).

  18. Estimating the risk of Amazonian forest dieback.

    PubMed

    Rammig, Anja; Jupp, Tim; Thonicke, Kirsten; Tietjen, Britta; Heinke, Jens; Ostberg, Sebastian; Lucht, Wolfgang; Cramer, Wolfgang; Cox, Peter

    2010-08-01

    Climate change will very likely affect most forests in Amazonia during the course of the 21st century, but the direction and intensity of the change are uncertain, in part because of differences in rainfall projections. In order to constrain this uncertainty, we estimate the probability of biomass change in Amazonia on the basis of rainfall projections that are weighted by climate model performance for current conditions. We estimate the risk of forest dieback by using weighted rainfall projections from 24 general circulation models (GCMs) to create probability density functions (PDFs) for future forest biomass changes simulated by a dynamic vegetation model (LPJmL). Our probabilistic assessment of biomass change suggests a likely shift towards increasing biomass compared with nonweighted results. Biomass estimates range between a gain of 6.2 and a loss of 2.7 kg carbon m^-2 for the Amazon region, depending on the strength of CO2 fertilization. The uncertainty associated with the long-term effect of CO2 is much larger than that associated with precipitation change. This underlines the importance of reducing uncertainties in the direct effects of CO2 on tropical ecosystems. PMID:20553387

  19. EMPIRICAL GENERAL POPULATION ASSESSMENT OF THE HORVITZ-THOMPSON ESTIMATOR UNDER VARIABLE PROBABILITY SAMPLING

    EPA Science Inventory

    The variance and two estimators of variance of the Horvitz-Thompson estimator were studied under randomized, variable probability systematic sampling. Three bivariate distributions, representing the populations, were investigated empirically, with each distribution studied for thr...

  20. Robust location and spread measures for nonparametric probability density function estimation.

    PubMed

    López-Rubio, Ezequiel

    2009-10-01

    Robustness against outliers is a desirable property of any unsupervised learning scheme. In particular, probability density estimators benefit from incorporating this feature. A possible strategy to achieve this goal is to substitute the sample mean and the sample covariance matrix by more robust location and spread estimators. Here we use the L1-median to develop a nonparametric probability density function (PDF) estimator. We prove its most relevant properties, and we show its performance in density estimation and classification applications. PMID:19885963

  1. Student Estimates of Probability and Uncertainty in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Mountcastle, Donald B.; Bucy, B. R.; Thompson, J. R.

    2006-12-01

    Equilibrium properties of macroscopic (large N) systems are highly predictable as N approaches and exceeds Avogadro’s number. Theories of statistical physics depend on these results. Typical pedagogical devices used in statistical physics textbooks to introduce entropy (S) and multiplicity [S = k ln(w), where w is the system multiplicity] include flipping coins and/or other equivalent binary events, repeated n times. Prior to instruction, our students usually give reasonable answers about the probabilities, but not the uncertainties of the predicted outcomes of such events. However, they reliably predict that the uncertainty in a measured quantity (e.g., the amount of rainfall) decreases as the number of measurements increases. Typical textbook presentations presume that students will either have or develop the insight that the relative uncertainty of binary outcomes will similarly decrease as the number of events increases. That is at odds with our findings among students in two successive statistical mechanics classes. Many of our students had previously completed mathematics courses in statistics, as well as a physics laboratory course that included analysis of statistical properties of distributions of dart scores as the number (n) of throws (one-dimensional target) increased. There was a wide divergence of predictions about how the standard deviation of the distribution of dart scores should change, or not, as n increases. We find that student predictions about statistics of coin flips, dart scores, and rainfall amounts as functions of n are inconsistent at best. Supported in part by NSF Grant #PHY-0406764.

  2. Estimation of the size of a closed population when capture probabilities vary among animals

    USGS Publications Warehouse

    Burnham, K.P.; Overton, W.S.

    1978-01-01

    A model which allows capture probabilities to vary by individuals is introduced for multiple recapture studies n closed populations. The set of individual capture probabilities is modelled as a random sample from an arbitrary probability distribution over the unit interval. We show that the capture frequencies are a sufficient statistic. A nonparametric estimator of population size is developed based on the generalized jackknife; this estimator is found to be a linear combination of the capture frequencies. Finally, tests of underlying assumptions are presented.
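The estimator described is a linear combination of the capture frequencies; its first-order form is simple enough to sketch directly. This is only the lowest-order version of the generalized jackknife developed in the paper (Burnham & Overton 1978), shown with made-up frequencies:

```python
def jackknife_population_size(capture_frequencies, k):
    """First-order jackknife estimate of closed-population size.
    capture_frequencies[i] = number of animals captured exactly
    i + 1 times over k trapping occasions."""
    s = sum(capture_frequencies)   # distinct animals observed
    f1 = capture_frequencies[0]    # animals captured exactly once
    return s + f1 * (k - 1) / k

# 5 occasions: 30 animals seen once, 12 twice, 5 three times
print(jackknife_population_size([30, 12, 5], k=5))  # 47 + 30 * 4/5 = 71.0
```

Animals captured only once carry the information about how many were missed entirely, which is why the correction term depends on f1 alone.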

  3. Estimating the Probability of Earthquake-Induced Landslides

    NASA Astrophysics Data System (ADS)

    McRae, M. E.; Christman, M. C.; Soller, D. R.; Sutter, J. F.

    2001-12-01

    The development of a regionally applicable, predictive model for earthquake-triggered landslides is needed to improve mitigation decisions at the community level. The distribution of landslides triggered by the 1994 Northridge earthquake in the Oat Mountain and Simi Valley quadrangles of southern California provided an inventory of failures against which to evaluate the significance of a variety of physical variables in probabilistic models of static slope stability. Through a cooperative project, the California Division of Mines and Geology provided 10-meter resolution data on elevation, slope angle, coincidence of bedding plane and topographic slope, distribution of pre-Northridge landslides, internal friction angle and cohesive strength of individual geologic units. Hydrologic factors were not evaluated since failures in the study area were dominated by shallow, disrupted landslides in dry materials. Previous studies indicate that 10-meter digital elevation data is required to properly characterize the short, steep slopes on which many earthquake-induced landslides occur. However, to explore the robustness of the model at different spatial resolutions, models were developed at the 10, 50, and 100-meter resolution using classification and regression tree (CART) analysis and logistic regression techniques. Multiple resampling algorithms were tested for each variable in order to observe how resampling affects the statistical properties of each grid, and how relationships between variables within the model change with increasing resolution. Various transformations of the independent variables were used to see which had the strongest relationship with the probability of failure. These transformations were based on deterministic relationships in the factor of safety equation. Preliminary results were similar for all spatial scales. Topographic variables dominate the predictive capability of the models. 
The distribution of prior landslides and the coincidence of slope

  4. A simulation model for estimating probabilities of defects in welds

    SciTech Connect

    Chapman, O.J.V.; Khaleel, M.A.; Simonen, F.A.

    1996-12-01

    In recent work for the US Nuclear Regulatory Commission in collaboration with Battelle Pacific Northwest National Laboratory, Rolls-Royce and Associates, Ltd., has adapted an existing model for piping welds to address welds in reactor pressure vessels. This paper describes the flaw estimation methodology as it applies to flaws in reactor pressure vessel welds (but not flaws in base metal or flaws associated with the cladding process). Details of the associated computer software (RR-PRODIGAL) are provided. The approach uses expert elicitation and mathematical modeling to simulate the steps in manufacturing a weld and the errors that lead to different types of weld defects. The defects that may initiate in weld beads include center cracks, lack of fusion, slag, pores with tails, and cracks in heat affected zones. Various welding processes are addressed including submerged metal arc welding. The model simulates the effects of both radiographic and dye penetrant surface inspections. Output from the simulation gives occurrence frequencies for defects as a function of both flaw size and flaw location (surface connected and buried flaws). Numerical results are presented to show the effects of submerged metal arc versus manual metal arc weld processes.

  5. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization.

    PubMed

    Quillian, Lincoln; Pager, Devah

    2010-03-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic Expectations (Dominitz and Manski 2002). Using zip code identifiers, we then match these survey data to local area characteristics from the census. The results show that: (1) the risk of criminal victimization is significantly overestimated relative to actual rates of victimization or other negative events; (2) neighborhood racial composition is strongly associated with perceived risk of victimization, whereas actual victimization risk is driven by nonracial neighborhood characteristics; and (3) white respondents appear more strongly affected by racial composition than nonwhites in forming their estimates of risk. We argue these results support a model of stereotype amplification in the formation of risk estimates. Implications for persistent racial inequality are considered. PMID:20686631

  6. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization

    PubMed Central

    QUILLIAN, LINCOLN; PAGER, DEVAH

    2010-01-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic Expectations (Dominitz and Manski 2002). Using zip code identifiers, we then match these survey data to local area characteristics from the census. The results show that: (1) the risk of criminal victimization is significantly overestimated relative to actual rates of victimization or other negative events; (2) neighborhood racial composition is strongly associated with perceived risk of victimization, whereas actual victimization risk is driven by nonracial neighborhood characteristics; and (3) white respondents appear more strongly affected by racial composition than nonwhites in forming their estimates of risk. We argue these results support a model of stereotype amplification in the formation of risk estimates. Implications for persistent racial inequality are considered. PMID:20686631

  7. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and of the economic consequences, i.e. the risk, associated with the occurrence of extreme-magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, event, represents a complex challenge, as it involves the propagation of seismic waves in large volumes of the Earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large number of casualties and huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, calls for the development of new paradigms and methodologies in order to generate better estimates both of the seismic hazard and of its consequences, and, if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), in order to implement technological and economic policies to mitigate and reduce, as much as possible, the mentioned consequences. Here we propose a hybrid modeling approach which uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite difference code run on the ~100,000-core Blue Gene/Q supercomputer of the STFC Daresbury Laboratory, UK, combined with empirical Green's function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of plausible extreme-earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican

  8. Climate-informed flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, T.; Devineni, N.; Lima, C.; Lall, U.

    2013-12-01

    Currently, flood risk assessments are typically tied to a peak flow event that has an associated return period and inundation extent. This method is convenient: based on a historical record of annual maximum flows, a return period can be calculated with some assumptions about the probability distribution and stationarity. It is also problematic in its stationarity assumption, reliance on relatively short records, and treating flooding as a random event disconnected from large-scale climate processes. Recognizing these limitations, we have developed a new approach to flood risk assessment that connects climate variability, precipitation dynamics, and flood modeling to estimate the likelihood of flooding. To provide more robust, long time series of precipitation, we used stochastic weather generator models to simulate the rainfall fields. The method uses a k-nearest neighbor resampling algorithm in conjunction with a non-parametric empirical copulas based simulation strategy to reproduce the temporal and spatial dynamics, respectively. Climate patterns inform the likelihood of heavy rainfall in the model. For example, ENSO affects the likelihood of wet or dry years in Australia, and this is incorporated in the model. The stochastic simulations are then used to drive a cascade of models to predict flood inundation. Runoff is generated by the Variable Infiltration Capacity (VIC) model, fed into a full kinematic wave routing model at high resolution, and the kinematic wave is used as a boundary condition to predict flood inundation using a coupled storage cell model. Combining the strengths of a stochastic model for rainfall and a physical model for flood prediction allows us to overcome the limitations of traditional flood risk assessment and provide robust estimates of flood risk.

  9. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    NASA Astrophysics Data System (ADS)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and make various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. 

  10. An Illustration of Inverse Probability Weighting to Estimate Policy-Relevant Causal Effects.

    PubMed

    Edwards, Jessie K; Cole, Stephen R; Lesko, Catherine R; Mathews, W Christopher; Moore, Richard D; Mugavero, Michael J; Westreich, Daniel

    2016-08-15

    Traditional epidemiologic approaches allow us to compare counterfactual outcomes under 2 exposure distributions, usually 100% exposed and 100% unexposed. However, to estimate the population health effect of a proposed intervention, one may wish to compare factual outcomes under the observed exposure distribution to counterfactual outcomes under the exposure distribution produced by an intervention. Here, we used inverse probability weights to compare the 5-year mortality risk under observed antiretroviral therapy treatment plans to the 5-year mortality risk that would have been observed under an intervention in which all patients initiated therapy immediately upon entry into care among patients positive for human immunodeficiency virus in the US Centers for AIDS Research Network of Integrated Clinical Systems multisite cohort study between 1998 and 2013. Therapy-naïve patients (n = 14,700) were followed from entry into care until death, loss to follow-up, or censoring at 5 years or on December 31, 2013. The 5-year cumulative incidence of mortality was 11.65% under observed treatment plans and 10.10% under the intervention, yielding a risk difference of -1.57% (95% confidence interval: -3.08, -0.06). Comparing outcomes under the intervention with outcomes under observed treatment plans provides meaningful information about the potential consequences of new US guidelines to treat all patients with human immunodeficiency virus regardless of CD4 cell count under actual clinical conditions. PMID:27469514
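The weighting idea can be sketched with toy numbers: each subject who followed the intervention is weighted by the inverse of their estimated probability of doing so, reconstructing the risk that would be seen if everyone had initiated therapy immediately. The cohort and probabilities below are entirely hypothetical, not the CNICS data:

```python
def ipw_risk(records):
    """Inverse-probability-weighted (Hajek-style) estimate of outcome risk
    under an 'everyone treated' intervention.  Each record is a tuple
    (outcome 0/1, followed_intervention 0/1, prob_of_following)."""
    num = sum(y / p for y, a, p in records if a == 1)
    den = sum(1 / p for y, a, p in records if a == 1)
    return num / den

# Toy cohort: (death within 5 yr, immediate therapy, P(immediate | covariates))
cohort = [
    (0, 1, 0.8), (1, 1, 0.5), (0, 1, 0.5), (0, 0, 0.3),
    (1, 1, 0.4), (0, 1, 0.9), (1, 0, 0.2), (0, 1, 0.6),
]
print(f"weighted 5-year risk: {ipw_risk(cohort):.3f}")
```

Subjects with a low probability of following the intervention get large weights, since each of them stands in for the similar subjects who did not follow it.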

  11. Small-Area Estimation of the Probability of Toxocariasis in New York City Based on Sociodemographic Neighborhood Composition

    PubMed Central

    Walsh, Michael G.; Haseeb, M. A.

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City. PMID:24918785

  12. Estimating site occupancy and species detection probability parameters for terrestrial salamanders

    USGS Publications Warehouse

    Bailey, L.L.; Simons, T.R.; Pollock, K.H.

    2004-01-01

    Recent, worldwide amphibian declines have highlighted a need for more extensive and rigorous monitoring programs to document species occurrence and detect population change. Abundance estimation methods, such as mark-recapture, are often expensive and impractical for large-scale or long-term amphibian monitoring. We apply a new method to estimate proportion of area occupied using detection/nondetection data from a terrestrial salamander system in Great Smoky Mountains National Park. Estimated species-specific detection probabilities were all <1 and varied among seven species and four sampling methods. Time (i.e., sampling occasion) and four large-scale habitat characteristics (previous disturbance history, vegetation type, elevation, and stream presence) were important covariates in estimates of both proportion of area occupied and detection probability. All sampling methods were consistent in their ability to identify important covariates for each salamander species. We believe proportion of area occupied represents a useful state variable for large-scale monitoring programs. However, our results emphasize the importance of estimating detection and occupancy probabilities rather than using the unadjusted proportion of sites at which a species is observed, in which actual occupancy is confounded with detection probability. Estimated detection probabilities accommodate variations in sampling effort; thus, comparisons of occupancy probabilities are possible among studies with different sampling protocols.
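The confounding of occupancy and detection noted above can be shown with a small hedged example (all numbers invented): with per-visit detection probability p and K visits, an occupied site yields at least one detection with probability 1-(1-p)^K, so the raw proportion of sites with detections understates occupancy.

```python
# Hedged illustration, not the paper's likelihood-based estimator.

def adjusted_occupancy(naive_occupancy, p, K):
    """Correct the raw proportion of sites with detections for imperfect
    detection, assuming a constant per-visit detection probability p."""
    p_star = 1 - (1 - p) ** K   # P(detect species at least once | occupied)
    return naive_occupancy / p_star

naive = 0.30   # species detected at 30% of surveyed sites (invented)
p, K = 0.4, 3  # per-visit detection probability and number of visits (invented)
psi = adjusted_occupancy(naive, p, K)   # estimated true occupancy
```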

  13. Children's Ability to Make Probability Estimates: Skills Revealed through Application of Anderson's Functional Measurement Methodology.

    ERIC Educational Resources Information Center

    Acredolo, Curt; And Others

    1989-01-01

    Two studies assessed 90 elementary school students' attention to the total number of alternative and target outcomes when making probability estimates. All age groups attended to variations in the denominator and numerator and the interaction between these variables. (RJC)

  14. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations

    SciTech Connect

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use.

  15. How to estimate the Value at Risk under incomplete information

    NASA Astrophysics Data System (ADS)

    de Schepper, Ann; Heijnen, Bart

    2010-03-01

    A key problem in financial and actuarial research, and particularly in the field of risk management, is the choice of models so as to avoid systematic biases in the measurement of risk. An alternative to committing to a fully specified model consists of relaxing the assumption that the probability distribution is completely known, leading to interval estimates instead of point estimates. In the present contribution, we show how this is possible for the Value at Risk, by fixing only a small number of parameters of the underlying probability distribution. We start by deriving bounds on tail probabilities, and we show how a conversion leads to bounds for the Value at Risk. It will turn out that with a maximum of three given parameters, the best estimates are always realized in the case of a unimodal random variable for which two moments and the mode are given. It will also be shown that a lognormal model results in estimates for the Value at Risk that are much closer to the upper bound than to the lower bound.
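For contrast with the paper's moment-based bounds, the following sketch computes the kind of fully parametric (lognormal) Value-at-Risk point estimate the authors compare against. The parameters mu and sigma and the 99% confidence level are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: parametric VaR under an assumed lognormal loss distribution.
import math
from statistics import NormalDist

def lognormal_var(mu, sigma, alpha):
    """alpha-quantile of a lognormal loss: VaR_alpha = exp(mu + sigma * z_alpha),
    where z_alpha is the standard normal quantile."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(alpha))

var99 = lognormal_var(mu=0.0, sigma=1.0, alpha=0.99)  # invented parameters
```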

  16. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure than discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, and focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability, rather than as a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. PMID:26302336
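A minimal sketch of the probabilistic exposure characterization the review describes, assuming (hypothetically) a lognormal exposure distribution and an invented occupational exposure limit; it estimates the exceedance probability by Monte Carlo rather than reporting a single point estimate.

```python
# Hedged sketch: exposure as a distribution, not a point estimate.
# All parameters (geometric mean, geometric SD, exposure limit) are invented.
import math
import random

random.seed(1)
gm, gsd = 0.05, 2.5   # geometric mean (mg/m3) and geometric standard deviation
oel = 0.1             # hypothetical occupational exposure limit (mg/m3)

draws = [random.lognormvariate(math.log(gm), math.log(gsd)) for _ in range(100_000)]
exceedance = sum(x > oel for x in draws) / len(draws)   # P(exposure > OEL)
```

Rather than asking "is the point estimate below the limit?", the exceedance probability expresses the risk characterization as a probability, in the spirit of the review.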

  17. Methodology for estimating extreme winds for probabilistic risk assessments

    SciTech Connect

    Ramsdell, J.V.; Elliott, D.L.; Holladay, C.G.; Hubbe, J.M.

    1986-10-01

    The US Nuclear Regulatory Commission (NRC) assesses the risks associated with nuclear facilities using techniques that fall under the generic name of Probabilistic Risk Assessment. In these assessments, potential accident sequences are traced from initiating event to final outcome. At each step of the sequence, a probability of occurrence is assigned to each available alternative. Ultimately, the probability of occurrence of each possible outcome is determined from the probabilities assigned to the initiating events and the alternative paths. Extreme winds are considered in these sequences. As a result, it is necessary to estimate extreme wind probabilities as low as 10^-7 per year. When the NRC staff is called on to provide extreme wind estimates, the staff is likely to be subjected to external time and funding constraints. These constraints dictate that the estimates be based on readily available wind data. In general, readily available data will be limited to the data provided by the facility applicant or licensee and the data archived at the National Climatic Data Center in Asheville, North Carolina. This report describes readily available data that can be used in estimating extreme wind probabilities, procedures for screening the data to eliminate erroneous values and for adjusting data to compensate for differences in data collection methods, and statistical methods for making extreme wind estimates. Supporting technical details are presented in several appendices. Estimation of extreme wind probabilities at a given location involves many subjective decisions. The procedures described do not eliminate all of the subjectivity, but they do increase the reproducibility of the analysis. They provide consistent methods for determining probabilities given a set of subjective decisions. By following these procedures, subjective decisions can be identified and documented.

  18. Estimating functions of probability distributions from a finite set of samples

    SciTech Connect

    Wolpert, D.H.; Wolf, D.R.

    1995-12-01

    This paper addresses the problem of estimating a function of a probability distribution from a finite set of samples of that distribution. A Bayesian analysis of this problem is presented, the optimal properties of the Bayes estimators are discussed, and as an example of the formalism, closed form expressions for the Bayes estimators for the moments of the Shannon entropy function are derived. Then numerical results are presented that compare the Bayes estimator to the frequency-counts estimator for the Shannon entropy. We also present the closed form estimators, all derived elsewhere, for the mutual information, χ² covariance, and some other statistics. (c) 1995 The American Physical Society
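For a concrete sense of the frequency-counts estimator discussed above, here is a hedged sketch of the plug-in Shannon entropy, together with the Miller-Madow bias correction as a simple (non-Bayesian) alternative. This is not the paper's Bayes estimator, only the baseline it is compared against plus a standard correction.

```python
# Hedged sketch: frequency-counts (plug-in) entropy and a common bias correction.
import math
from collections import Counter

def plugin_entropy(samples):
    """Frequency-counts (plug-in) Shannon entropy in nats; biased low for small n."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller-Madow bias correction (K - 1) / (2n),
    where K is the number of observed symbols."""
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * len(samples))

h_plugin = plugin_entropy("aabb")        # exact: ln 2
h_corrected = miller_madow_entropy("aabb")
```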

  19. A double-observer approach for estimating detection probability and abundance from point counts

    USGS Publications Warehouse

    Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.

    2000-01-01

    Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.
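The sketch below illustrates the double-observer logic with the simpler independent-observer (Lincoln-Petersen) formulas; the paper's design is dependent (primary/secondary roles), which requires a slightly different likelihood, and all counts here are invented.

```python
# Hedged sketch: independent double-observer estimator, not the paper's
# dependent primary/secondary model fit in program SURVIV.

def double_observer(n1_only, n2_only, both):
    """Detection probabilities and abundance from counts of birds seen only by
    observer 1, only by observer 2, and by both (Lincoln-Petersen logic)."""
    p1 = both / (both + n2_only)      # obs 1's detection rate among obs 2's birds
    p2 = both / (both + n1_only)
    p_any = 1 - (1 - p1) * (1 - p2)   # detected by at least one observer
    seen = n1_only + n2_only + both
    return p1, p2, p_any, seen / p_any  # abundance estimate

p1, p2, p_any, n_hat = double_observer(n1_only=8, n2_only=4, both=36)
```

With these invented counts, the combined detection probability p_any is high, mirroring the >0.95 overall detection probabilities reported in the abstract.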

  20. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    NASA Astrophysics Data System (ADS)

    Frigm, R.; Johnson, L.

    The Probability of Collision (Pc) has become a universal metric and statement of on-orbit collision risk. Although several flavors of the computation exist and are well-documented in the literature, the basic calculation requires the same input: estimates for the position, position uncertainty, and sizes of the two objects involved. The Pc is used operationally to make decisions on whether a given conjunction poses significant collision risk to the primary object (or space asset of concern). It is also used to determine necessity and degree of mitigative action (typically in the form of an orbital maneuver) to be performed. The predicted post-maneuver Pc also informs the maneuver planning process regarding the timing, direction, and magnitude of the maneuver needed to mitigate the collision risk. Although the data sources, techniques, decision calculus, and workflows vary for different agencies and organizations, they all have a common thread. The standard conjunction assessment and collision risk concept of operations (CONOPS) predicts conjunctions, assesses the collision risk (typically, via the Pc), and plans and executes avoidance activities for conjunctions as discrete events. As the space debris environment continues to increase and improvements are made to remote sensing capabilities and sensitivities to detect, track, and predict smaller debris objects, the number of conjunctions will in turn continue to increase. The expected order-of-magnitude increase in the number of predicted conjunctions will challenge the paradigm of treating each conjunction as a discrete event. The challenge will not be limited to workload issues, such as manpower and computing performance, but also the ability for satellite owner/operators to successfully execute their mission while also managing on-orbit collision risk.
Executing a propulsive maneuver occasionally can easily be absorbed into the mission planning and operations tempo; whereas, continuously planning evasive

  1. Generalizations and Extensions of the Probability of Superiority Effect Size Estimator

    ERIC Educational Resources Information Center

    Ruscio, John; Gera, Benjamin Lee

    2013-01-01

    Researchers are strongly encouraged to accompany the results of statistical tests with appropriate estimates of effect size. For 2-group comparisons, a probability-based effect size estimator ("A") has many appealing properties (e.g., it is easy to understand, robust to violations of parametric assumptions, insensitive to outliers). We review…

  2. Improving quality of sample entropy estimation for continuous distribution probability functions

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, Janusz

    2016-05-01

    Entropy is one of the key parameters characterizing the state of a system in statistical physics. Although entropy is defined for systems described by both discrete and continuous probability distribution functions (PDFs), in numerous applications the sample entropy is estimated from a histogram, which in effect represents the continuous PDF by a set of probabilities. Such a procedure may lead to ambiguities and even misinterpretation of the results. Within this paper, two possible general algorithms based on continuous PDF estimation are discussed in application to the Shannon and Tsallis entropies. It is shown that the proposed algorithms may improve entropy estimation, particularly in the case of small data sets.

  3. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently such as Winter Wren and Acadian Flycatcher had high detection probabilities (about 90%) and species that call infrequently such as Pileated Woodpecker had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
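A hedged sketch of a removal-type estimator: assuming a constant per-interval detection probability p and equal-length intervals (a simplification of the paper's 2-, 5-, and 10-min intervals), the MLE can be found by a simple grid search over the conditional multinomial likelihood. The counts are invented.

```python
# Hedged sketch, not the paper's exact estimator: removal model with
# constant per-interval detection probability and equal intervals.
import math

def removal_mle(counts, grid=10_000):
    """Grid-search MLE of p, where P(first detected in interval j | detected)
    is proportional to p * (1 - p)^(j - 1)."""
    J = len(counts)
    def loglik(p):
        cell = [p * (1 - p) ** j for j in range(J)]   # first detection in j+1
        seen = sum(cell)                              # detected within J intervals
        return sum(c * math.log(cell[j] / seen) for j, c in enumerate(counts))
    return max(range(1, grid), key=lambda i: loglik(i / grid)) / grid

p_hat = removal_mle([60, 25, 15])     # invented first-detection counts
p_overall = 1 - (1 - p_hat) ** 3      # detected at least once during the count
```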

  4. Estimating transition probabilities for stage-based population projection matrices using capture-recapture data

    USGS Publications Warehouse

    Nichols, J.D.; Sauer, J.R.; Pollock, K.H.; Hestbeck, J.B.

    1992-01-01

    In stage-based demography, animals are often categorized into size (or mass) classes, and size-based probabilities of surviving and changing mass classes must be estimated before demographic analyses can be conducted. In this paper, we develop two procedures for the estimation of mass transition probabilities from capture-recapture data. The first approach uses a multistate capture-recapture model that is parameterized directly with the transition probabilities of interest. Maximum likelihood estimates are then obtained numerically using program SURVIV. The second approach involves a modification of Pollock's robust design. Estimation proceeds by conditioning on animals caught in a particular class at time i, and then using closed models to estimate the number of these that are alive in other classes at i + 1. Both methods are illustrated by application to meadow vole, Microtus pennsylvanicus, capture-recapture data. The two methods produced reasonable estimates that were similar. Advantages of these two approaches include the directness of estimation, the absence of need for restrictive assumptions about the independence of survival and growth, the testability of assumptions, and the testability of related hypotheses of ecological interest (e.g., the hypothesis of temporal variation in transition probabilities).
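The directness of the estimation can be illustrated with a hedged toy version: if detection were perfect, the MLE of each transition probability is just the observed fraction of animals moving from class i to class j. The multistate models in the paper exist precisely to relax that perfect-detection assumption. The data below are invented.

```python
# Hedged sketch: transition probabilities from fully observed moves,
# ignoring imperfect detection (which the paper's models handle).

def transition_matrix(moves, states):
    """MLE of stage-transition probabilities from observed (i, j) moves:
    psi[i][j] = n_ij / n_i."""
    counts = {s: {t: 0 for t in states} for s in states}
    for i, j in moves:
        counts[i][j] += 1
    return {s: {t: counts[s][t] / max(sum(counts[s].values()), 1) for t in states}
            for s in states}

# invented mass-class transitions between two capture occasions
moves = [("small", "small"), ("small", "large"), ("small", "large"),
         ("large", "large"), ("large", "small")]
psi = transition_matrix(moves, ["small", "large"])
```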

  5. Estimate of tephra accumulation probabilities for the U.S. Department of Energy's Hanford Site, Washington

    USGS Publications Warehouse

    Hoblitt, Richard P.; Scott, William E.

    2011-01-01

    In response to a request from the U.S. Department of Energy, we estimate the thickness of tephra accumulation that has an annual probability of 1 in 10,000 of being equaled or exceeded at the Hanford Site in south-central Washington State, where a project to build the Tank Waste Treatment and Immobilization Plant is underway. We follow the methodology of a 1987 probabilistic assessment of tephra accumulation in the Pacific Northwest. For a given thickness of tephra, we calculate the product of three probabilities: (1) the annual probability of an eruption producing 0.1 km3 (bulk volume) or more of tephra, (2) the probability that the wind will be blowing toward the Hanford Site, and (3) the probability that tephra accumulations will equal or exceed the given thickness at a given distance. Mount St. Helens, which lies about 200 km upwind from the Hanford Site, has been the most prolific source of tephra fallout among Cascade volcanoes in the recent geologic past and its annual eruption probability based on this record (0.008) dominates assessment of future tephra falls at the site. The probability that the prevailing wind blows toward Hanford from Mount St. Helens is 0.180. We estimate exceedance probabilities of various thicknesses of tephra fallout from an analysis of 14 eruptions of the size expectable from Mount St. Helens and for which we have measurements of tephra fallout at 200 km. The result is that the estimated thickness of tephra accumulation that has an annual probability of 1 in 10,000 of being equaled or exceeded is about 10 centimeters. It is likely that this thickness is a maximum estimate because we used conservative estimates of eruption and wind probabilities and because the 14 deposits we used probably provide an overestimate. The use of deposits in this analysis that were mostly compacted by the time they were studied and measured implies that the bulk density of the tephra fallout we consider here is in the range of 1,000-1,250 kg/m3.
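The report's three-factor product can be reproduced arithmetically. The eruption probability (0.008) and wind probability (0.180) are taken from the abstract; the thickness-exceedance probability below is a made-up placeholder chosen only to show how a roughly 1-in-10,000 annual probability arises.

```python
# Hedged arithmetic sketch of the report's three-probability product.
p_eruption = 0.008   # annual P(eruption of >= 0.1 km3 bulk volume), from abstract
p_wind     = 0.180   # P(wind blowing toward the Hanford Site), from abstract
p_exceed   = 0.07    # hypothetical P(thickness >= 10 cm at 200 km | eruption)

annual_probability = p_eruption * p_wind * p_exceed   # on the order of 1 in 10,000
```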

  6. Using Bayesian networks to estimate the probability of "NATECH" scenario occurrence

    NASA Astrophysics Data System (ADS)

    Dobes, Pavel; Dlabka, Jakub; Jelšovská, Katarína; Polorecká, Mária; Baudišová, Barbora; Danihelka, Pavel

    2015-04-01

    In the twentieth century, Bayesian statistics and probability saw little use in the analysis and management of natural and industrial risk, including the analysis of so-called NATECH accidents (chemical accidents triggered by natural events such as earthquakes, floods, or lightning; ref. E. Krausmann, 2011, doi:10.5194/nhess-11-921-2011). From the beginning, the dominant approach was the "classical" frequentist notion of probability (ref. Neyman, 1937), which relies on the outcomes of experiments and monitoring and does not accommodate expert beliefs, expectations, and judgments, which are, on the other hand, one of the well-known pillars of the Bayesian approach to probability. Over the last 20 to 30 years, however, publications and conferences have witnessed a renaissance of Bayesian statistics in many scientific disciplines, including various branches of the geosciences. Is a certain level of trust in expert judgment returning to risk analysis? After several decades of development in this field, we propose the following hypothesis (to be checked): the probabilities of complex crisis situations and their TOP events (many NATECH events can be classified as crisis situations or emergencies) cannot be estimated by the classical frequentist approach alone, but require a Bayesian approach as well (i.e., a pre-staged Bayesian network incorporating expert belief and expectation alongside classical frequentist inputs). The reasons are that monitoring of historical emergencies does not always yield enough quantitative information, that several dependent or independent variables may need to be considered, and that, in general, every emergency unfolds somewhat differently. On this topic, the team of authors presents its proposal of a pre-staged, typified Bayesian network model for a specified NATECH scenario

  7. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2002-01-01

    Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (~90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.

  8. Estimating probabilities of reservoir storage for the upper Delaware River basin

    USGS Publications Warehouse

    Hirsch, Robert M.

    1981-01-01

    A technique for estimating conditional probabilities of reservoir system storage is described and applied to the upper Delaware River Basin. The results indicate that there is a 73 percent probability that the three major New York City reservoirs (Pepacton, Cannonsville, and Neversink) would be full by June 1, 1981, and only a 9 percent probability that storage would return to the 'drought warning' sector of the operations curve sometime in the next year. In contrast, if restrictions are lifted and there is an immediate return to normal operating policies, the probability of the reservoir system being full by June 1 is 37 percent and the probability that storage would return to the 'drought warning' sector in the next year is 30 percent. (USGS)

  9. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B.

    1991-10-01

    This lecture will provide a bridge from the physical energy or LET spectra as might be calculated in an organ to the risk of carcinogenesis, a particular concern for extended missions to the moon or beyond to Mars. Topics covered will include (1) LET spectra expected from galactic cosmic rays, (2) probabilities that individual cell nuclei in the body will be hit by heavy galactic cosmic ray particles, (3) the conventional methods of calculating risks from a mixed environment of high and low LET radiation, (4) an alternate method which provides certain advantages using fluence-related risk coefficients (risk cross sections), and (5) directions for future research and development of these ideas.

  10. Estimate of the probability of a lightning strike to the Galileo probe

    NASA Technical Reports Server (NTRS)

    Borucki, W. J.

    1985-01-01

    Lightning strikes to aerospace vehicles occur mainly in or near clouds. As the Galileo entry probe will pass most of its operational life in the clouds of Jupiter, which is known to have lightning activity, the present study is concerned with the risk of a lightning strike to the probe. A strike to the probe could cause physical damage to the structure and/or damage to the electronic equipment aboard the probe. It is thought to be possible, for instance, that the instrument failures which occurred on all four Pioneer Venus entry probes at an altitude of 12 km were due to an external electric discharge. The probability of a lightning strike to the Galileo probe is evaluated. It is found that the estimate of a strike to the probe is only 0.001, which is about the same as the expected failure rate due to other design factors. In the case of entry probes to cloud-covered planets, a consideration of measures for protecting the vehicle and its payload from lightning appears to be appropriate.

  12. Estimation of Hail Risk in the UK and Europe

    NASA Astrophysics Data System (ADS)

    Robinson, Eric; Parker, Melanie; Higgs, Stephanie

    2016-04-01

    Observations of hail events in Europe, and in the UK especially, are relatively limited. In order to determine hail risk it is therefore necessary to use information beyond the historical record alone. One such methodology is to leverage reanalysis data, in this case ERA-Interim, along with a numerical model (WRF) to recreate the past state of the atmosphere. Relevant atmospheric properties can be extracted and used in a regression model to determine hail probability for each day contained within the reanalyses. The results presented here are based on a regression model using convective available potential energy, deep-layer shear, and weather type. Combined, these parameters represent the probability of severe thunderstorm, and in turn hail, activity. Once the probability of hail occurring on each day is determined, it can be used as the basis of a stochastic catalogue for estimating hail risk.
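A hedged sketch of the kind of regression step described above: a logistic model mapping daily reanalysis predictors to a hail probability. The coefficients and predictor values are invented for illustration, not the authors' fitted model.

```python
# Hedged sketch: logistic regression for daily hail probability.
# All coefficients and inputs are invented.
import math

def hail_probability(cape, shear, weather_type_effect, coef):
    """Daily hail probability from CAPE (J/kg), deep-layer shear (m/s), and a
    weather-type term, via a logistic link."""
    z = (coef["intercept"] + coef["cape"] * cape
         + coef["shear"] * shear + weather_type_effect)
    return 1 / (1 + math.exp(-z))

coef = {"intercept": -6.0, "cape": 0.002, "shear": 0.08}   # hypothetical fit
p_hail = hail_probability(cape=1500.0, shear=20.0, weather_type_effect=0.5, coef=coef)
```

Daily probabilities of this kind could then seed a stochastic event catalogue, as the abstract describes.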

  13. Realistic Probability Estimates For Destructive Overpressure Events In Heated Center Wing Tanks Of Commercial Jet Aircraft

    SciTech Connect

    Alvares, N; Lambert, H

    2007-02-07

    The Federal Aviation Administration (FAA) identified 17 accidents that may have resulted from fuel tank explosions on commercial aircraft from 1959 to 2001. Seven events involved JP 4 or JP 4/Jet A mixtures that are no longer used for commercial aircraft fuel. The remaining 10 events involved Jet A or Jet A1 fuels that are in current use by the commercial aircraft industry. Four fuel tank explosions occurred in center wing tanks (CWTs) where on-board appliances can potentially transfer heat to the tank. These tanks are designated as "Heated Center Wing Tanks" (HCWTs). Since 1996, the FAA has significantly increased the rate at which it has mandated airworthiness directives (ADs) directed at elimination of ignition sources. This effort includes the adoption, in 2001, of Special Federal Aviation Regulation 88 of 14 CFR part 21 (SFAR 88, "Fuel Tank System Fault Tolerance Evaluation Requirements"). This paper addresses SFAR 88 effectiveness in reducing HCWT ignition source probability. Our statistical analysis, relating the occurrence of both on-ground and in-flight HCWT explosions to the cumulative flight hours of commercial passenger aircraft containing HCWTs, reveals that the best estimate of HCWT explosion rate is 1 explosion in 1.4 x 10^8 flight hours. Based on an analysis of SFAR 88 by Sandia National Laboratories and our independent analysis, SFAR 88 reduces current risk of historical HCWT explosion by at least a factor of 10, thus meeting an FAA risk criterion of 1 accident in a billion flight hours. This paper also surveys and analyzes parameters for Jet A fuel ignition in HCWTs. Because of the paucity of in-flight HCWT explosions, we conclude that the intersection of the parameters necessary and sufficient to result in an HCWT explosion with sufficient overpressure to rupture the HCWT is extremely rare.
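The abstract's headline numbers can be checked with simple rate arithmetic: a best-estimate rate of 1 explosion per 1.4 x 10^8 flight hours, an SFAR 88 reduction factor of at least 10, and an FAA criterion of 1 accident per 10^9 flight hours.

```python
# Rate arithmetic using the figures quoted in the abstract.
historical_rate = 1 / 1.4e8               # explosions per flight hour (best estimate)
post_sfar88_rate = historical_rate / 10   # SFAR 88: at least a 10x reduction
faa_criterion = 1 / 1e9                   # 1 accident per billion flight hours

meets_criterion = post_sfar88_rate <= faa_criterion
```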

  14. Characterizing fault-plume intersection probability for geologic carbon sequestration risk assessment

    SciTech Connect

    Jordan, Preston D.; Oldenburg, Curtis M.; Nicot, Jean-Philippe

    2008-11-01

    Leakage of CO{sub 2} out of the designated storage region via faults is a widely recognized concern for geologic carbon sequestration. The probability of such leakage can be separated into the probability of a plume encountering a fault and the probability of flow along such a fault. In the absence of deterministic fault location information, the first probability can be calculated from regional fault population statistics and modeling of the plume shape and size. In this study, fault statistical parameters were measured or estimated for WESTCARB's Phase III pilot test injection in the San Joaquin Valley, California. Combining CO{sub 2} plume model predictions with estimated fault characteristics resulted in a 3% probability that the CO{sub 2} plume will encounter a fault fully offsetting the 180 m (590 ft) thick seal. The probability of leakage is lower, likely much lower, as faults with this offset are probably low-permeability features in this area.
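    The first factor, the plume-fault encounter probability, can be sketched with a simple spatial Poisson assumption. The fault density and plume area below are hypothetical placeholders chosen to land near the reported 3%, not values from the WESTCARB study, which used measured fault population statistics and modeled plume geometry.

```python
import math

def fault_encounter_probability(fault_density_per_km2, plume_area_km2):
    """P(plume footprint meets at least one fully offsetting fault),
    treating qualifying faults as a 2-D Poisson point process. This is
    a deliberate simplification of the study's combination of fault
    population statistics with modeled plume shape and size."""
    expected_hits = fault_density_per_km2 * plume_area_km2
    return 1.0 - math.exp(-expected_hits)

# Hypothetical inputs chosen to land near the reported 3% probability:
p = fault_encounter_probability(fault_density_per_km2=0.01, plume_area_km2=3.0)
print(round(p, 3))  # ~0.03
```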

  15. Differential Survival in Europe and the United States: Estimates Based on Subjective Probabilities of Survival

    PubMed Central

    Delavande, Adeline; Rohwedder, Susann

    2013-01-01

    Cross-country comparisons of differential survival by socioeconomic status (SES) are useful in many domains. Yet, to date, such studies have been rare. Reliably estimating differential survival in a single country has been challenging because it requires rich panel data with a large sample size. Cross-country estimates have proven even more difficult because the measures of SES need to be comparable internationally. We present an alternative method for acquiring information on differential survival by SES. Rather than using observations of actual survival, we relate individuals’ subjective probabilities of survival to SES variables in cross section. To show that subjective survival probabilities are informative proxies for actual survival when estimating differential survival, we compare estimates of differential survival based on actual survival with estimates based on subjective probabilities of survival for the same sample. The results are remarkably similar. We then use this approach to compare differential survival by SES for 10 European countries and the United States. Wealthier people have higher survival probabilities than those who are less wealthy, but the strength of the association differs across countries. Nations with a smaller gradient appear to be Belgium, France, and Italy, while the United States, England, and Sweden appear to have a larger gradient. PMID:22042664

  16. Estimating the posterior probability that genome-wide association findings are true or false

    PubMed Central

    Bukszár, József; McClay, Joseph L.; van den Oord, Edwin J. C. G.

    2009-01-01

    Motivation: A limitation of current methods used to declare significance in genome-wide association studies (GWAS) is that they do not provide clear information about the probability that GWAS findings are true or false. This lack of information increases the chance of false discoveries and may result in real effects being missed. Results: We propose a method to estimate the posterior probability that a marker has (no) effect given its test statistic value, also called the local false discovery rate (FDR), in the GWAS. A critical step involves the estimation of the parameters of the distribution of the true alternative tests. For this, we derived and implemented the real maximum likelihood function, which turned out to provide us with significantly more accurate estimates than the widely used mixture model likelihood. Actual GWAS data are used to illustrate properties of the posterior probability estimates empirically. In addition to evaluating individual markers, a variety of applications are conceivable. For instance, posterior probability estimates can be used to control the FDR more precisely than the Benjamini–Hochberg procedure. Availability: The codes are freely downloadable from the web site http://www.people.vcu.edu/∼jbukszar. Contact: jbukszar@vcu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19420056
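    The local FDR the authors estimate has a simple closed form in the two-group mixture model. The sketch below assumes the mixture components and the null proportion pi0 are known, whereas the hard step the paper addresses is estimating them by maximum likelihood; the alternative distribution and pi0 here are hypothetical.

```python
from statistics import NormalDist

null = NormalDist(0, 1)  # test statistics under "no effect"
alt = NormalDist(3, 1)   # hypothetical distribution of true effects
pi0 = 0.9                # assumed proportion of null markers

def local_fdr(z):
    """lfdr(z) = pi0 * f0(z) / f(z) in the two-group mixture model:
    the posterior probability of "no effect" given test statistic z."""
    f0 = null.pdf(z)
    f = pi0 * f0 + (1 - pi0) * alt.pdf(z)
    return pi0 * f0 / f

print(round(local_fdr(0.0), 3))  # near 1: almost surely a null marker
print(round(local_fdr(4.0), 3))  # near 0: almost surely a true effect
```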

  17. Communicating Environmental Risks: Clarifying the Severity Effect in Interpretations of Verbal Probability Expressions

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam

    2011-01-01

    Verbal probability expressions are frequently used to communicate risk and uncertainty. The Intergovernmental Panel on Climate Change (IPCC), for example, uses them to convey risks associated with climate change. Given the potential for human action to mitigate future environmental risks, it is important to understand how people respond to these…

  18. Estimated probability of arsenic in groundwater from bedrock aquifers in New Hampshire, 2011

    USGS Publications Warehouse

    Ayotte, Joseph D.; Cahillane, Matthew; Hayes, Laura; Robinson, Keith W.

    2012-01-01

    Probabilities of arsenic occurrence in groundwater from bedrock aquifers at concentrations of 1, 5, and 10 micrograms per liter (µg/L) were estimated during 2011 using multivariate logistic regression. These estimates were developed for use by the New Hampshire Environmental Public Health Tracking Program. About 39 percent of New Hampshire bedrock groundwater was identified as having at least a 50 percent chance of containing an arsenic concentration greater than or equal to 1 µg/L. This compares to about 7 percent of New Hampshire bedrock groundwater having at least a 50 percent chance of containing an arsenic concentration equaling or exceeding 5 µg/L and about 5 percent of the State having at least a 50 percent chance for its bedrock groundwater to contain concentrations at or above 10 µg/L. The southeastern counties of Merrimack, Strafford, Hillsborough, and Rockingham have the greatest potential for having arsenic concentrations above 5 and 10 µg/L in bedrock groundwater. Significant predictors of arsenic in groundwater from bedrock aquifers for all three thresholds analyzed included geologic, geochemical, land use, hydrologic, topographic, and demographic factors. Among the three thresholds evaluated, there were some differences in explanatory variables, but many variables were the same. More than 250 individual predictor variables were assembled for this study and tested as potential predictor variables for the models. More than 1,700 individual measurements of arsenic concentration from a combination of public and private water-supply wells served as the dependent (or predicted) variable in the models. The statewide maps generated by the probability models are not designed to predict arsenic concentration in any single well, but they are expected to provide useful information in areas of the State that currently contain little to no data on arsenic concentration. They also may aid in resource decision making, in determining potential risk for private
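    The probability calculation behind a fitted multivariate logistic regression is straightforward to sketch. The intercept, coefficients, and predictors below are hypothetical illustrations, not values from the USGS models.

```python
import math

def exceedance_probability(intercept, coefs, predictors):
    """P(arsenic >= threshold) from a fitted logistic regression:
    p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    eta = intercept + sum(b * x for b, x in zip(coefs, predictors))
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical coefficients and (scaled) predictors, e.g. a bedrock-unit
# indicator and well depth; these are not the published model's values.
p = exceedance_probability(-1.2, [0.8, 0.5], [1.0, 0.6])
print(round(p, 3))  # ~0.475
```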

  19. Estimating present day extreme water level exceedance probabilities around the coastline of Australia: tropical cyclone-induced storm surges

    NASA Astrophysics Data System (ADS)

    Haigh, Ivan D.; MacPherson, Leigh R.; Mason, Matthew S.; Wijeratne, E. M. S.; Pattiaratchi, Charitha B.; Crompton, Ryan P.; George, Steve

    2014-01-01

    The incidence of major storm surges in the last decade has dramatically emphasized the immense destructive capability of extreme water level events, particularly when driven by severe tropical cyclones. Given this risk, it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood and erosion management, engineering design, and future land-use planning, and to ensure that the risks of catastrophic structural failure due to under-design, or of expensive waste due to over-design, are minimised. Australia has a long history of coastal flooding from tropical cyclones. Using a novel integration of two modeling techniques, this paper provides the first estimates of present day extreme water level exceedance probabilities around the whole coastline of Australia, and the first estimates that combine the influence of astronomical tides, storm surges generated by both extra-tropical and tropical cyclones, and seasonal and inter-annual variations in mean sea level. Initially, an analysis of tide gauge records was used to assess the characteristics of tropical cyclone-induced surges around Australia. However, given the dearth (temporal and spatial) of information around much of the coastline, and therefore the inability of these gauge records to adequately describe the regional climatology, an observationally based stochastic tropical cyclone model was developed to synthetically extend the tropical cyclone record to 10,000 years. Wind and pressure fields derived for these synthetically generated events were then used to drive a hydrodynamic model of the Australian continental shelf region, with annual maximum water levels extracted to estimate exceedance probabilities around the coastline. To validate this methodology, selected historic storm surge events were simulated and the resultant storm surges compared with gauge records.
Tropical cyclone induced exceedance probabilities have been combined with

  20. Dynamic cost risk estimation and budget misspecification

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Fox, G.; Habib-Agahi, H.

    2003-01-01

    Cost risk for new technology development is estimated by explicit stochastic processes. Monte Carlo simulation is used to propagate technology development activity budget changes during the technology development cycle.

  1. Estimating probability densities from short samples: A parametric maximum likelihood approach

    NASA Astrophysics Data System (ADS)

    Dudok de Wit, T.; Floriani, E.

    1998-10-01

    A parametric method similar to autoregressive spectral estimators is proposed to determine the probability density function (PDF) of a random set. The method proceeds by maximizing the likelihood of the PDF, yielding estimates that perform as well in the tails as in the bulk of the distribution. It is therefore well suited for the analysis of short sets drawn from smooth PDFs and stands out by the simplicity of its computational scheme. Its advantages and limitations are discussed.
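    The principle, choosing the parameters that maximize the likelihood of the observed short sample, can be sketched for the simplest case of a one-parameter exponential PDF; the paper applies the same idea to a richer parametric family. The sample below is simulated for illustration.

```python
import math
import random

random.seed(42)
sample = [random.expovariate(2.0) for _ in range(30)]  # short sample, true rate 2

def log_likelihood(rate, data):
    """Log-likelihood of the exponential PDF rate * exp(-rate * x)."""
    return sum(math.log(rate) - rate * x for x in data)

# Maximize the likelihood over a coarse parameter grid ...
grid = [0.1 * k for k in range(1, 100)]
rate_hat = max(grid, key=lambda r: log_likelihood(r, sample))

# ... and compare with the closed-form MLE, 1 / sample mean.
closed_form = len(sample) / sum(sample)
print(rate_hat, round(closed_form, 2))
```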

  2. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    NASA Astrophysics Data System (ADS)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market; the aggregate price can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
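    One way a family of such contracts yields a PDF: prices of exceedance contracts at several thresholds trace out a survival function, and successive differences give the implied probability mass in each interval. The prices below are illustrative, not actual Intrade quotes.

```python
# Illustrative exceedance-contract prices (not real market data):
# each price is the market's P(anomaly > threshold).
thresholds = [0.45, 0.55, 0.65, 0.75, 0.85]   # degrees C
prices = [0.95, 0.80, 0.45, 0.15, 0.03]

# Exceedance prices form a survival function; differencing adjacent
# prices yields the consensus probability mass in each interval.
mass = [prices[i] - prices[i + 1] for i in range(len(prices) - 1)]
for lo, hi, m in zip(thresholds, thresholds[1:], mass):
    print(f"P({lo} < anomaly <= {hi}) = {m:.2f}")
```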

  3. Simple indicator kriging for estimating the probability of incorrectly delineating hazardous areas in a contaminated site

    SciTech Connect

    Juang, K.W.; Lee, D.Y.

    1998-09-01

    The probability of incorrectly delineating hazardous areas in a contaminated site is very important for decision-makers because it indicates the magnitude of confidence that decision-makers have in determining areas in need of remediation. In this study, simple indicator kriging (SIK) was used to estimate the probability of incorrectly delineating hazardous areas in a heavy metal-contaminated site, which is located at Taoyuan, Taiwan, and is about 10 ha in area. In the procedure, the values 0 and 1 were assigned to be the stationary means of the indicator codes in the SIK model to represent two hypotheses, hazardous and safe, respectively. The spatial distribution of the conditional probability of heavy metal concentrations lower than a threshold, given each hypothesis, was estimated using SIK. Then, the probabilities of false positives ({alpha}) (i.e., the probability of declaring a location hazardous when it is not) and false negatives ({beta}) (i.e., the probability of declaring a location safe when it is not) in delineating hazardous areas for the heavy metal-contaminated site could be obtained. The spatial distribution of the probabilities of false positives and false negatives could help in delineating hazardous areas based on a tolerable probability level of incorrect delineation. In addition, delineation decisions that account for the cost of remediation, hazards in the environment, and hazards to human health can be based on the minimum values of {alpha} and {beta}. The results suggest that the proposed SIK procedure is useful for decision-makers who need to delineate hazardous areas in a heavy metal-contaminated site.

  4. Uranium mill tailings and risk estimation

    SciTech Connect

    Marks, S.

    1984-04-01

    Work done in estimating projected health effects for persons exposed to mill tailings at vicinity properties is described. The effect of the reassessment of exposures at Hiroshima and Nagasaki on the risk estimates for gamma radiation is discussed. A presentation of current results in the epidemiological study of Hanford workers is included. 2 references. (ACR)

  5. Estimating site occupancy and detection probability parameters for meso- and large mammals in a coastal ecosystem

    USGS Publications Warehouse

    O'Connell, A.F., Jr.; Talancy, N.W.; Bailey, L.L.; Sauer, J.R.; Cook, R.; Gilbert, A.T.

    2006-01-01

    Large-scale, multispecies monitoring programs are widely used to assess changes in wildlife populations but they often assume constant detectability when documenting species occurrence. This assumption is rarely met in practice because animal populations vary across time and space. As a result, detectability of a species can be influenced by a number of physical, biological, or anthropogenic factors (e.g., weather, seasonality, topography, biological rhythms, sampling methods). To evaluate some of these influences, we estimated site occupancy rates using species-specific detection probabilities for meso- and large terrestrial mammal species on Cape Cod, Massachusetts, USA. We used model selection to assess the influence of different sampling methods and major environmental factors on our ability to detect individual species. Remote cameras detected the most species (9), followed by cubby boxes (7) and hair traps (4) over a 13-month period. Estimated site occupancy rates were similar among sampling methods for most species when detection probabilities exceeded 0.15, but we question estimates obtained from methods with detection probabilities between 0.05 and 0.15, and we consider methods with lower probabilities unacceptable for occupancy estimation and inference. Estimated detection probabilities can be used to accommodate variation in sampling methods, which allows for comparison of monitoring programs using different protocols. Vegetation and seasonality produced species-specific differences in detectability and occupancy, but differences were not consistent within or among species, which suggests that our results should be considered in the context of local habitat features and life history traits for the target species. We believe that site occupancy is a useful state variable and suggest that monitoring programs for mammals using occupancy data consider detectability prior to making inferences about species distributions or population change.

  6. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    USGS Publications Warehouse

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. 
The high constant adult-survival probabilities estimated

  7. Empirical estimation of the conditional probability of natech events within the United States.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Aguirra, Gloria Andrea

    2011-06-01

    Natural disasters are the cause of a sizeable number of hazmat releases, referred to as "natechs." An enhanced understanding of natech probability, allowing for predictions of natech occurrence, is an important step in determining how industry and government should mitigate natech risk. This study quantifies the conditional probabilities of natechs at TRI/RMP and SIC 1311 facilities given the occurrence of hurricanes, earthquakes, tornadoes, and floods. During hurricanes, a higher probability of releases was observed due to storm surge (7.3 releases per 100 TRI/RMP facilities exposed vs. 6.2 for SIC 1311) compared to category 1-2 hurricane winds (5.6 TRI, 2.6 SIC 1311). Logistic regression confirms the statistical significance of the greater propensity for releases at RMP/TRI facilities, and during some hurricanes, when controlling for hazard zone. The probability of natechs at TRI/RMP facilities during earthquakes increased from 0.1 releases per 100 facilities at MMI V to 21.4 at MMI IX. The probability of a natech at TRI/RMP facilities within 25 miles of a tornado was small (∼0.025 per 100 facilities), reflecting the limited area directly affected by tornadoes. Areas inundated during flood events had a probability of 1.1 releases per 100 facilities but demonstrated widely varying natech occurrence during individual events, indicating that factors not quantified in this study such as flood depth and speed are important for predicting flood natechs. These results can inform natech risk analysis, aid government agencies responsible for planning response and remediation after natural disasters, and should be useful in raising awareness of natech risk within industry. PMID:21231945

  8. Estimating site occupancy rates when detection probabilities are less than one

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Lachman, G.B.; Droege, S.; Royle, J. Andrew; Langtimm, C.A.

    2002-01-01

    Nondetection of a species at a site does not imply that the species is absent unless the probability of detection is 1. We propose a model and likelihood-based method for estimating site occupancy rates when detection probabilities are less than 1; simulations suggest the estimator performs well when detection probabilities are moderate (e.g., >0.3). We estimated site occupancy rates for two anuran species at 32 wetland sites in Maryland, USA, from data collected during 2000 as part of an amphibian monitoring program, Frogwatch USA. Site occupancy rates were estimated as 0.49 for American toads (Bufo americanus), a 44% increase over the proportion of sites at which they were actually observed, and as 0.85 for spring peepers (Pseudacris crucifer), slightly above the observed proportion of 0.83.
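    A minimal version of this likelihood treats an all-zero detection history as either "unoccupied" or "occupied but never detected." The detection histories below are made up for illustration, and a crude grid search stands in for a proper numerical optimizer.

```python
import math

# Detection histories from repeat visits: 1 = species detected.
# These five sites / three visits are made-up illustration data.
histories = [[1, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0], [0, 0, 0]]

def log_likelihood(psi, p, data):
    """Occupancy likelihood in the spirit of this model: a site with
    detections is occupied; an all-zero history is either unoccupied,
    or occupied but missed on every visit."""
    ll = 0.0
    for h in data:
        d, n = sum(h), len(h)
        if d > 0:
            ll += math.log(psi * p**d * (1 - p)**(n - d))
        else:
            ll += math.log(psi * (1 - p)**n + (1 - psi))
    return ll

# Crude grid-search MLE over (occupancy psi, detection p):
grid = [k / 100 for k in range(1, 100)]
psi_hat, p_hat = max(((a, b) for a in grid for b in grid),
                     key=lambda ab: log_likelihood(ab[0], ab[1], histories))
print(psi_hat, p_hat)  # occupancy estimate exceeds the naive 3/5 = 0.6
```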

  9. Submarine tower escape decompression sickness risk estimation.

    PubMed

    Loveman, G A M; Seddon, E M; Thacker, J C; Stansfield, M R; Jurd, K M

    2014-01-01

    Actions to enhance survival in a distressed submarine (DISSUB) scenario may be guided in part by knowledge of the likely risk of decompression sickness (DCS) should the crew attempt tower escape. A mathematical model for DCS risk estimation has been calibrated against DCS outcome data from 3,738 exposures of either men or goats to raised pressure. Body mass was used to scale DCS risk. The calibration data included more than 1,000 actual or simulated submarine escape exposures and no exposures with substantial staged decompression. Cases of pulmonary barotrauma were removed from the calibration data. The calibrated model was used to estimate the likelihood of DCS occurrence following submarine escape from the United Kingdom Royal Navy tower escape system. Where internal DISSUB pressure remains at about 0.1 MPa, escape from DISSUB depths < 200 meters is estimated to have DCS risk < 6%. Saturation at raised DISSUB pressure markedly increases risk, with > 60% DCS risk predicted for a 200-meter escape from saturation at 0.21 MPa. Using the calibrated model to predict DCS for direct ascent from saturation gives similar risk estimates to other published models. PMID:25109085

  10. On the Estimation of Detection Probabilities for Sampling Stream-Dwelling Fishes.

    SciTech Connect

    Peterson, James T.

    1999-11-01

    To examine the adequacy of fish probability of detection estimates, I examined distributional properties of survey and monitoring data for bull trout (Salvelinus confluentus), brook trout (Salvelinus fontinalis), westslope cutthroat trout (Oncorhynchus clarki lewisi), chinook salmon parr (Oncorhynchus tshawytscha), and steelhead/redband trout (Oncorhynchus mykiss spp.), from 178 streams in the Interior Columbia River Basin. Negative binomial dispersion parameters varied considerably among species and streams, but were significantly (P<0.05) positively related to fish density. Across streams, the variances in fish abundances differed greatly among species and indicated that the data for all species were overdispersed with respect to the Poisson (i.e., the variances exceeded the means). This significantly affected Poisson probability of detection estimates, which were the highest across species and were, on average, 3.82, 2.66, and 3.47 times greater than baseline values. Required sample sizes for species detection at the 95% confidence level were also lowest for the Poisson, which underestimated sample size requirements an average of 72% across species. Negative binomial and Poisson-gamma probability of detection and sample size estimates were more accurate than the Poisson and generally less than 10% from baseline values. My results indicate the Poisson and binomial assumptions often are violated, which results in probability of detection estimates that are biased high and sample size estimates that are biased low. To increase the accuracy of these estimates, I recommend that future studies use predictive distributions that can incorporate multiple sources of uncertainty or excess variance and that all distributional assumptions be explicitly tested.
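    The core comparison, detection probability as one minus the zero-count probability under each distribution, can be sketched directly. The mean count and dispersion values below are hypothetical, not estimates from the Columbia River Basin data.

```python
import math

def p_detect_poisson(mu):
    """P(count > 0) for one sample unit under a Poisson distribution."""
    return 1.0 - math.exp(-mu)

def p_detect_negbin(mu, k):
    """P(count > 0) under a negative binomial with mean mu and
    dispersion parameter k; small k means strong overdispersion."""
    return 1.0 - (k / (k + mu)) ** k

mu, k = 2.0, 0.5  # hypothetical mean count and dispersion, not fitted values
print(round(p_detect_poisson(mu), 3))    # 0.865: Poisson is optimistic
print(round(p_detect_negbin(mu, k), 3))  # 0.553: overdispersion lowers detection

# Sample units needed for a 95% chance of at least one detection:
for p in (p_detect_poisson(mu), p_detect_negbin(mu, k)):
    print(math.ceil(math.log(0.05) / math.log(1.0 - p)))
```

As the paper found, the Poisson assumption inflates detection probability and understates the required sampling effort.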

  11. Population-based absolute risk estimation with survey data.

    PubMed

    Kovalchik, Stephanie A; Pfeiffer, Ruth M

    2014-04-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
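    On a discrete time grid, absolute risk accumulates the event hazard weighted by all-cause survival, so later events are discounted by the competing risk. The constant annual hazards below are hypothetical stand-ins for the model's individualized, piecewise hazards and are not NHANES-based estimates.

```python
def absolute_risk(h_event, h_competing, years):
    """Absolute risk of the event of interest over `years` on a discrete
    annual grid: sum over t of (survival to year t) * (event hazard),
    where survival is depleted by both the event and its competitor."""
    surv, risk = 1.0, 0.0
    for _ in range(years):
        risk += surv * h_event
        surv *= 1.0 - h_event - h_competing
    return risk

# Hypothetical constant annual hazards for illustration:
print(round(absolute_risk(h_event=0.01, h_competing=0.02, years=10), 4))  # ~0.0875
```

Note that the result is below the naive 10 x 0.01 = 0.10 because the competing hazard removes individuals before the event can occur.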

  12. Effects of prior detections on estimates of detection probability, abundance, and occupancy

    USGS Publications Warehouse

    Riddle, Jason D.; Mordecai, Rua S.; Pollock, Kenneth H.; Simons, Theodore R.

    2010-01-01

    Survey methods that account for detection probability often require repeated detections of individual birds or repeated visits to a site to conduct counts or collect presence-absence data. Initial encounters with individual species or individuals of a species could influence detection probabilities for subsequent encounters. For example, observers may be more likely to redetect a species or individual once they are aware of the presence of that species or individual at a particular site. Not accounting for these effects could result in biased estimators of detection probability, abundance, and occupancy. We tested for effects of prior detections in three data sets that differed dramatically by species, geographic location, and method of counting birds. We found strong support (AIC weights from 83% to 100%) for models that allowed for the effects of prior detections. These models produced estimates of detection probability, abundance, and occupancy that differed substantially from those produced by models that ignored the effects of prior detections. We discuss the consequences of the effects of prior detections on estimation for several sampling methods and provide recommendations for avoiding these effects through survey design or by modeling them when they cannot be avoided.

  13. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  14. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    PubMed

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programed theories. If programed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine which is the case, a special type of binomial model based on the generalized Pareto distribution is applied to data on Japanese centenarians. From the results, it is concluded that the upper limit of the lifetime probability distribution in the Japanese population is an estimated 123 years. PMID:26362439
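    A finite upper limit arises when the fitted generalized Pareto shape parameter is negative, in which case the endpoint is threshold minus scale over shape. The parameter values below are illustrative choices that reproduce an endpoint of 123 years, not the paper's fitted values.

```python
def gpd_upper_endpoint(threshold, scale, shape):
    """Finite upper endpoint of a generalized Pareto distribution,
    threshold - scale / shape, defined only when shape < 0 (the case
    corresponding to a hard limit on lifetime)."""
    if shape >= 0:
        raise ValueError("endpoint is infinite for shape >= 0")
    return threshold - scale / shape

# Illustrative parameters for exceedances above age 100, chosen to
# reproduce an endpoint of 123 years (not the paper's fitted values):
print(gpd_upper_endpoint(threshold=100.0, scale=5.75, shape=-0.25))  # 123.0
```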

  15. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    NASA Technical Reports Server (NTRS)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).

  16. Using counts to simultaneously estimate abundance and detection probabilities in a salamander community

    USGS Publications Warehouse

    Dodd, C.K., Jr.; Dorazio, R.M.

    2004-01-01

A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation/soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.

  17. Probability discounting of gains and losses: implications for risk attitudes and impulsivity.

    PubMed

    Shead, N Will; Hodgins, David C

    2009-07-01

Sixty college students performed three discounting tasks: probability discounting of gains, probability discounting of losses, and delay discounting of gains. Each task used an adjusting-amount procedure, and participants' choices affected the amount and timing of their remuneration for participating. Both group and individual discounting functions from all three procedures were well fitted by hyperboloid discounting functions. A negative correlation between the probability discounting of gains and losses was observed, consistent with the idea that individuals' choices on probability discounting tasks reflect their general attitude towards risk, regardless of whether the outcomes are gains or losses. This finding further suggests that risk attitudes reflect the weighting an individual gives to the lowest-valued outcome (e.g., getting nothing when the probabilistic outcome is a gain or actually losing when the probabilistic outcome is a loss). According to this view, risk-aversion indicates a tendency to overweight the lowest-valued outcome, whereas risk-seeking indicates a tendency to underweight it. Neither probability discounting of gains nor probability discounting of losses was reliably correlated with discounting of delayed gains, a result that is inconsistent with the idea that probability discounting and delay discounting both reflect a general tendency towards impulsivity. PMID:20119519
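The hyperboloid discounting form that fit these data can be sketched directly; the parameter values below are illustrative, not the fitted ones from the study:

```python
import math

def hyperboloid_value(amount, p, h, s):
    """Hyperboloid probability discounting (Rachlin-style form):
    V = A / (1 + h*theta)**s, where theta = (1 - p) / p is the odds
    against the probabilistic outcome; h and s are fitted parameters."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta) ** s

# With h = s = 1 (illustrative), a 50% chance at $100 is valued at $50.
v = hyperboloid_value(100.0, p=0.5, h=1.0, s=1.0)
print(round(v, 1))  # 50.0
```

A steeper h (or larger s) produces sharper discounting, i.e., greater risk aversion when the outcomes are gains.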

  18. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    USGS Publications Warehouse

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
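The exponential (Poisson) return-time model the analysis favors can be sketched with a plain maximum-likelihood fit; the inter-event times below are hypothetical, and the sketch omits the age-dating uncertainty and open intervals that the full method accounts for:

```python
import math

def exp_mle_return_time(intervals):
    """Maximum-likelihood fit of an exponential return-time model
    (Poisson occurrence): the mean return time is the sample mean of
    inter-event times, and AIC = 2k - 2*lnL with k = 1 parameter."""
    n = len(intervals)
    mean_rt = sum(intervals) / n
    lam = 1.0 / mean_rt
    loglik = n * math.log(lam) - lam * sum(intervals)
    return mean_rt, 2.0 * 1 - 2.0 * loglik

# Hypothetical inter-deposit times (kyr) from a dated deposit sequence.
mean_rt, aic = exp_mle_return_time([12.0, 8.0, 15.0, 10.0, 9.0])
print(round(mean_rt, 1))  # 10.8
```

The AIC value returned here is what would be compared against competing renewal models (e.g., quasiperiodic ones) in the model-selection step the abstract describes.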

  19. Estimating the absolute position of a mobile robot using position probability grids

    SciTech Connect

    Burgard, W.; Fox, D.; Hennig, D.; Schmidt, T.

    1996-12-31

In order to re-use existing models of the environment mobile robots must be able to estimate their position and orientation in such models. Most of the existing methods for position estimation are based on special purpose sensors or aim at tracking the robot's position relative to the known starting point. This paper describes the position probability grid approach to estimating the robot's absolute position and orientation in a metric model of the environment. Our method is designed to work with standard sensors and is independent of any knowledge about the starting point. It is a Bayesian approach based on certainty grids. In each cell of such a grid we store the probability that this cell refers to the current position of the robot. These probabilities are obtained by integrating the likelihoods of sensor readings over time. Results described in this paper show that our technique is able to reliably estimate the position of a robot in complex environments. Our approach has proven to be robust with respect to inaccurate environmental models, noisy sensors, and ambiguous situations.
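The per-cell update the abstract describes is a standard Bayes-filter step; a minimal 1-D sketch with toy likelihoods (not the authors' sensor model):

```python
def grid_update(prior, likelihood):
    """One Bayes-filter step on a position probability grid: multiply each
    cell's prior probability by the likelihood of the current sensor
    reading given that the robot occupies that cell, then renormalize."""
    posterior = [p * l for p, l in zip(prior, likelihood)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Toy 1-D grid of 4 cells with a uniform prior; the sensor reading is
# assumed twice as likely if the robot is in cell 2 (hypothetical values).
posterior = grid_update([0.25, 0.25, 0.25, 0.25], [1.0, 1.0, 2.0, 1.0])
print(posterior)
```

Repeating the update over successive readings is what "integrating the likelihoods of sensor readings over time" amounts to: probability mass concentrates in cells consistent with the whole observation history.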

  20. Estimating the probability density of the scattering cross section from Rayleigh scattering experiments

    NASA Astrophysics Data System (ADS)

    Hengartner, Nicolas; Talbot, Lawrence; Shepherd, Ian; Bickel, Peter

    1995-06-01

An important parameter in the experimental study of dynamics of combustion is the probability distribution of the effective Rayleigh scattering cross section. This cross section cannot be observed directly. Instead, pairs of measurements of laser intensities and Rayleigh scattering counts are observed. Our aim is to provide estimators for the probability density function of the scattering cross section from such measurements. The probability distribution is derived first for the number of recorded photons in the Rayleigh scattering experiment. In this approach the laser intensity measurements are treated as known covariates. This departs from the usual practice of normalizing the Rayleigh scattering counts by the laser intensities. For distributions supported on finite intervals, two estimators are proposed, one based on expansion of the density in

  1. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  2. Probability based remaining capacity estimation using data-driven and neural network model

    NASA Astrophysics Data System (ADS)

    Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai

    2016-05-01

Since lithium-ion batteries are assembled in large numbers into packs and are complex electrochemical devices, their monitoring and safety are key concerns for applications of battery technology. An accurate estimation of battery remaining capacity is crucial for optimizing vehicle control, preventing the battery from over-charging and over-discharging, and ensuring safety during its service life. The remaining capacity estimation of a battery includes the estimation of state-of-charge (SOC) and state-of-energy (SOE). In this work, a probability based adaptive estimator is presented to obtain accurate and reliable estimation results for both SOC and SOE. For the SOC estimation, an nth-order RC equivalent circuit model is combined with an electrochemical model to obtain more accurate voltage prediction results. For the SOE estimation, a sliding window neural network model is proposed to investigate the relationship between the terminal voltage and the model inputs. To verify the accuracy and robustness of the proposed model and estimation algorithm, experiments under different dynamic operation current profiles are performed on commercial 1665130-type lithium-ion batteries. The results illustrate that accurate and robust estimation can be obtained by the proposed method.

  3. Estimating migratory connectivity of birds when re-encounter probabilities are heterogeneous

    USGS Publications Warehouse

    Cohen, Emily B.; Hostelter, Jeffrey A.; Royle, J. Andrew; Marra, Peter P.

    2014-01-01

    Understanding the biology and conducting effective conservation of migratory species requires an understanding of migratory connectivity – the geographic linkages of populations between stages of the annual cycle. Unfortunately, for most species, we are lacking such information. The North American Bird Banding Laboratory (BBL) houses an extensive database of marking, recaptures and recoveries, and such data could provide migratory connectivity information for many species. To date, however, few species have been analyzed for migratory connectivity largely because heterogeneous re-encounter probabilities make interpretation problematic. We accounted for regional variation in re-encounter probabilities by borrowing information across species and by using effort covariates on recapture and recovery probabilities in a multistate capture–recapture and recovery model. The effort covariates were derived from recaptures and recoveries of species within the same regions. We estimated the migratory connectivity for three tern species breeding in North America and over-wintering in the tropics, common (Sterna hirundo), roseate (Sterna dougallii), and Caspian terns (Hydroprogne caspia). For western breeding terns, model-derived estimates of migratory connectivity differed considerably from those derived directly from the proportions of re-encounters. Conversely, for eastern breeding terns, estimates were merely refined by the inclusion of re-encounter probabilities. In general, eastern breeding terns were strongly connected to eastern South America, and western breeding terns were strongly linked to the more western parts of the nonbreeding range under both models. Through simulation, we found this approach is likely useful for many species in the BBL database, although precision improved with higher re-encounter probabilities and stronger migratory connectivity. We describe an approach to deal with the inherent biases in BBL banding and re-encounter data to demonstrate

  4. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  5. [Estimation of risk areas for hepatitis A].

    PubMed

    Braga, Ricardo Cerqueira Campos; Valencia, Luís Iván Ortiz; Medronho, Roberto de Andrade; Escosteguy, Claudia Caminha

    2008-08-01

This study estimated hepatitis A risk areas in a region of Duque de Caxias, Rio de Janeiro State, Brazil. A cross-sectional study consisting of a hepatitis A serological survey and a household survey was conducted in 19 census tracts. Of these, 11 tracts were selected and 1,298 children from one to ten years of age were included in the study. Geostatistical techniques allowed modeling the spatial continuity of hepatitis A, non-use of filtered drinking water, time since installation of running water, and number of water taps per household, and their spatial estimation through ordinary and indicator kriging. Adjusted models for the outcome and socioeconomic variables were isotropic; risk maps were constructed; cross-validation of the four models was satisfactory. Spatial estimation using the kriging method detected areas with increased risk of hepatitis A, independently of the urban administrative area in which the census tracts were located. PMID:18709215

  6. How should detection probability be incorporated into estimates of relative abundance?

    USGS Publications Warehouse

    MacKenzie, D.I.; Kendall, W.L.

    2002-01-01

Determination of the relative abundance of two populations, separated by time or space, is of interest in many ecological situations. We focus on two estimators of relative abundance, which assume that the probability that an individual is detected at least once in the survey is either equal or unequal for the two populations. We present three methods for incorporating the collected information into our inference. The first method, proposed previously, is a traditional hypothesis test for evidence that detection probabilities are unequal. However, we feel that, a priori, it is more likely that detection probabilities are actually different; hence, the burden of proof should be shifted, requiring evidence that detection probabilities are practically equivalent. The second method we present, equivalence testing, is one approach to doing so. Third, we suggest that model averaging could be used by combining the two estimators according to derived model weights. These differing approaches are applied to a mark-recapture experiment on Nuttall's cottontail rabbit (Sylvilagus nuttallii) conducted in central Oregon during 1974 and 1975, which has been previously analyzed by other authors.
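The third approach, model averaging, combines the two estimators using information-criterion weights; a minimal sketch with hypothetical AIC values and hypothetical relative-abundance estimates:

```python
import math

def akaike_weights(aics):
    """Akaike model weights: w_i proportional to exp(-delta_i / 2),
    where delta_i is each model's AIC minus the minimum AIC."""
    best = min(aics)
    raw = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical AIC values for the equal- and unequal-detection estimators,
# and hypothetical relative-abundance estimates from each model.
w_equal, w_unequal = akaike_weights([100.0, 102.0])
averaged = w_equal * 1.8 + w_unequal * 2.1
print(round(w_equal, 3), round(averaged, 3))
```

The averaged estimate sits between the two candidates, pulled toward the better-supported model; this is the "combining the two estimators according to derived model weights" idea in miniature.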

  7. Estimating conditional probability of volcanic flows for forecasting event distribution and making evacuation decisions

    NASA Astrophysics Data System (ADS)

    Stefanescu, E. R.; Patra, A.; Sheridan, M. F.; Cordoba, G.

    2012-04-01

In this study we propose a conditional probability framework for Galeras volcano, which is one of the most active volcanoes in the world. Nearly 400,000 people currently live near the volcano; 10,000 of them reside within the zone of high volcanic hazard. Pyroclastic flows pose a major hazard for this population. Some of the questions we try to answer when studying conditional probabilities for volcanic hazards are: "Should a village be evacuated and villagers moved to a different location?", "Should we construct a road along this valley or along a different one?", "Should this university be evacuated?" Here, we try to identify critical regions such as villages, infrastructure, cities, and universities to determine their relative probability of inundation in the case of a volcanic eruption. In this study, a set of numerical simulations was performed using the computational tool TITAN2D, which simulates granular flow over a digital representation of the natural terrain. The particular choice from among the methods described below can be based on the amount of information necessary in the evacuation decision and on the complexity of the analysis required in taking such a decision. A set of 4200 TITAN2D runs was performed for several different locations so that the area of all probable vents is covered. The output of the geophysical model provides a flow map which contains the maximum flow depth over time. Frequency approach - In estimating the conditional probability of volcanic flows we define two discrete random variables (r.v.) A and B, where P(A=1) and P(B=1) represent the probability of having a flow at locations A and B, respectively. For this analysis we choose two critical locations identified by their UTM coordinates. The flow map is then used in identifying, at the pixel level, flow or non-flow at the two locations. By counting the number of times there is flow or non-flow, we are able to find the marginal probabilities along with the joint probability associated with an
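The frequency approach described above reduces to counting outcomes across simulation runs; a minimal sketch with hypothetical flow/no-flow indicators:

```python
def conditional_flow_probability(flow_a, flow_b):
    """Frequency estimate of P(flow at A | flow at B): the fraction of
    simulation runs inundating location B that also inundate location A."""
    joint = sum(1 for a, b in zip(flow_a, flow_b) if a and b)
    return joint / sum(flow_b)

# Hypothetical flow (1) / non-flow (0) indicators for 8 runs at two
# pixel locations, read off the maximum-flow-depth maps.
flow_a = [1, 0, 1, 1, 0, 0, 1, 0]
flow_b = [1, 0, 1, 0, 0, 1, 1, 0]
print(conditional_flow_probability(flow_a, flow_b))  # 0.75
```

The joint count over the run count gives P(A=1, B=1), and dividing by the marginal count for B yields the conditional probability used to compare critical locations.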

  8. Limitation of Inverse Probability-of-Censoring Weights in Estimating Survival in the Presence of Strong Selection Bias

    PubMed Central

    Howe, Chanelle J.; Cole, Stephen R.; Chmiel, Joan S.; Muñoz, Alvaro

    2011-01-01

    In time-to-event analyses, artificial censoring with correction for induced selection bias using inverse probability-of-censoring weights can be used to 1) examine the natural history of a disease after effective interventions are widely available, 2) correct bias due to noncompliance with fixed or dynamic treatment regimens, and 3) estimate survival in the presence of competing risks. Artificial censoring entails censoring participants when they meet a predefined study criterion, such as exposure to an intervention, failure to comply, or the occurrence of a competing outcome. Inverse probability-of-censoring weights use measured common predictors of the artificial censoring mechanism and the outcome of interest to determine what the survival experience of the artificially censored participants would be had they never been exposed to the intervention, complied with their treatment regimen, or not developed the competing outcome. Even if all common predictors are appropriately measured and taken into account, in the context of small sample size and strong selection bias, inverse probability-of-censoring weights could fail because of violations in assumptions necessary to correct selection bias. The authors used an example from the Multicenter AIDS Cohort Study, 1984–2008, regarding estimation of long-term acquired immunodeficiency syndrome-free survival to demonstrate the impact of violations in necessary assumptions. Approaches to improve correction methods are discussed. PMID:21289029
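The weighting itself is simple; a minimal sketch (with hypothetical fitted probabilities) showing how small uncensoring probabilities inflate the weights, which is the instability the article examines:

```python
def ipc_weights(p_uncensored):
    """Inverse probability-of-censoring weights: each subject remaining
    uncensored is up-weighted by 1 / P(uncensored | covariates), so the
    weighted sample stands in for the full cohort. Probabilities near
    zero, i.e. strong selection, produce extreme weights."""
    return [1.0 / p for p in p_uncensored]

# Hypothetical fitted probabilities of remaining uncensored.
print(ipc_weights([0.8, 0.5, 0.05]))  # ~[1.25, 2.0, 20.0]
```

In small samples a handful of such extreme weights can dominate the weighted survival estimate, which is why the assumptions behind the weights matter most precisely when selection bias is strong.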

  9. PIGS: improved estimates of identity-by-descent probabilities by probabilistic IBD graph sampling.

    PubMed

    Park, Danny S; Baran, Yael; Hormozdiari, Farhad; Eng, Celeste; Torgerson, Dara G; Burchard, Esteban G; Zaitlen, Noah

    2015-01-01

Identifying segments in the genome of different individuals that are identical-by-descent (IBD) is a fundamental element of genetics. IBD data are used for numerous applications including demographic inference, heritability estimation, and mapping disease loci. Simultaneous detection of IBD over multiple haplotypes has proven to be computationally difficult. To overcome this, many state-of-the-art methods estimate the probability of IBD between each pair of haplotypes separately. While computationally efficient, these methods fail to leverage the clique structure of IBD, resulting in less powerful IBD identification, especially for small IBD segments. PMID:25860540

  10. Weighted least square estimates of the parameters of a model of survivorship probabilities.

    PubMed

    Mitra, S

    1987-06-01

    "A weighted regression has been fitted to estimate the parameters of a model involving functions of survivorship probability and age. Earlier, the parameters were estimated by the method of ordinary least squares and the results were very encouraging. However, a multiple regression equation passing through the origin has been found appropriate for the present model from statistical consideration. Fortunately, this method, while methodologically more sophisticated, has a slight edge over the former as evidenced by the respective measures of reproducibility in the model and actual life tables selected for this study." PMID:12281212

  11. Survival probabilities with time-dependent treatment indicator: quantities and non-parametric estimators.

    PubMed

    Bernasconi, Davide Paolo; Rebora, Paola; Iacobelli, Simona; Valsecchi, Maria Grazia; Antolini, Laura

    2016-03-30

The 'landmark' and 'Simon and Makuch' non-parametric estimators of the survival function are commonly used to contrast the survival experience of time-dependent treatment groups in applications such as stem cell transplant versus chemotherapy in leukemia. However, the theoretical survival functions corresponding to the second approach were not clearly defined in the literature, and the use of the 'Simon and Makuch' estimator was criticized in the biostatistical community. Here, we review the 'landmark' approach, showing that it focuses on the average survival of patients conditional on being failure free and on the treatment status assessed at the landmark time. We argue that the 'Simon and Makuch' approach represents counterfactual survival probabilities where treatment status is forced to be fixed: the patient is thought of as under chemotherapy without the possibility of switching treatment, or as under transplant from the beginning of the follow-up. We argue that the 'Simon and Makuch' estimator leads to valid estimates only under the Markov assumption, which is however less likely to hold in practical applications. This motivates the development of a novel approach based on time rescaling, which leads to suitable estimates of the counterfactual probabilities in a semi-Markov process. The method is also extended to deal with a fixed landmark time of interest. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26503800

  12. A robust design mark-resight abundance estimator allowing heterogeneity in resighting probabilities

    USGS Publications Warehouse

    McClintock, B.T.; White, Gary C.; Burnham, K.P.

    2006-01-01

    This article introduces the beta-binomial estimator (BBE), a closed-population abundance mark-resight model combining the favorable qualities of maximum likelihood theory and the allowance of individual heterogeneity in sighting probability (p). The model may be parameterized for a robust sampling design consisting of multiple primary sampling occasions where closure need not be met between primary occasions. We applied the model to brown bear data from three study areas in Alaska and compared its performance to the joint hypergeometric estimator (JHE) and Bowden's estimator (BOWE). BBE estimates suggest heterogeneity levels were non-negligible and discourage the use of JHE for these data. Compared to JHE and BOWE, confidence intervals were considerably shorter for the AICc model-averaged BBE. To evaluate the properties of BBE relative to JHE and BOWE when sample sizes are small, simulations were performed with data from three primary occasions generated under both individual heterogeneity and temporal variation in p. All models remained consistent regardless of levels of variation in p. In terms of precision, the AICc model-averaged BBE showed advantages over JHE and BOWE when heterogeneity was present and mean sighting probabilities were similar between primary occasions. Based on the conditions examined, BBE is a reliable alternative to JHE or BOWE and provides a framework for further advances in mark-resight abundance estimation. ?? 2006 American Statistical Association and the International Biometric Society.

  13. Efficient estimation of contact probabilities from inter-bead distance distributions in simulated polymer chains

    NASA Astrophysics Data System (ADS)

    Meluzzi, Dario; Arya, Gaurav

    2015-02-01

The estimation of contact probabilities (CP) from conformations of simulated bead-chain polymer models is a key step in methods that aim to elucidate the spatial organization of chromatin from analysis of experimentally determined contacts between different genomic loci. Although CPs can be estimated simply by counting contacts between beads in a sample of simulated chain conformations, reliable estimation of small CPs through this approach requires a large number of conformations, which can be computationally expensive to obtain. Here we describe an alternative computational method for estimating relatively small CPs without requiring large samples of chain conformations. In particular, we estimate the CPs from functional approximations to the cumulative distribution function (cdf) of the inter-bead distance for each pair of beads. These cdf approximations are obtained by fitting the extended generalized lambda distribution (EGLD) to inter-bead distances determined from a sample of chain conformations, which are in turn generated by Monte Carlo simulations. We find that CPs estimated from fitted EGLD cdfs are significantly more accurate than CPs estimated using contact counts from samples of limited size, and are more precise with all sample sizes, permitting as much as a tenfold reduction in conformation sample size for chains of 200 beads and samples smaller than 10^5 conformations. This method of CP estimation thus has potential to accelerate computational efforts to elucidate the spatial organization of chromatin.

  14. Efficient estimation of contact probabilities from inter-bead distance distributions in simulated polymer chains.

    PubMed

    Meluzzi, Dario; Arya, Gaurav

    2015-02-18

The estimation of contact probabilities (CP) from conformations of simulated bead-chain polymer models is a key step in methods that aim to elucidate the spatial organization of chromatin from analysis of experimentally determined contacts between different genomic loci. Although CPs can be estimated simply by counting contacts between beads in a sample of simulated chain conformations, reliable estimation of small CPs through this approach requires a large number of conformations, which can be computationally expensive to obtain. Here we describe an alternative computational method for estimating relatively small CPs without requiring large samples of chain conformations. In particular, we estimate the CPs from functional approximations to the cumulative distribution function (cdf) of the inter-bead distance for each pair of beads. These cdf approximations are obtained by fitting the extended generalized lambda distribution (EGLD) to inter-bead distances determined from a sample of chain conformations, which are in turn generated by Monte Carlo simulations. We find that CPs estimated from fitted EGLD cdfs are significantly more accurate than CPs estimated using contact counts from samples of limited size, and are more precise with all sample sizes, permitting as much as a tenfold reduction in conformation sample size for chains of 200 beads and samples smaller than 10^5 conformations. This method of CP estimation thus has potential to accelerate computational efforts to elucidate the spatial organization of chromatin. PMID:25563926
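The count-based baseline that the abstract contrasts against can be sketched as follows; synthetic Gaussian distances (hypothetical mean and width) stand in for Monte Carlo conformations, and the paper's EGLD cdf fit would replace the raw count for small CPs:

```python
import random

def cp_by_counting(distances, cutoff):
    """Baseline contact-probability estimate: the fraction of sampled
    conformations whose inter-bead distance falls below the contact
    cutoff. For small CPs this needs very large samples to be reliable,
    which is the cost the fitted-cdf approach avoids."""
    return sum(1 for d in distances if d < cutoff) / len(distances)

# Synthetic stand-in for simulated inter-bead distances; the paper
# instead fits an EGLD to these distances and reads the CP off the
# fitted cdf evaluated at the cutoff.
random.seed(0)
distances = [random.gauss(10.0, 2.0) for _ in range(10000)]
cp = cp_by_counting(distances, cutoff=5.0)
print(cp)
```

With a true CP near 0.006, the counting estimate fluctuates noticeably even at 10^4 samples; evaluating a fitted parametric cdf at the cutoff gives the same quantity with far fewer conformations.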

  15. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated afterwards. In the first part of this research, the seismic risk is evaluated from the available data regarding the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is afterwards compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized in five typical structural types and represents 18.80 % of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected investigated financial data derived from different National Services responsible for the post-earthquake crisis management, concerning the repair/strengthening or replacement costs or other categories of costs for the rehabilitation of earthquake victims (construction and function of settlements for earthquake homeless, rent supports, demolitions, shorings), are used to determine the final total seismic risk factor.

  16. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-04-01

    The current seismic risk assessment is based on two distinct approaches, actual and probable, with the produced results subsequently cross-validated. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario draws on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece, according to the 2000-2001 census of the National Statistics Service of Greece (NSSG). The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the financial data collected from the different National Services responsible for post-earthquake crisis management, covering repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.
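
    The core loss computation described above — damage probability matrices combined with repair cost ratios over a categorized building stock — reduces to a simple expected-cost calculation. The damage states, probabilities, cost ratios, and replacement cost below are illustrative placeholders, not values from the Athens database:

```python
def expected_loss(dpm_row, cost_ratios, replacement_cost):
    """Expected repair cost for one building type at one intensity level.

    dpm_row: P(damage state | intensity) for each damage state (a DPM row).
    cost_ratios: repair cost of each damage state as a fraction of
    replacement cost. Their dot product is the mean damage ratio.
    """
    mean_damage_ratio = sum(p * r for p, r in zip(dpm_row, cost_ratios))
    return mean_damage_ratio * replacement_cost

# Illustrative: three damage states (none / moderate / heavy) for one
# structural type at one macroseismic intensity.
loss = expected_loss([0.5, 0.3, 0.2], [0.0, 0.1, 0.5], 1000.0)
```

Summing this quantity over all five structural types and all intensity zones would give the scenario's total probable loss, which the paper then compares with the actual recorded costs.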

  17. Estimating state-transition probabilities for unobservable states using capture-recapture/resighting data

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.

    2002-01-01

    Temporary emigration was identified some time ago as causing potential problems in capture-recapture studies, and in the last five years approaches have been developed for dealing with special cases of this general problem. Temporary emigration can be viewed more generally as involving transitions to and from an unobservable state, and frequently the state itself is one of biological interest (e.g., 'nonbreeder'). Development of models that permit estimation of relevant parameters in the presence of an unobservable state requires either extra information (e.g., as supplied by Pollock's robust design) or the following classes of model constraints: reducing the order of Markovian transition probabilities, imposing a degree of determinism on transition probabilities, removing state specificity of survival probabilities, and imposing temporal constancy of parameters. The objective of the work described in this paper is to investigate estimability of model parameters under a variety of models that include an unobservable state. Beginning with a very general model and no extra information, we used numerical methods to systematically investigate the use of ancillary information and constraints to yield models that are useful for estimation. The result is a catalog of models for which estimation is possible. An example analysis of sea turtle capture-recapture data under two different models showed similar point estimates but increased precision for the model that incorporated ancillary data (the robust design) when compared to the model with deterministic transitions only. This comparison and the results of our numerical investigation of model structures lead to design suggestions for capture-recapture studies in the presence of an unobservable state.

  18. Calculation of the number of Monte Carlo histories for a planetary protection probability of impact estimation

    NASA Astrophysics Data System (ADS)

    Barengoltz, Jack

    2016-07-01

    Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I by a launch vehicle (upper stage) of a protected planet. The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits is, of course, the mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable. Before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean for a Poisson distribution. F. Garwood [F. Garwood (1936), "Fiduciary limits for the Poisson distribution," Biometrika 28, 437-442] published an appropriate method that uses the chi-squared function, actually its inverse (the integral chi-squared function would yield probability α as a function of the mean μ and an actual value n), despite the notation used. This formula for the upper and lower limits of the mean μ with the two-tailed probability 1-α depends on the LOC α and an estimated value of the number of "successes" n. In a MC analysis for planetary protection, only the upper limit is of interest, i.e., the single
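
    Garwood's exact upper limit can be computed without chi-squared tables: the one-sided upper bound for the Poisson mean is the μ at which the Poisson CDF at the observed hit count n falls to α. A minimal sketch (bisection on the CDF, standard library only; the equivalent chi-squared form is noted in the comment):

```python
import math

def poisson_cdf(n, mu):
    """P(X <= n) for X ~ Poisson(mu), by direct summation of the pmf."""
    term = math.exp(-mu)
    total = term
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def poisson_upper_limit(n, alpha=0.05):
    """One-sided upper confidence limit for the Poisson mean given n hits:
    the mu_U solving P(X <= n; mu_U) = alpha. This equals Garwood's bound
    chi2.ppf(1 - alpha, 2*n + 2) / 2."""
    lo, hi = 0.0, 10.0
    while poisson_cdf(n, hi) > alpha:   # expand the bracket if needed
        hi *= 2.0
    for _ in range(200):                # bisection: CDF decreases in mu
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, with zero hits and a 95 % LOC the upper limit is about 3.0, so the number of histories N must satisfy 3.0 / N ≤ the specification value of P_I; guessing a larger maximum hit count raises the required N accordingly.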

  19. Spatial ascariasis risk estimation using socioeconomic variables.

    PubMed

    Valencia, Luis Iván Ortiz; Fortes, Bruno de Paula Menezes Drumond; Medronho, Roberto de Andrade

    2005-12-01

    Frequently, disease incidence is mapped as area data, for example, census tracts, districts or states. Spatial disease incidence can be highly heterogeneous inside these areas. Ascariasis is a highly prevalent disease, which is associated with poor sanitation and hygiene. Geostatistics was applied to model the spatial distribution of Ascariasis risk and socioeconomic risk events in a poor community in Rio de Janeiro, Brazil. Data were gathered from a coproparasitologic and a domiciliary survey of 1550 children aged 1-9 years. Ascariasis risk and socioeconomic risk events were spatially estimated using Indicator Kriging. Cokriging models with a Linear Model of Coregionalization incorporating one socioeconomic variable were implemented. Low maternal schooling (a housewife who attended school for less than four years), non-use of a home water filter, a household density greater than one, and a household income lower than one Brazilian minimum wage increased the risk of Ascariasis. Cokriging improved spatial estimation of Ascariasis risk areas when compared to Indicator Kriging and detected more very-high-risk areas for Ascariasis than the GIS Overlay method. PMID:16506435

  20. A Method for Estimating the Probability of Floating Gate Prompt Charge Loss in a Radiation Environment

    NASA Technical Reports Server (NTRS)

    Edmonds, L. D.

    2016-01-01

    Because advancing technology has been producing smaller structures in electronic circuits, the floating gates in modern flash memories are becoming susceptible to prompt charge loss from ionizing radiation environments found in space. A method for estimating the risk of a charge-loss event is given.

  2. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    EPA Science Inventory

    Carbon tetrachloride (CCl4) has been used extensively within the Department of Energy (DOE) nuclear weapons facilities. Costs associated with cleanup of CCl4 at DOE facilities are driven by current cancer risk estimates which assume CCl4 is a genotoxic carcinogen. However, a grow...

  3. Innovative Meta-Heuristic Approach Application for Parameter Estimation of Probability Distribution Model

    NASA Astrophysics Data System (ADS)

    Lee, T. S.; Yoon, S.; Jeong, C.

    2012-12-01

    The primary purpose of frequency analysis in hydrology is to estimate the magnitude of an event with a given frequency of occurrence. The precision of frequency analysis depends on the selection of an appropriate probability distribution model (PDM) and parameter estimation techniques. A number of PDMs have been developed to describe the probability distribution of the hydrological variables. For each of the developed PDMs, estimated parameters are provided based on alternative estimation techniques, such as the method of moments (MOM), probability weighted moments (PWM), linear function of ranked observations (L-moments), and maximum likelihood (ML). Generally, the results using ML are more reliable than the other methods. However, the ML technique is more laborious than the other methods because an iterative numerical solution, such as the Newton-Raphson method, must be used for the parameter estimation of PDMs. In the meantime, meta-heuristic approaches have been developed to solve various engineering optimization problems (e.g., linear and stochastic, dynamic, nonlinear). These approaches include genetic algorithms, ant colony optimization, simulated annealing, tabu searches, and evolutionary computation methods. Meta-heuristic approaches use a stochastic random search instead of a gradient search so that intricate derivative information is unnecessary. Therefore, the meta-heuristic approaches have been shown to be a useful strategy to solve optimization problems in hydrology. A number of studies focus on using meta-heuristic approaches for estimation of hydrological variables with parameter estimation of PDMs. Applied meta-heuristic approaches offer reliable solutions but use more computation time than derivative-based methods. Therefore, the purpose of this study is to enhance the meta-heuristic approach for the parameter estimation of PDMs by using a recently developed algorithm known as a harmony search (HS). The performance of the HS is compared to the
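
    The harmony search loop described above can be sketched end to end as a derivative-free stochastic minimizer of a negative log-likelihood. Everything below is an illustrative assumption — the Gumbel target, the synthetic sample, the bounds, and the HS settings (HMCR, PAR, bandwidth) are placeholders, not the study's configuration:

```python
import math
import random

def gumbel_nll(params, xs):
    """Negative log-likelihood of a Gumbel(mu, beta) sample."""
    mu, beta = params
    zs = [(x - mu) / beta for x in xs]
    return len(xs) * math.log(beta) + sum(z + math.exp(-z) for z in zs)

def harmony_search(f, bounds, hms=30, hmcr=0.9, par=0.3, iters=3000, seed=1):
    """Minimize f by harmony search: memory consideration with rate hmcr,
    pitch adjustment with rate par and a fixed bandwidth, else random
    selection; a better new harmony replaces the worst in memory."""
    rng = random.Random(seed)
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    fit = [f(h) for h in hm]
    bw = [0.05 * (hi - lo) for lo, hi in bounds]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                  # memory consideration
                x = hm[rng.randrange(hms)][j]
                if rng.random() < par:               # pitch adjustment
                    x += rng.uniform(-1.0, 1.0) * bw[j]
            else:                                    # random selection
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        fn = f(new)
        worst = max(range(hms), key=fit.__getitem__)
        if fn < fit[worst]:
            hm[worst], fit[worst] = new, fn
    return hm[min(range(hms), key=fit.__getitem__)]

# Synthetic Gumbel(mu=10, beta=2) sample via inverse-transform sampling.
data_rng = random.Random(0)
xs = [10.0 - 2.0 * math.log(-math.log(data_rng.random())) for _ in range(400)]
mu_hat, beta_hat = harmony_search(lambda p: gumbel_nll(p, xs),
                                  bounds=[(0.0, 20.0), (0.5, 10.0)])
```

No gradient of the likelihood is ever evaluated, which is the point of the meta-heuristic: the same loop works for any PDM whose likelihood can merely be evaluated.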

  4. Secondary prevention and estimation of fracture risk.

    PubMed

    Mitchell, Paul James; Chem, C

    2013-12-01

    The key questions addressed in this chapter are: • How can individual risk of fracture be best estimated? • What is the best system to prevent a further fracture? • How to implement systems for preventing further fractures? Absolute fracture risk calculators (FRCs) provide a means to estimate an individual's future fracture risk. FRCs are widely available and provide clinicians and patients a platform to discuss the need for intervention to prevent fragility fractures. Despite availability of effective osteoporosis medicines for almost two decades, most patients presenting with new fragility fractures do not receive secondary preventive care. The Fracture Liaison Service (FLS) model has been shown in a number of countries to eliminate the care gap in a clinically and cost-effective manner. Leading international and national organisations have developed comprehensive resources and/or national strategy documents to provide guidance on implementation of FLS in local, regional and national health-care systems. PMID:24836336

  5. An estimate of the probability of capture of a binary star by a supermassive black hole

    NASA Astrophysics Data System (ADS)

    Dremova, G. N.; Dremov, V. V.; Tutukov, A. V.

    2016-08-01

    A simple model for the dynamics of stars located in a sphere with a radius of one-tenth of the central parsec, designed to enable estimation of the probability of capture in the close vicinity (r < 10^-3 pc) of a supermassive black hole (SMBH), is presented. In the case of binary stars, such a capture with a high probability results in the formation of a hyper-velocity star. The population of stars in a sphere of radius <0.1 pc is calculated based on data for the Galactic rotation curve. To simulate the distortion of initially circular orbits of stars, these are subjected to a series of random shock encounters ("kicks"), whose net effect is to "push" these binary systems into the region of potential formation of hyper-velocity stars. The mean crossing time of the border of the close vicinity of the SMBH (r < 10^-3 pc) by the stellar orbit can be used to estimate the probability that a binary system is captured, followed by the possible ejection of a hyper-velocity star.

  6. Empirical comparison of uniform and non-uniform probability sampling for estimating numbers of red-cockaded woodpecker colonies

    USGS Publications Warehouse

    Geissler, P.H.; Moyer, L.M.

    1983-01-01

    Four sampling and estimation methods for estimating the number of red-cockaded woodpecker colonies on National Forests in the Southeast were compared, using samples chosen from simulated populations based on the observed sample. The methods included (1) simple random sampling without replacement using a mean per sampling unit estimator, (2) simple random sampling without replacement with a ratio per pine area estimator, (3) probability proportional to 'size' sampling with replacement, and (4) probability proportional to 'size' without replacement using Murthy's estimator. The survey sample of 274 National Forest compartments (1000 acres each) constituted a superpopulation from which simulated stratum populations were selected with probability inversely proportional to the original probability of selection. Compartments were originally sampled with probabilities proportional to the probabilities that the compartments contained woodpeckers ('size'). These probabilities were estimated with a discriminant analysis based on tree species and tree age. The ratio estimator would have been the best estimator for this survey based on the mean square error. However, if more accurate predictions of woodpecker presence had been available, Murthy's estimator would have been the best. A subroutine to calculate Murthy's estimates is included; it is computationally feasible to analyze up to 10 samples per stratum.
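
    Murthy's without-replacement estimator requires enumerating sample orderings, but the simpler probability-proportional-to-size scheme of method (3) — PPS with replacement — has a compact form, the Hansen-Hurwitz estimator. A sketch with toy draw probabilities (the values are illustrative, not from the woodpecker survey):

```python
def hansen_hurwitz(y, p):
    """Hansen-Hurwitz estimator of a population total from a PPS-with-
    replacement sample: the mean of the draw-expanded values y_i / p_i,
    where p_i is the single-draw selection probability of unit i."""
    n = len(y)
    estimate = sum(yi / pi for yi, pi in zip(y, p)) / n
    # Unbiased variance estimator for the estimated total (needs n >= 2).
    variance = sum((yi / pi - estimate) ** 2
                   for yi, pi in zip(y, p)) / (n * (n - 1))
    return estimate, variance

# Two compartments drawn with probabilities proportional to predicted
# woodpecker presence; y is the observed colony count in each.
est, var = hansen_hurwitz([2.0, 4.0], [0.2, 0.4])
```

When the 'size' measure is proportional to the response — as in this toy example, where y_i / p_i is constant — the estimator has zero variance, which is why accurate presence predictions would have favored the PPS designs in the study.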

  7. Dynamic Estimation of the Probability of Patient Readmission to the ICU using Electronic Medical Records

    PubMed Central

    Caballero, Karla; Akella, Ram

    2015-01-01

    In this paper, we propose a framework to dynamically estimate the probability that a patient is readmitted after he is discharged from the ICU and transferred to a lower level of care. We model this probability as a latent state which evolves over time using Dynamical Linear Models (DLM). We use as input a combination of numerical and text features obtained from the patient Electronic Medical Records (EMRs). We process the text from the EMRs to capture different diseases, symptoms and treatments by means of noun phrases and ontologies. We also capture the global context of each text entry using Statistical Topic Models. We fill in the missing values using an Expectation-Maximization-based (EM) method. Experimental results show that our method outperforms other methods in the literature in terms of AUC, sensitivity and specificity. In addition, we show that the combination of different features (numerical and text) increases the prediction performance of the proposed approach. PMID:26958282
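
    The latent-state evolution can be sketched with the simplest DLM, a local-level model filtered by the standard Kalman recursions. The variances, prior, and logistic link below are illustrative assumptions, not the paper's fitted model (which also folds in text-derived features):

```python
import math

def dlm_filter(ys, q=0.05, r=0.4, m0=0.0, c0=1.0):
    """Kalman filter for a local-level DLM: latent state x_t = x_{t-1} + w_t,
    observation y_t = x_t + v_t, with w_t ~ N(0, q) and v_t ~ N(0, r).
    Returns the filtered means of the latent state."""
    m, c = m0, c0
    means = []
    for y in ys:
        c_pred = c + q                  # predict: variance inflates by q
        k = c_pred / (c_pred + r)       # Kalman gain
        m = m + k * (y - m)             # update with the innovation y - m
        c = (1.0 - k) * c_pred          # posterior variance
        means.append(m)
    return means

def readmission_probability(x):
    """Map the latent risk state to a probability via a logistic link."""
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative per-day risk scores extracted from a patient's EMR.
states = dlm_filter([0.2, 0.5, 0.9, 1.1, 1.0, 1.2])
p_readmit = readmission_probability(states[-1])
```

Each new EMR entry updates the filtered state in O(1), which is what makes the estimate "dynamic": the probability can be re-read at any point during the stay.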

  8. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. They overestimated the seasonal mean concentration, however, and did not simulate all of the peak concentrations. This issue could be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
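
    The maximum-potential step rests on the two-parameter Weibull density. A minimal sketch (the shape and scale values are illustrative, not the fitted pollen parameters):

```python
import math

def weibull_pdf(x, shape, scale):
    """Two-parameter Weibull probability density f(x; k, lambda),
    defined for x >= 0, shape k > 0, scale lambda > 0."""
    z = x / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-(z ** shape))

# Illustrative: evaluate the density that would shape the seasonal
# envelope of maximum potential pollen concentration.
density = weibull_pdf(1.0, 2.0, 1.0)
```

In a scheme like the one described, this density (fitted to the pollen season) supplies the day's biological ceiling, and the weather-driven regression then scales the estimate within that ceiling.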

  9. Estimating survival and breeding probability for pond-breeding amphibians: a modified robust design

    USGS Publications Warehouse

    Bailey, L.L.; Kendall, W.L.; Church, D.R.; Wilbur, H.M.

    2004-01-01

    Many studies of pond-breeding amphibians involve sampling individuals during migration to and from breeding habitats. Interpreting population processes and dynamics from these studies is difficult because (1) only a proportion of the population is observable each season, while an unknown proportion remains unobservable (e.g., non-breeding adults) and (2) not all observable animals are captured. Imperfect capture probability can be easily accommodated in capture-recapture models, but temporary transitions between observable and unobservable states, often referred to as temporary emigration, is known to cause problems in both open- and closed-population models. We develop a multistate mark-recapture (MSMR) model, using an open-robust design that permits one entry and one exit from the study area per season. Our method extends previous temporary emigration models (MSMR with an unobservable state) in two ways. First, we relax the assumption of demographic closure (no mortality) between consecutive (secondary) samples, allowing estimation of within-pond survival. Also, we add the flexibility to express survival probability of unobservable individuals (e.g., "non-breeders") as a function of the survival probability of observable animals while in the same, terrestrial habitat. This allows for potentially different annual survival probabilities for observable and unobservable animals. We apply our model to a relictual population of eastern tiger salamanders (Ambystoma tigrinum tigrinum). Despite small sample sizes, demographic parameters were estimated with reasonable precision. We tested several a priori biological hypotheses and found evidence for seasonal differences in pond survival. Our methods could be applied to a variety of pond-breeding species and other taxa where individuals are captured entering or exiting a common area (e.g., spawning or roosting area, hibernacula).

  10. DROPOUT AND RETENTION RATE METHODOLOGY USED TO ESTIMATE FIRST-STAGE ELEMENTS OF THE TRANSITION PROBABILITY MATRICES FOR DYNAMOD II.

    ERIC Educational Resources Information Center

    HUDMAN, JOHN T.; ZABROWSKI, EDWARD K.

    EQUATIONS FOR SYSTEM INTAKE, DROPOUT, AND RETENTION RATE CALCULATIONS ARE DERIVED FOR ELEMENTARY SCHOOLS, SECONDARY SCHOOLS, AND COLLEGES. THE PROCEDURES DESCRIBED WERE FOLLOWED IN DEVELOPING ESTIMATES OF SELECTED ELEMENTS OF THE TRANSITION PROBABILITY MATRICES USED IN DYNAMOD II. THE PROBABILITY MATRIX CELLS ESTIMATED BY THE PROCEDURES DESCRIBED…

  11. Probability density function estimation for characterizing hourly variability of ionospheric total electron content

    NASA Astrophysics Data System (ADS)

    Turel, N.; Arikan, F.

    2010-12-01

    Ionospheric channel characterization is an important task for both HF and satellite communications. The inherent space-time variability of the ionosphere can be observed through total electron content (TEC) that can be obtained using GPS receivers. In this study, within-the-hour variability of the ionosphere over high-latitude, midlatitude, and equatorial regions is investigated by estimating a parametric model for the probability density function (PDF) of GPS-TEC. PDF is a useful tool in defining the statistical structure of communication channels. For this study, data spanning half a solar cycle were collected for 18 GPS stations. Histograms of TEC, corresponding to experimental probability distributions, are used to estimate the parameters of five different PDFs. The best fitting distribution to the TEC data is obtained using the maximum likelihood ratio of the estimated parametric distributions. It is observed that all of the midlatitude stations and most of the high-latitude and equatorial stations are distributed as lognormal. A representative distribution can easily be obtained for stations that are located in midlatitude using solar zenith normalization. The stations located in very high latitudes or in equatorial regions cannot be described using only one PDF distribution. Due to significant seasonal variability, different distributions are required for summer and winter.
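
    The best-fit selection by maximum likelihood can be sketched for two of the candidate families. Both have closed-form ML estimates, and the lognormal likelihood is the normal likelihood of the logs plus a Jacobian term; the synthetic sample below stands in for GPS-TEC data (an illustrative assumption):

```python
import math
import random
import statistics

def normal_loglik(xs, mu, sd):
    """Gaussian log-likelihood of a sample at parameters (mu, sd)."""
    n = len(xs)
    return (-0.5 * n * math.log(2.0 * math.pi * sd * sd)
            - sum((x - mu) ** 2 for x in xs) / (2.0 * sd * sd))

def best_fit(xs):
    """ML-fit normal and lognormal and return the higher-likelihood family."""
    ll_norm = normal_loglik(xs, statistics.fmean(xs), statistics.pstdev(xs))
    logs = [math.log(x) for x in xs]
    # Lognormal density = normal density of ln x, times the 1/x Jacobian.
    ll_lnorm = (normal_loglik(logs, statistics.fmean(logs),
                              statistics.pstdev(logs)) - sum(logs))
    return ("lognormal", ll_lnorm) if ll_lnorm > ll_norm else ("normal", ll_norm)

rng = random.Random(42)
tec_like = [math.exp(rng.gauss(2.0, 0.6)) for _ in range(2000)]  # TECu-like
family, ll = best_fit(tec_like)
```

Extending the comparison to five families, as in the study, just means adding the corresponding fitted log-likelihoods to the ranking.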

  12. Estimating superpopulation size and annual probability of breeding for pond-breeding salamanders

    USGS Publications Warehouse

    Kinkead, K.E.; Otis, D.L.

    2007-01-01

    It has long been accepted that amphibians can skip breeding in any given year, and environmental conditions act as a cue for breeding. In this paper, we quantify temporary emigration or nonbreeding probability for mole and spotted salamanders (Ambystoma talpoideum and A. maculatum). We estimated that 70% of mole salamanders may skip breeding during an average rainfall year and 90% may skip during a drought year. Spotted salamanders may be more likely to breed, with only 17% avoiding the breeding pond during an average rainfall year. We illustrate how superpopulations can be estimated using temporary emigration probability estimates. The superpopulation is the total number of salamanders associated with a given breeding pond. Although most salamanders stay within a certain distance of a breeding pond for the majority of their life spans, it is difficult to determine true overall population sizes for a given site if animals are only captured during a brief time frame each year with some animals unavailable for capture at any time during a given year. © 2007 by The Herpetologists' League, Inc.
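
    The superpopulation arithmetic is direct: if a fraction of adults skips breeding (temporary emigration) and only breeders are detectable, the breeding count scales up by one over the breeding probability. The breeder count below is an illustrative assumption; the 70 % skip rate is the paper's mole salamander estimate for an average rainfall year:

```python
def superpopulation(n_breeders, p_skip):
    """Total pond-associated population when only breeders are observable
    and a fraction p_skip of adults skip breeding in a given year."""
    if not 0.0 <= p_skip < 1.0:
        raise ValueError("p_skip must be in [0, 1)")
    return n_breeders / (1.0 - p_skip)

# Illustrative: 300 mole salamanders breed in a year when ~70% skip.
total = superpopulation(300, 0.70)
```

The same 300 breeders in a drought year (90 % skipping) would imply a superpopulation of 3,000, which is why ignoring temporary emigration badly understates population size.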

  13. A logistic regression equation for estimating the probability of a stream in Vermont having intermittent flow

    USGS Publications Warehouse

    Olson, Scott A.; Brouillette, Michael C.

    2006-01-01

    A logistic regression equation was developed for estimating the probability of a stream flowing intermittently at unregulated, rural stream sites in Vermont. These determinations can be used for a wide variety of regulatory and planning efforts at the Federal, State, regional, county and town levels, including such applications as assessing fish and wildlife habitats, wetlands classifications, recreational opportunities, water-supply potential, waste-assimilation capacities, and sediment transport. The equation will be used to create a derived product for the Vermont Hydrography Dataset having the streamflow characteristic of 'intermittent' or 'perennial.' The Vermont Hydrography Dataset is Vermont's implementation of the National Hydrography Dataset and was created at a scale of 1:5,000 based on statewide digital orthophotos. The equation was developed by relating field-verified perennial or intermittent status of a stream site during normal summer low-streamflow conditions in the summer of 2005 to selected basin characteristics of naturally flowing streams in Vermont. The database used to develop the equation included 682 stream sites with drainage areas ranging from 0.05 to 5.0 square miles. When the 682 sites were observed, 126 were intermittent (had no flow at the time of the observation) and 556 were perennial (had flowing water at the time of the observation). The results of the logistic regression analysis indicate that the probability of a stream having intermittent flow in Vermont is a function of drainage area, elevation of the site, the ratio of basin relief to basin perimeter, and the areal percentage of well- and moderately well-drained soils in the basin. Using a probability cutpoint (a lower probability indicates the site has perennial flow and a higher probability indicates the site has intermittent flow) of 0.5, the logistic regression equation correctly predicted the perennial or intermittent status of 116 test sites 85 percent of the time.
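
    The form of such an equation is standard logistic regression with a classification cutpoint. The intercept and coefficients below are hypothetical placeholders for the four predictors named in the abstract, not the published Vermont equation:

```python
import math

def intermittent_probability(intercept, coefs, predictors):
    """P(intermittent) from a logistic regression: 1 / (1 + exp(-z)),
    where z = intercept + sum of coefficient * basin characteristic."""
    z = intercept + sum(b * v for b, v in zip(coefs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

def classify(p, cutpoint=0.5):
    """At or above the cutpoint the site is predicted intermittent,
    below it perennial."""
    return "intermittent" if p >= cutpoint else "perennial"

# Hypothetical coefficients for (drainage area mi^2, elevation ft,
# relief/perimeter, % well-drained soils) -- placeholders only.
p = intermittent_probability(-1.2, [-0.8, 0.002, 3.0, 0.01],
                             [0.5, 300.0, 0.1, 40.0])
```

Shifting the cutpoint trades false "intermittent" calls against false "perennial" calls; 0.5 is the value at which the study reports 85 percent accuracy.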

  14. Estimating the probability of emigration from individual-specific data: the case of Italy in the early twentieth century.

    PubMed

    Mondschean, T H

    1986-01-01

    "This article develops a method for estimating the probability of emigration conditional on the observed characteristics of individuals. In addition, it is shown how to calculate the mean, standard error, and confidence intervals of the conditional probability of emigration given a random sample of emigrants. The technique is illustrated by providing statistically consistent estimates of the probability an Italian would emigrate to the United States in 1901 and 1911, conditional on personal attributes." PMID:12340643

  15. Estimation of Genotype Distributions and Posterior Genotype Probabilities for β-Mannosidosis in Salers Cattle

    PubMed Central

    Taylor, J. F.; Abbitt, B.; Walter, J. P.; Davis, S. K.; Jaques, J. T.; Ochoa, R. F.

    1993-01-01

    β-Mannosidosis is a lethal lysosomal storage disease inherited as an autosomal recessive in man, cattle and goats. Laboratory assay data of plasma β-mannosidase activity represent a mixture of homozygous normal and carrier genotype distributions in a proportion determined by genotype frequency. A maximum likelihood approach employing data transformations for each genotype distribution and assuming a diallelic model of inheritance is described. Estimates of the transformation and genotype distribution parameters, gene frequency, genotype fitness and carrier probability were obtained simultaneously from a sample of 2,812 observations on U.S. purebred Salers cattle with enzyme activity, age, gender, month of pregnancy, month of testing, and parents identified. Transformations to normality were not required, estimated gene and carrier genotype frequencies of 0.074 and 0.148 were high, and the estimated relative fitness of heterozygotes was 1.36. The apparent overdominance in fitness may be due to a nonrandom sampling of progeny genotypes within families. The mean of plasma enzyme activity was higher for males than females, higher in winter months, lower in summer months and decreased with increased age. Estimates of carrier probabilities indicate that the test is most effective when animals are sampled as calves, although effectiveness of the plasma assay was less for males than females. Test effectiveness was enhanced through averaging repeated assays of enzyme activity on each animal. Our approach contributes to medical diagnostics in several ways. Rather than assume underlying normality for the distributions comprising the mixture, we estimate transformations to normality for each genotype distribution simultaneously with all other model parameters. This process also excludes potential biases due to data preadjustment for systematic effects. We also provide a method for partitioning phenotypic variation within each genotypic distribution which allows an assessment
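
    The carrier-probability step is Bayes' rule on a two-component normal mixture. In the sketch below, only the carrier frequency (0.148) comes from the abstract; the activity value, component means, and standard deviations are illustrative assumptions:

```python
import math

def normal_pdf(x, mu, sd):
    """Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def carrier_posterior(x, q, mu_carrier, sd_carrier, mu_normal, sd_normal):
    """P(carrier | enzyme activity x) for a mixture of the carrier and
    homozygous-normal activity distributions, with carrier frequency q
    as the prior (Bayes' rule)."""
    fc = q * normal_pdf(x, mu_carrier, sd_carrier)
    fn = (1.0 - q) * normal_pdf(x, mu_normal, sd_normal)
    return fc / (fc + fn)

# Illustrative units: carriers average lower plasma activity than normals.
posterior = carrier_posterior(4.0, 0.148, 4.0, 1.0, 6.0, 1.0)
```

Averaging m repeated assays shrinks each component's standard deviation by roughly sqrt(m), separating the two densities and sharpening the posterior — the mechanism behind the reported gain from repeated testing.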

  16. How to estimate your tolerance for risk

    SciTech Connect

    Mackay, J.A.

    1996-12-31

    Risk tolerance is used to calculate the Risk Adjusted Value (RAV) of a proposed investment. The RAV incorporates both the expected value and risk attitude for a particular investment, taking into consideration your concern for catastrophic financial loss, as well as chance of success, cost and value if successful. Uncertainty can be incorporated into all of the above variables. Often a project is more valuable to a corporation if a partial working interest is taken rather than the entire working interest. The RAV can be used to calculate the optimum working interest and the value of that diversification. To estimate the Apparent Risk Tolerance (ART) of an individual, division or corporation several methods can be employed: (1) ART can be calculated from the working interest selected in prior investment decisions. (2) ART can be estimated from a selection of working interests by the decision maker in a proposed portfolio of projects. (3) ART can be approximated from data released to the Security and Exchange Commission (SEC) in the annual 10K supplements (for both your company and possible partners). (4) ART can be assigned based on corporate size, budget, or activity. Examples are provided for the various methods to identify risk tolerance and apply it in making optimum working interest calculations for individual projects and portfolios.
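
    A common formalization of RAV under an exponential utility is Cozzolino's formula, which matches the abstract's ingredients (chance of success, cost, value if successful, risk tolerance). The prospect numbers below are illustrative assumptions, and the optimal working interest is found by a simple grid search:

```python
import math

def rav(w, p, npv, cost, r):
    """Cozzolino risk-adjusted value of taking working interest w in a
    prospect with success probability p, success value npv, dry-hole
    cost, and risk tolerance r (all money terms in the same units)."""
    return -r * math.log(p * math.exp(-w * npv / r)
                         + (1.0 - p) * math.exp(w * cost / r))

def optimal_working_interest(p, npv, cost, r, steps=1000):
    """Grid-search the working interest in [0, 1] that maximizes RAV."""
    best = max(range(steps + 1),
               key=lambda i: rav(i / steps, p, npv, cost, r))
    return best / steps

# Illustrative prospect: 30% chance of success, value 100, dry-hole
# cost 20, apparent risk tolerance 50.
w_star = optimal_working_interest(0.3, 100.0, 20.0, 50.0)
```

For these numbers the RAV is maximized at roughly a one-third working interest: taking a partial interest in the risky prospect is worth more to the firm than the full interest, which is the diversification effect the abstract describes.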

  17. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Due to the hazardous material that can be released during a launch accident, the potential health risk of an accident must be quantified, so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Due to the complexity of modeling the full RPS response deterministically over the dynamic variables, the evaluation is performed in a stochastic manner with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and in human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.

  18. Local neighborhood transition probability estimation and its use in contextual classification

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    The problem of incorporating spatial or contextual information into classifications is considered. A simple model that describes the spatial dependencies between the neighboring pixels with a single parameter, Theta, is presented. Expressions are derived for updating the a posteriori probabilities of the states of nature of the pattern under consideration using information from the neighboring patterns, both for spatially uniform context and for Markov dependencies in terms of Theta. Techniques for obtaining the optimal value of the parameter Theta as a maximum likelihood estimate from the local neighborhood of the pattern under consideration are developed.
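
    The posterior-updating idea can be sketched for spatially uniform context with a single neighbor: with probability Theta the neighbor shares the pixel's class, otherwise the remaining classes are taken as equally likely. This one-neighbor form is an illustrative simplification of the model in the abstract:

```python
def contextual_update(posterior, neighbor_class, theta):
    """Reweight class posteriors by one neighbor's label under a
    one-parameter spatial-dependence model, then renormalize (Bayes)."""
    k = len(posterior)
    weights = [theta if c == neighbor_class else (1.0 - theta) / (k - 1)
               for c in range(k)]
    joint = [p * w for p, w in zip(posterior, weights)]
    total = sum(joint)
    return [j / total for j in joint]

# Three classes, uninformative pixel posterior, neighbor labeled class 0,
# strong spatial dependence (theta = 0.8).
updated = contextual_update([1 / 3, 1 / 3, 1 / 3], neighbor_class=0, theta=0.8)
```

With theta = 1/k the weights are uniform and the posterior is unchanged, so theta directly measures how much the neighborhood is allowed to pull the classification, which is why it is worth estimating by maximum likelihood from the local neighborhood.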

  19. On the method of logarithmic cumulants for parametric probability density function estimation.

    PubMed

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible. PMID:23799694
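The core of MoLC is replacing sample moments with cumulants of the log-transformed sample. A minimal sketch of the first two sample log-cumulants, checked on lognormal data, where they coincide with the mean and variance of the underlying normal:

```python
import math
import random

def log_cumulants(sample):
    """First two sample log-cumulants: k1 = mean(ln x), k2 = var(ln x)."""
    logs = [math.log(x) for x in sample]
    n = len(logs)
    k1 = sum(logs) / n
    k2 = sum((v - k1) ** 2 for v in logs) / n
    return k1, k2

# Sanity check on lognormal data: the log-cumulants recover the underlying
# normal's mean and variance (here 1.0 and 0.25)
random.seed(0)
sample = [math.exp(random.gauss(1.0, 0.5)) for _ in range(200_000)]
k1, k2 = log_cumulants(sample)
```

For a specific family such as the generalized gamma, the MoLC equations relate these sample log-cumulants to polygamma functions of the parameters; solving that system is the family-specific part the paper analyzes.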

  20. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    NASA Technical Reports Server (NTRS)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization, and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean, resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded by the mass of the head, producing an extension-flexion motion that deforms the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values, producing an injury transfer function (ITF). An injury event scenario (IES) was constructed in which crewmember 1, transferring large-volume equipment through a primary or standard translation path, impacts stationary crewmember 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainties in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a

  1. Effects of river reach discretization on the estimation of the probability of levee failure owing to piping

    NASA Astrophysics Data System (ADS)

    Mazzoleni, Maurizio; Brandimarte, Luigia; Barontini, Stefano; Ranzi, Roberto

    2014-05-01

    Over the centuries many societies have preferred to settle in floodplain areas to take advantage of the favorable environmental conditions. Due to changing hydro-meteorological conditions, levee systems along rivers have over time been raised to protect urbanized areas and reduce the impact of floods. As expressed by the so-called "levee paradox", many societies tend to trust these levee protection systems because of an induced sense of safety and, as a consequence, invest even more in urban development in levee-protected flood-prone areas. As a result, and given the growing world population, the number of people living in floodplains is increasing. However, human settlements in floodplains are not completely safe and have been continuously endangered by the risk of flooding. In fact, levee-system failures during flood events have produced some of the most devastating disasters of the last two centuries because of the exposure of the developed flood-prone areas to risk. In those cases, property damage is certain, but loss of life can vary dramatically with the extent of the inundation area, the size of the population at risk, and the amount of warning time available. The aim of this study is to propose an innovative methodology to estimate the reliability of a general river levee system against piping, considering different sources of uncertainty, and to analyze how different discretizations of the river reach into sub-reaches influence the evaluated probability of failure. The reliability analysis, expressed in terms of fragility curves, was performed by evaluating the probability of failure conditioned on a given hydraulic load for a certain levee failure mechanism, using Monte Carlo and First Order Reliability Methods. Knowing the fragility curve for each discrete levee reach, different fragility indexes were introduced. Using this information it was then possible to classify the river into sub
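A fragility curve of the kind described, i.e. the probability of piping failure conditioned on the hydraulic load, can be sketched by Monte Carlo with a hypothetical limit state: failure when the water head exceeds a random critical head for the levee cross-section (all numbers illustrative, not from the Göta Älv study):

```python
import random

def fragility(water_level, n=50_000, seed=42):
    """P(piping failure | hydraulic load): failure occurs when the water
    head exceeds a random critical head, here c ~ N(4.0 m, 0.8 m) for a
    hypothetical levee cross-section."""
    random.seed(seed)
    failures = sum(water_level > random.gauss(4.0, 0.8) for _ in range(n))
    return failures / n

# Fragility curve: probability of failure at increasing hydraulic loads
curve = [(h, fragility(h)) for h in (2.0, 3.0, 4.0, 5.0, 6.0)]
```

In a real analysis the critical head would itself come from a piping model (e.g. a Bligh- or Sellmeijer-type rule) with random soil parameters, and the curve would be computed per discretized sub-reach.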

  2. The estimation of neurotransmitter release probability in feedforward neuronal network based on adaptive synchronization

    NASA Astrophysics Data System (ADS)

    Xue, Ming; Wang, Jiang; Jia, Chenhui; Yu, Haitao; Deng, Bin; Wei, Xile; Che, Yanqiu

    2013-03-01

    In this paper, we propose a new approach to estimate unknown parameters and the topology of a neuronal network based on an adaptive synchronization control scheme. A virtual neuronal network is constructed as an observer to track the membrane potentials of the corresponding neurons in the original network. When they achieve synchronization, the unknown parameters and topology of the original network are obtained. The method is applied to estimate the real-time status of the connections in the feedforward network, and the neurotransmitter release probability of unreliable synapses is obtained by statistical computation. Numerical simulations are also performed to demonstrate the effectiveness of the proposed adaptive controller. The obtained results may have important implications for system identification in neural science.

  3. ANNz2 - Photometric redshift and probability density function estimation using machine-learning

    NASA Astrophysics Data System (ADS)

    Sadeh, Iftach

    2014-05-01

    Large photometric galaxy surveys allow the study of questions at the forefront of science, such as the nature of dark energy. The success of such surveys depends on the ability to measure the photometric redshifts of objects (photo-zs), based on limited spectral data. A new major version of the public photo-z estimation software, ANNz, is presented here. The new code incorporates several machine-learning methods, such as artificial neural networks and boosted decision/regression trees, which are all used in concert. The objective of the algorithm is to dynamically optimize the performance of the photo-z estimation, and to properly derive the associated uncertainties. In addition to single-value solutions, the new code also generates full probability density functions in two independent ways.

  4. A method for estimating the probability of lightning causing a methane ignition in an underground mine

    SciTech Connect

    Sacks, H.K.; Novak, T.

    2008-03-15

    During the past decade, several methane/air explosions in abandoned or sealed areas of underground coal mines have been attributed to lightning. Previously published work by the authors showed, through computer simulations, that currents from lightning could propagate down steel-cased boreholes and ignite explosive methane/air mixtures. The presented work expands on the model and describes a methodology based on IEEE Standard 1410-2004 to estimate the probability of an ignition. The methodology provides a means to better estimate the likelihood that an ignition could occur underground and, more importantly, allows the calculation of what-if scenarios to investigate the effectiveness of engineering controls to reduce the hazard. The computer software used for calculating fields and potentials is also verified by comparing computed results with an independently developed theoretical model of electromagnetic field propagation through a conductive medium.

  5. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of vulnerable areas in order to predict river changes and assist stream management/restoration. One approach to predicting areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for evaluating the performance of the statistical tool. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results, as it correctly reproduces the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate, and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in press, 2016.
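A minimal sketch of the logistic-regression form of such a tool, with a single hypothetical geomorphological covariate and synthetic presence/absence data (plain gradient descent, not the authors' implementation):

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=20_000):
    """Single-feature logistic regression fitted by batch gradient descent."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Hypothetical normalized bank-slope covariate vs. observed erosion (1 = present)
xs = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
p_steep = 1.0 / (1.0 + math.exp(-(w * 0.9 + b)))   # high erosion probability
p_gentle = 1.0 / (1.0 + math.exp(-(w * 0.1 + b)))  # low erosion probability
```

The locally weighted variant would refit these coefficients around each prediction point, weighting training sections by their distance to it.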

  6. Toward 3D-guided prostate biopsy target optimization: an estimation of tumor sampling probabilities

    NASA Astrophysics Data System (ADS)

    Martin, Peter R.; Cool, Derek W.; Romagnoli, Cesare; Fenster, Aaron; Ward, Aaron D.

    2014-03-01

    Magnetic resonance imaging (MRI)-targeted, 3D transrectal ultrasound (TRUS)-guided "fusion" prostate biopsy aims to reduce the ~23% false negative rate of clinical 2D TRUS-guided sextant biopsy. Although it has been reported to double the positive yield, MRI-targeted biopsy still yields false negatives. Therefore, we propose optimization of biopsy targeting to meet the clinician's desired tumor sampling probability, optimizing needle targets within each tumor and accounting for uncertainties due to guidance system errors, image registration errors, and irregular tumor shapes. We obtained multiparametric MRI and 3D TRUS images from 49 patients. A radiologist and radiology resident contoured 81 suspicious regions, yielding 3D surfaces that were registered to 3D TRUS. We estimated the probability, P, of obtaining a tumor sample with a single biopsy. Given an RMS needle delivery error of 3.5 mm for a contemporary fusion biopsy system, P >= 95% for 21 out of 81 tumors when the point of optimal sampling probability was targeted. Therefore, more than one biopsy core must be taken from 74% of the tumors to achieve P >= 95% for a biopsy system with an error of 3.5 mm. Our experiments indicated that the effect of error along the needle axis on the percentage of core involvement (and thus the measured tumor burden) was mitigated by the 18 mm core length.
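The effect of guidance-system error on single-core sampling probability can be illustrated with a toy calculation: for a circular tumour cross-section and isotropic Gaussian needle error with a given RMS, the hit probability follows directly by Monte Carlo (the 2D geometry here is a simplification of the paper's 3D analysis):

```python
import math
import random

def sampling_probability(tumor_radius_mm, rms_error_mm, n=200_000, seed=7):
    """Probability that a needle aimed at the centre of a circular tumour
    cross-section hits it, under isotropic Gaussian targeting error with
    the given total RMS (toy 2D geometry)."""
    random.seed(seed)
    s = rms_error_mm / math.sqrt(2)  # per-axis sigma giving the stated radial RMS
    hits = 0
    for _ in range(n):
        dx, dy = random.gauss(0.0, s), random.gauss(0.0, s)
        if dx * dx + dy * dy <= tumor_radius_mm ** 2:
            hits += 1
    return hits / n

# 5 mm tumour radius, 3.5 mm RMS error; analytically 1 - exp(-25/12.25) ≈ 0.87
p = sampling_probability(tumor_radius_mm=5.0, rms_error_mm=3.5)
```

Even for this fairly large tumour, a single core falls short of P >= 95%, which is consistent with the paper's finding that most tumours require multiple cores.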

  7. Estimating the probability of an extinction or major outbreak for an environmentally transmitted infectious disease.

    PubMed

    Lahodny, G E; Gautam, R; Ivanek, R

    2015-01-01

    Indirect transmission through the environment, pathogen shedding by infectious hosts, replication of free-living pathogens within the environment, and environmental decontamination are suspected to play important roles in the spread and control of environmentally transmitted infectious diseases. To account for these factors, the classic Susceptible-Infectious-Recovered-Susceptible epidemic model is modified to include a compartment representing the amount of free-living pathogen within the environment. The model accounts for host demography, direct and indirect transmission, replication of free-living pathogens in the environment, and removal of free-living pathogens by natural death or environmental decontamination. Based on the assumptions of the deterministic model, a continuous-time Markov chain model is developed. An estimate for the probability of disease extinction or a major outbreak is obtained by approximating the Markov chain with a multitype branching process. Numerical simulations illustrate important differences between the deterministic and stochastic counterparts, relevant for outbreak prevention, that depend on indirect transmission, pathogen shedding by infectious hosts, replication of free-living pathogens, and environmental decontamination. The probability of a major outbreak is computed for salmonellosis in a herd of dairy cattle as well as cholera in a human population. An explicit expression for the probability of disease extinction or a major outbreak in terms of the model parameters is obtained for systems with no direct transmission or replication of free-living pathogens. PMID:25198247
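For a single-type branching process, the extinction probability underlying such approximations is the smallest fixed point of the offspring probability generating function, and one minus it estimates the major-outbreak probability. A sketch, using a Poisson offspring law as a stand-in for the paper's multitype setting:

```python
import math

def extinction_probability(pgf, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of the offspring probability generating
    function, found by iterating q <- f(q) from q = 0 (standard
    single-type branching-process result)."""
    q = 0.0
    for _ in range(max_iter):
        q_new = pgf(q)
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q_new

# Poisson(2) offspring law: pgf f(s) = exp(2 (s - 1)); q solves q = f(q)
q = extinction_probability(lambda s: math.exp(2.0 * (s - 1.0)))
major_outbreak = 1.0 - q  # ≈ 0.797
```

In the multitype case used in the paper, the same fixed-point computation runs over a vector of pgfs, one per type (e.g. infectious hosts and environmental pathogen).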

  8. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    SciTech Connect

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper the authors introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P({sup 236}U(d,pf))/P({sup 238}U(d,pf)), which serves as a surrogate for the known cross-section ratio of {sup 236}U(n,f)/{sup 238}U(n,f). In addition, the P({sup 238}U(d,d{prime}f))/P({sup 236}U(d,d{prime}f)) ratio, a surrogate for the {sup 237}U(n,f)/{sup 235}U(n,f) cross-section ratio, was measured for the first time over an unprecedented range of excitation energies.

  9. Inconsistent probability estimates of a hypothesis: the role of contrasting support.

    PubMed

    Bonini, Nicolao; Gonzalez, Michel

    2005-01-01

    This paper studies consistency in the judged probability of a target hypothesis in lists of mutually exclusive nonexhaustive hypotheses. Specifically, it controls the role played by the support of displayed competing hypotheses and the relatedness between the target hypothesis and its alternatives. Three experiments are reported. In all experiments, groups of people were presented with a list of mutually exclusive nonexhaustive causes of a person's death. In the first two experiments, they were asked to judge the probability of each cause as that of the person's decease. In the third experiment, people were asked for a frequency estimation task. Target causes were presented in all lists. Several other alternative causes to the target ones differed across the lists. Findings show that the judged probability/frequency of a target cause changes as a function of the support of the displayed competing causes. Specifically, it is higher when its competing displayed causes have low rather than high support. Findings are consistent with the contrastive support hypothesis within the support theory. PMID:15779531

  10. Estimates of density, detection probability, and factors influencing detection of burrowing owls in the Mojave Desert

    USGS Publications Warehouse

    Crowe, D.E.; Longshore, K.M.

    2010-01-01

    We estimated relative abundance and density of Western Burrowing Owls (Athene cunicularia hypugaea) at two sites in the Mojave Desert (2003-2004). We made modifications to previously established Burrowing Owl survey techniques for use in desert shrublands and evaluated several factors that might influence the detection of owls. We tested the effectiveness of the call-broadcast technique for surveying this species, the efficiency of this technique at early and late breeding stages, and the effectiveness of various numbers of vocalization intervals during broadcasting sessions. Only 1 (3%) of 31 initial (new) owl responses was detected during passive-listening sessions. We found that surveying early in the nesting season was more likely to produce new owl detections compared to surveying later in the nesting season. New owls detected during each of the three vocalization intervals (each consisting of 30 sec of vocalizations followed by 30 sec of silence) of our broadcasting session were similar (37%, 40%, and 23%; n = 30). We used a combination of detection trials (sighting probability) and the double-observer method to estimate the components of detection probability, i.e., availability and perception. Availability for all sites and years, as determined by detection trials, ranged from 46.1-58.2%. Relative abundance, measured as frequency of occurrence and defined as the proportion of surveys with at least one owl, ranged from 19.2-32.0% for both sites and years. Density at our eastern Mojave Desert site was estimated at 0.09 ± 0.01 (SE) owl territories/km2 and 0.16 ± 0.02 (SE) owl territories/km2 during 2003 and 2004, respectively. At our southern Mojave Desert site, density estimates were 0.09 ± 0.02 (SE) owl territories/km2 and 0.08 ± 0.02 (SE) owl territories/km2 during 2004 and 2005, respectively. © 2010 The Raptor Research Foundation, Inc.

  11. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load

  12. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    USGS Publications Warehouse

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    In May and June 2012, the Whitewater-Baldy Fire burned approximately 1,200 square kilometers (300,000 acres) of the Gila National Forest, in southwestern New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 128 basins burned by the Whitewater-Baldy Fire. A pair of empirical hazard-assessment models developed by using data from recently burned basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and for selected drainage basins within the burned area. The models incorporate measures of areal burned extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. In response to the 2-year-recurrence, 30-minute-duration rainfall, modeling indicated that four basins have high probabilities of debris-flow occurrence (greater than or equal to 80 percent). For the 10-year-recurrence, 30-minute-duration rainfall, an additional 14 basins are included, and for the 25-year-recurrence, 30-minute-duration rainfall, an additional eight basins, 20 percent of the total, have high probabilities of debris-flow occurrence. In addition, probability analysis along the stream segments can identify specific reaches of greatest concern for debris flows within a basin. Basins with a high probability of debris-flow occurrence were concentrated in the west and central parts of the burned area, including tributaries to Whitewater Creek, Mineral Creek, and Willow Creek. Estimated debris-flow volumes ranged from about 3,000-4,000 cubic meters (m3) to greater than 500,000 m3 for all design storms modeled. 
Drainage basins with estimated volumes greater than 500,000 m3 included tributaries to Whitewater Creek, Willow

  13. Fast method for the estimation of impact probability of near-Earth objects

    NASA Astrophysics Data System (ADS)

    Vavilov, D.; Medvedev, Y.

    2014-07-01

    We propose a method to estimate the probability of collision of a celestial body with the Earth (or another major planet) at a given time moment t. Let there be a set of observations of a small body. At the initial time moment T_0, a nominal orbit is determined by the least squares method. Our method uses a special coordinate system; it is assumed that observation errors map linearly onto errors in coordinates and velocities and that the distribution of observation errors is normal. The frame is defined as follows. First, we fix an osculating ellipse of the body's orbit at the time moment t. The mean anomaly M on this osculating ellipse is one coordinate of the introduced system. The spatial coordinate ξ is perpendicular to the plane containing the fixed ellipse; η is the second spatial coordinate, chosen so that the axes satisfy the right-hand rule. The origin of ξ and η corresponds to the point on the ellipse given by M. The velocity components are the corresponding time derivatives dξ/dt, dη/dt, dM/dt. To calculate the probability of collision, we numerically integrate the equations of the asteroid's motion, taking perturbations into account, and calculate a normal matrix N. The probability is determined as P = |det N|^{1/2} / (2π)^3 ∫_Ω exp(−(1/2) x^T N x) dx, where x denotes the six-dimensional vector of coordinates and velocities, Ω is the region occupied by the Earth, and the superscript T denotes the matrix transpose operation. To take the gravitational attraction of the Earth into account, the radius of the Earth is increased by a factor √(1 + v_s²/v_rel²), where v_s is the escape velocity and v_rel is the small body's velocity relative to the Earth. The six-dimensional integral is integrated analytically over the velocity components on (−∞, +∞), leaving a 3×3 matrix N_1; the six-dimensional integral thus becomes a three-dimensional integral. To calculate it quickly we do the following. We introduce
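The region integral of a Gaussian density over Ω can also be approximated crudely by sampling from the Gaussian and counting hits inside Ω, which makes a useful sanity check on faster analytic reductions (toy diagonal covariance and unitless numbers, not a real asteroid case):

```python
import math
import random

def collision_probability(mean, var_diag, center, radius, n=200_000, seed=3):
    """Monte Carlo evaluation of P = ∫_Ω N(x; mean, Σ) dx for diagonal Σ:
    draw from the Gaussian and count the fraction of samples falling
    inside the spherical region Ω."""
    random.seed(seed)
    inside = 0
    for _ in range(n):
        x = [random.gauss(m, math.sqrt(v)) for m, v in zip(mean, var_diag)]
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
        if d2 <= radius ** 2:
            inside += 1
    return inside / n

# Predicted miss distance 1, unit variances, target sphere of radius 1:
p = collision_probability([1.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0], 1.0)
# ≈ 0.132 (noncentral chi-square with 3 dof, noncentrality 1, evaluated at 1)
```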

  14. Probability Density Estimation Using Isocontours and Isosurfaces: Application to Information-Theoretic Image Registration

    PubMed Central

    Rajwade, Ajit; Banerjee, Arunava; Rangarajan, Anand

    2010-01-01

    We present a new geometric approach for determining the probability density of the intensity values in an image. We drop the notion of an image as a set of discrete pixels and assume a piecewise-continuous representation. The probability density can then be regarded as being proportional to the area between two nearby isocontours of the image surface. Our paper extends this idea to joint densities of image pairs. We demonstrate the application of our method to affine registration between two or more images using information-theoretic measures such as mutual information. We show cases where our method outperforms existing methods such as simple histograms, histograms with partial volume interpolation, Parzen windows, etc., under fine intensity quantization for affine image registration under significant image noise. Furthermore, we demonstrate results on simultaneous registration of multiple images, as well as for pairs of volume data sets, and show some theoretical properties of our density estimator. Our approach requires the selection of only an image interpolant. The method neither requires any kind of kernel functions (as in Parzen windows), which are unrelated to the structure of the image in itself, nor does it rely on any form of sampling for density estimation. PMID:19147876

  15. Probability estimation with machine learning methods for dichotomous and multicategory outcome: applications.

    PubMed

    Kruppa, Jochen; Liu, Yufeng; Diener, Hans-Christian; Holste, Theresa; Weimar, Christian; König, Inke R; Ziegler, Andreas

    2014-07-01

    Machine learning methods are applied to three different large datasets, all dealing with probability estimation problems for dichotomous or multicategory data. Specifically, we investigate k-nearest neighbors, bagged nearest neighbors, random forests for probability estimation trees, and support vector machines with the kernels of Bessel, linear, Laplacian, and radial basis type. Comparisons are made with logistic regression. The dataset from the German Stroke Study Collaboration with dichotomous and three-category outcome variables allows, in particular, for temporal and external validation. The other two datasets are freely available from the UCI learning repository and provide dichotomous outcome variables. One of them, the Cleveland Clinic Foundation Heart Disease dataset, uses data from one clinic for training and from three clinics for external validation, while the other, the thyroid disease dataset, allows for temporal validation by separating data into training and test data by date of recruitment into study. For dichotomous outcome variables, we use receiver operating characteristics, areas under the curve values with bootstrapped 95% confidence intervals, and Hosmer-Lemeshow-type figures as comparison criteria. For dichotomous and multicategory outcomes, we calculated bootstrap Brier scores with 95% confidence intervals and also compared them through bootstrapping. In a supplement, we provide R code for performing the analyses and for random forest analyses in Random Jungle, version 2.1.0. The learning machines show promising performance over all constructed models. They are simple to apply and serve as an alternative approach to logistic or multinomial logistic regression analysis. PMID:24989843
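The bootstrap Brier-score comparison described above can be sketched as follows, with hypothetical predicted probabilities and 0/1 outcomes (percentile CI; the paper's analyses use R, this is only an illustration in Python):

```python
import random

def brier(probs, outcomes):
    """Mean squared difference between predicted probability and outcome."""
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

def bootstrap_ci(probs, outcomes, n_boot=2_000, seed=0):
    """Percentile bootstrap 95% confidence interval for the Brier score."""
    random.seed(seed)
    n = len(probs)
    scores = []
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        scores.append(brier([probs[i] for i in idx], [outcomes[i] for i in idx]))
    scores.sort()
    return scores[int(0.025 * n_boot)], scores[int(0.975 * n_boot)]

# Hypothetical predicted probabilities and observed 0/1 outcomes
probs = [0.9, 0.8, 0.7, 0.3, 0.2, 0.6, 0.4, 0.1]
outcomes = [1, 1, 1, 0, 0, 0, 1, 0]
lo, hi = bootstrap_ci(probs, outcomes)  # point estimate brier(...) = 0.125
```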

  16. Oil-spill risk analysis: Cook inlet outer continental shelf lease sale 149. Volume 2: Conditional risk contour maps of seasonal conditional probabilities. Final report

    SciTech Connect

    Johnson, W.R.; Marshall, C.F.; Anderson, C.M.; Lear, E.M.

    1994-08-01

    The Federal Government has proposed to offer Outer Continental Shelf (OCS) lands in Cook Inlet for oil and gas leasing. Because oil spills may occur from activities associated with offshore oil production, the Minerals Management Service conducts a formal risk assessment. In evaluating the significance of accidental oil spills, it is important to remember that the occurrence of such spills is fundamentally probabilistic. The effects of oil spills that could occur during oil and gas production must be considered. This report summarizes results of an oil-spill risk analysis conducted for the proposed Cook Inlet OCS Lease Sale 149. The objective of this analysis was to estimate relative risks associated with oil and gas production for the proposed lease sale. To aid the analysis, conditional risk contour maps of seasonal conditional probabilities of spill contact were generated for each environmental resource or land segment in the study area. This aspect is discussed in this volume of the two volume report.

  17. Estimation of drought transition probabilities in Sicily making use of exogenous variables

    NASA Astrophysics Data System (ADS)

    Bonaccorso, Brunella; di Mauro, Giuseppe; Cancelliere, Antonino; Rossi, Giuseppe

    2010-05-01

    Drought monitoring and forecasting play a very important role in effective drought management. Timely monitoring of drought features and/or forecasting of an incoming drought make effective mitigation of its impacts possible, more so than for other natural disasters (e.g. floods, earthquakes, hurricanes). An accurate selection of indices, able to monitor the main characteristics of droughts, is essential to help decision makers implement appropriate preparedness and mitigation measures. Among the several indices proposed for drought monitoring, the Standardized Precipitation Index (SPI) has found widespread use for monitoring dry and wet periods of precipitation aggregated at different time scales. Recently, some efforts have been made to analyze the role of SPI in drought forecasting, as well as to estimate transition probabilities between drought classes. In the present work, a model able to estimate transition probabilities from a current SPI drought class, or from a current SPI value, to future classes corresponding to droughts of different severities is presented and extended to include the information provided by an exogenous variable, such as a large-scale climatic index like the North Atlantic Oscillation (NAO) index. The model has been preliminarily applied and tested with reference to SPI series computed from average areal precipitation on the island of Sicily, Italy, using NAO as the exogenous variable. Results indicate that winter drought transition probabilities in Sicily are generally affected by the NAO index. Furthermore, the statistical significance of this influence has been tested by means of a Monte Carlo analysis, which indicates that the effect of NAO on drought transitions in Sicily should be considered significant.
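Transition probabilities conditioned on an exogenous index can be tabulated directly from a classified SPI series; a sketch with hypothetical drought classes and NAO values (the paper's model is parametric, this is only the empirical counting analogue):

```python
from collections import defaultdict

def transition_probs(classes, exog, positive_phase=True):
    """Empirical drought-class transition probabilities, restricted to
    transitions whose exogenous index value (e.g. NAO) has the chosen sign."""
    counts = defaultdict(lambda: defaultdict(int))
    for (c0, c1), x in zip(zip(classes, classes[1:]), exog):
        if (x >= 0) == positive_phase:
            counts[c0][c1] += 1
    return {c0: {c1: n / sum(row.values()) for c1, n in row.items()}
            for c0, row in counts.items()}

# Hypothetical monthly SPI classes and NAO values aligned with each transition
classes = ["normal", "moderate", "severe", "moderate", "normal", "normal",
           "moderate", "severe", "severe", "normal"]
nao = [0.5, -0.2, 0.1, 0.8, -0.5, 0.3, 0.4, -0.1, 0.6]
p = transition_probs(classes, nao, positive_phase=True)
```

Comparing the matrices for the two NAO phases is the empirical analogue of testing whether the exogenous variable affects the transition probabilities.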

  18. Estimating Probabilities of Peptide Database Identifications to LC-FTICR-MS Observations

    SciTech Connect

    Anderson, Kevin K.; Monroe, Matthew E.; Daly, Don S.

    2006-02-24

One of the grand challenges in the post-genomic era is proteomics, the characterization of the proteins expressed in a cell under specific conditions. A promising technology for high-throughput proteomics is mass spectrometry, specifically liquid chromatography coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR-MS). The accuracy and certainty of the determinations of peptide identities and abundances provided by LC-FTICR-MS are an important and necessary component of systems biology research. Methods: After a tryptically digested protein mixture is analyzed by LC-FTICR-MS, the observed masses and normalized elution times of the detected features are statistically matched to the theoretical masses and elution times of known peptides listed in a large database. The probability of matching is estimated for each peptide in the reference database using statistical classification methods, assuming bivariate Gaussian probability distributions on the uncertainties in the masses and the normalized elution times. A database of 69,220 features from 32 LC-FTICR-MS analyses of a tryptically digested bovine serum albumin (BSA) sample was matched to a database populated with 97% false positive peptides. The percentage of high confidence identifications was found to be consistent with other database search procedures. BSA database peptides were identified with high confidence on average in 14.1 of the 32 analyses. False positives were identified on average in just 2.7 analyses. Using a priori probabilities that contrast peptides from expected and unexpected proteins was shown to perform better in identifying target peptides than using equally likely a priori probabilities, because a large percentage of the target peptides were similar to unexpected peptides that were included as false positives. The use of triplicate analyses with a "2 out of 3" reporting rule was shown to have excellent rejection of false positives.
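The matching step can be illustrated with a simplified sketch: score each candidate database peptide against an observed (mass, normalized elution time) feature with a bivariate Gaussian likelihood assuming independent errors, then combine with a priori probabilities via Bayes' rule. The function names, error magnitudes, and priors here are illustrative assumptions, not the authors' implementation:

```python
import math

def match_likelihood(obs_mass, obs_net, pep_mass, pep_net,
                     sigma_mass, sigma_net):
    """Bivariate Gaussian likelihood of an observed (mass, NET) feature
    given a database peptide, assuming independent errors."""
    zm = (obs_mass - pep_mass) / sigma_mass
    zn = (obs_net - pep_net) / sigma_net
    norm = 1.0 / (2.0 * math.pi * sigma_mass * sigma_net)
    return norm * math.exp(-0.5 * (zm * zm + zn * zn))

def match_probabilities(obs, candidates, priors, sigma_mass, sigma_net):
    """Posterior probability of each candidate peptide for one feature."""
    likes = [p * match_likelihood(obs[0], obs[1], m, t, sigma_mass, sigma_net)
             for (m, t), p in zip(candidates, priors)]
    total = sum(likes)
    return [l / total for l in likes]

# Hypothetical feature vs. two candidate peptides (mass in Da, NET unitless)
post = match_probabilities((1000.002, 0.30),
                           [(1000.0, 0.30), (1000.05, 0.35)],
                           [0.5, 0.5], 0.01, 0.02)
```

With unequal priors, as in the abstract's contrast between expected and unexpected proteins, only the `priors` argument changes.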

  19. A Finding Method of Business Risk Factors Using Characteristics of Probability Distributions of Effect Ratios on Qualitative and Quantitative Hybrid Simulation

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Negoro, Keisuke; Mitsukuni, Koshichiro; Akiyoshi, Masanori

We propose a method for finding business risk factors with qualitative and quantitative hybrid simulation in time series. Effect ratios of qualitative arcs in the hybrid simulation vary the output values of the simulation, so we define effect ratios that cause risk as business risk factors. Searching for business risk factors over the entire range of effect ratios is time-consuming. Because the probability distributions of effect ratios in the present time step are similar to those in the previous time step, the distributions in the present time step can be estimated. Our method efficiently searches for business risk factors only within the estimated ranges. Experimental results show that the precision and recall rates are 86%, and that search time is reduced by at least 20%.

  20. Nonparametric estimation with recurrent competing risks data

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    Nonparametric estimators of component and system life distributions are developed and presented for situations where recurrent competing risks data from series systems are available. The use of recurrences of components’ failures leads to improved efficiencies in statistical inference, thereby leading to resource-efficient experimental or study designs or improved inferences about the distributions governing the event times. Finite and asymptotic properties of the estimators are obtained through simulation studies and analytically. The detrimental impact of parametric model misspecification is also vividly demonstrated, lending credence to the virtue of adopting nonparametric or semiparametric models, especially in biomedical settings. The estimators are illustrated by applying them to a data set pertaining to car repairs for vehicles that were under warranty. PMID:24072583

  1. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible, so this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
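The idea of binning with a hash table can be sketched in a few lines: keys are tuples of bin indices, so memory grows with the number of occupied bins rather than with bin_count**dimensions. This is a toy illustration of the approach, not the authors' C++ implementation:

```python
from collections import defaultdict

class HashBinDensity:
    """Sparse multi-dimensional histogram: bins are stored in a hash
    table, so memory scales with the number of occupied bins."""

    def __init__(self, bin_width):
        self.bin_width = bin_width
        self.counts = defaultdict(int)
        self.n = 0

    def _key(self, point):
        # Tuple of integer bin indices acts as the hash-table key.
        return tuple(int(x // self.bin_width) for x in point)

    def add(self, point):
        self.counts[self._key(point)] += 1
        self.n += 1

    def density(self, point):
        # Estimated probability density at `point`: bin count divided
        # by (total samples * bin volume).
        volume = self.bin_width ** len(point)
        return self.counts[self._key(point)] / (self.n * volume)

h = HashBinDensity(1.0)
for p in [(0.5, 0.5), (0.2, 0.9), (1.5, 0.5)]:
    h.add(p)
```

For high-dimensional survey data the same structure works unchanged; only the tuple key grows longer, while empty regions of attribute space cost nothing.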

  2. Medical concepts related to individual risk are better explained with "plausibility" rather than "probability"

    PubMed Central

    Grossi, Enzo

    2005-01-01

Background The concept of risk has pervaded medical literature in recent decades and has become a familiar topic, and the concept of probability, linked to a binary logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to address the concept of "risk" at the level of the individual subject rather than at the population level. Discussion The author reviews the conceptual framework that led to the use of probability theory in the medical field at a time when the principal causes of death were acute diseases, often of infective origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. Fuzzy logic, in which more than two possible truth-value assignments are allowed, overcomes the trap of probability theory when dealing with uncertain outcomes, thereby making the meaning of a given prognostic statement easier for the patient to understand. Summary At the individual subject level, recourse to the term plausibility, related to fuzzy logic, would help the physician communicate with the patient more effectively than the term probability, related to binary logic. This would represent an evident advantage for the transfer of medical evidence to individual subjects. PMID:16188041

  3. Auditory risk estimates for youth target shooting

    PubMed Central

    Meinke, Deanna K.; Murphy, William J.; Finan, Donald S.; Lankford, James E.; Flamme, Gregory A.; Stewart, Michael; Soendergaard, Jacob; Jerome, Trevor W.

    2015-01-01

Objective To characterize the impulse noise exposure and auditory risk for youth recreational firearm users engaged in outdoor target shooting events. The youth shooting positions are typically standing or sitting at a table, which places the firearm closer to the ground or a reflective surface than for adult shooters. Design Acoustic characteristics were examined and auditory risk estimates were evaluated using contemporary damage-risk criteria for unprotected adult listeners and the 120-dB peak limit suggested by the World Health Organization (1999) for children. Study sample Impulses were generated by 26 firearm/ammunition configurations representing rifles, shotguns, and pistols used by youth. Measurements were obtained relative to a youth shooter's left ear. Results All firearms generated peak levels that exceeded the 120-dB peak limit suggested by the WHO for children. In general, shooting from the seated position over a tabletop increases the peak levels and LAeq8, and reduces the unprotected maximum permissible exposures (MPEs) for both rifles and pistols. Pistols pose the greatest auditory risk when fired over a tabletop. Conclusion Youth should use smaller-caliber weapons, preferably from the standing position, and always wear hearing protection when engaging in shooting activities to reduce the risk of auditory damage. PMID:24564688

  4. ASSESSMENT OF METHODS FOR ESTIMATING RISK TO BIRDS FROM INGESTION OF CONTAMINATED GRIT PARTICLES (FINAL REPORT)

    EPA Science Inventory

    The report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mo...

  5. Estimating present day extreme water level exceedance probabilities around the coastline of Australia: tides, extra-tropical storm surges and mean sea level

    NASA Astrophysics Data System (ADS)

    Haigh, Ivan D.; Wijeratne, E. M. S.; MacPherson, Leigh R.; Pattiaratchi, Charitha B.; Mason, Matthew S.; Crompton, Ryan P.; George, Steve

    2014-01-01

The occurrence of extreme water levels along low-lying, highly populated and/or developed coastlines can lead to considerable loss of life and billions of dollars of damage to coastal infrastructure. It is therefore vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood management, engineering and future land-use planning. This ensures that the risks of catastrophic structural failures due to under-design and of expensive waste due to over-design are both minimised. This paper presents the first estimates of present day extreme water level exceedance probabilities around the whole coastline of Australia. A high-resolution depth averaged hydrodynamic model has been configured for the Australian continental shelf region and has been forced with tidal levels from a global tidal model and meteorological fields from a global reanalysis to generate a 61-year hindcast of water levels. Output from this model has been successfully validated against measurements from 30 tide gauge sites. At each coastal grid point, extreme value distributions have been fitted to the derived time series of annual maxima and of the several largest water levels each year to estimate exceedance probabilities. This provides a reliable estimate of water level probabilities around southern Australia, a region mainly impacted by extra-tropical cyclones. However, as the meteorological forcing used only weakly includes the effects of tropical cyclones, extreme water level probabilities are underestimated around the western, northern and north-eastern Australian coastline. In a companion paper we build on the work presented here and more accurately include tropical cyclone-induced surges in the estimation of extreme water levels. The multi-decadal hindcast generated here has been used primarily to estimate extreme water level exceedance probabilities but could be used more widely in the future for a variety of other research and practical applications.
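As a hedged illustration of fitting an extreme value distribution to annual maxima, the sketch below uses a simple method-of-moments Gumbel fit; the paper does not specify this particular estimator, and maximum likelihood or full GEV fits are common alternatives. The 8-value record is invented for the example:

```python
import math

def fit_gumbel(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - 0.5772156649 * beta         # location (Euler-Mascheroni const.)
    return mu, beta

def annual_exceedance_probability(level, mu, beta):
    """P(annual maximum water level exceeds `level`) under the fit."""
    return 1.0 - math.exp(-math.exp(-(level - mu) / beta))

def return_level(T, mu, beta):
    """Water level with return period T years (T > 1)."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical 8-year record of annual maximum water levels (metres)
mu, beta = fit_gumbel([0.8, 1.0, 1.1, 1.3, 0.9, 1.2, 1.05, 0.95])
level_100yr = return_level(100.0, mu, beta)
```

By construction, the exceedance probability evaluated at the 100-year return level is exactly 1/100, which is a useful sanity check on any such implementation.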

  6. Accretion of Fine Particles: Sticking Probability Estimated by Optical Sizing of Fractal Aggregates

    NASA Astrophysics Data System (ADS)

    Sugiura, N.; Higuchi, Y.

    1993-07-01

    Sticking probability of fine particles is an important parameter that determines (1) the settling of fine particles to the equatorial plane of the solar nebula and hence the formation of planetesimals, and (2) the thermal structure of the nebula, which is dependent on the particle size through opacity. It is generally agreed that the sticking probability is 1 for submicrometer particles, but at sizes larger than 1 micrometer, there exist almost no data on the sticking probability. A recent study [1] showed that aggregates (with radius from 0.2 to 2 mm) did not stick when collided at a speed of 0.15 to 4 m/s. Therefore, somewhere between 1 micrometer and 200 micrometers, sticking probabilities of fine particles change from nearly 1 to nearly 0. We have been studying [2,3] sticking probabilities of dust aggregates in this size range using an optical sizing method. The optical sizing method has been well established for spherical particles. This method utilizes the fact that the smaller the size, the larger the angle of the scattered light. For spheres with various sizes, the size distribution is determined by solving Y(i) = M(i,j)X(j), where Y(i) is the scattered light intensity at angle i, X(j) is the number density of spheres with size j, and M(i,j) is the scattering matrix, which is determined by Mie theory. Dust aggregates, which we expect to be present in the early solar nebula, are not solid spheres, but probably have a porous fractal structure. For such aggregates the scattering matrix M(i,j) must be determined by taking account of all the interaction among constituent particles (discrete dipole approximation). Such calculation is possible only for very small aggregates, and for larger aggregates we estimate the scattering matrix by extrapolation, assuming that the fractal nature of the aggregates allows such extrapolation. In the experiments using magnesium oxide fine particles floating in a chamber at ambient pressure, the size distribution (determined by
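The inversion Y(i) = M(i,j)X(j) described above is, for a discretized size distribution, a small linear system: recover the number densities X from the scattered intensities Y given a known scattering matrix M. A minimal sketch using Gaussian elimination (illustrative only; practical optical-sizing inversions typically add regularization and non-negativity constraints):

```python
def solve_linear(M, y):
    """Solve M x = y for a small square system by Gaussian elimination
    with partial pivoting (illustrative, not a production solver)."""
    n = len(y)
    A = [row[:] + [y[i]] for i, row in enumerate(M)]  # augmented matrix
    for col in range(n):
        # Pivot on the largest entry in this column for stability.
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

# Toy 2-bin system: intensities y for a made-up scattering matrix M
x = solve_linear([[2.0, 1.0], [1.0, 3.0]], [4.0, 7.0])  # recovers [1.0, 2.0]
```

For fractal aggregates, as the abstract notes, the hard part is constructing M itself, since Mie theory applies only to spheres.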

  7. SAR amplitude probability density function estimation based on a generalized Gaussian model.

    PubMed

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B

    2006-06-01

    In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed "method-of-log-cumulants" (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena. PMID:16764268

  8. Methods for estimating dispersal probabilities and related parameters using marked animals

    USGS Publications Warehouse

    Bennetts, R.E.; Nichols, J.D.; Pradel, R.; Lebreton, J.D.; Kitchens, W.M.

    2001-01-01

Deriving valid inferences about the causes and consequences of dispersal from empirical studies depends largely on our ability to reliably estimate parameters associated with dispersal. Here, we present a review of the methods available for estimating dispersal and related parameters using marked individuals. We emphasize methods that place dispersal in a probabilistic framework. In this context, we define a dispersal event as a movement of a specified distance or from one predefined patch to another, the magnitude of the distance or the definition of a 'patch' depending on the ecological or evolutionary question(s) being addressed. We have organized the chapter based on four general classes of data for animals that are captured, marked, and released alive: (1) recovery data, in which animals are recovered dead at a subsequent time, (2) recapture/resighting data, in which animals are either recaptured or resighted alive on subsequent sampling occasions, (3) known-status data, in which marked animals are reobserved alive or dead at specified times with probability 1.0, and (4) combined data, in which data are of more than one type (e.g., live recapture and ring recovery). For each data type, we discuss the data required, the estimation techniques, and the types of questions that might be addressed from studies conducted at single and multiple sites.

  9. Dynamic probability control limits for risk-adjusted Bernoulli CUSUM charts.

    PubMed

    Zhang, Xiang; Woodall, William H

    2015-11-10

    The risk-adjusted Bernoulli cumulative sum (CUSUM) chart developed by Steiner et al. (2000) is an increasingly popular tool for monitoring clinical and surgical performance. In practice, however, the use of a fixed control limit for the chart leads to a quite variable in-control average run length performance for patient populations with different risk score distributions. To overcome this problem, we determine simulation-based dynamic probability control limits (DPCLs) patient-by-patient for the risk-adjusted Bernoulli CUSUM charts. By maintaining the probability of a false alarm at a constant level conditional on no false alarm for previous observations, our risk-adjusted CUSUM charts with DPCLs have consistent in-control performance at the desired level with approximately geometrically distributed run lengths. Our simulation results demonstrate that our method does not rely on any information or assumptions about the patients' risk distributions. The use of DPCLs for risk-adjusted Bernoulli CUSUM charts allows each chart to be designed for the corresponding particular sequence of patients for a surgeon or hospital. PMID:26037959
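The chart statistic underlying this method is the Steiner et al. (2000) risk-adjusted Bernoulli CUSUM: each patient contributes a log-likelihood-ratio weight based on their predicted risk. The sketch below shows only the statistic's update; the paper's contribution, the simulation-based dynamic probability control limits, is not reproduced here, and the example outcomes and risks are invented:

```python
import math

def cusum_weight(outcome, p0, odds_ratio):
    """Log-likelihood-ratio weight for one patient in the risk-adjusted
    Bernoulli CUSUM: p0 is the patient's predicted risk of an adverse
    outcome under in-control performance, odds_ratio the shift to detect."""
    p1 = odds_ratio * p0 / (1.0 - p0 + odds_ratio * p0)
    if outcome == 1:  # adverse event observed
        return math.log(p1 / p0)
    return math.log((1.0 - p1) / (1.0 - p0))

def run_cusum(outcomes, risks, odds_ratio=2.0):
    """Sequence of chart statistics S_t = max(0, S_{t-1} + W_t)."""
    s, path = 0.0, []
    for y, p0 in zip(outcomes, risks):
        s = max(0.0, s + cusum_weight(y, p0, odds_ratio))
        path.append(s)
    return path

# Four hypothetical patients: one adverse outcome, patient-specific risks
path = run_cusum([0, 0, 1, 0], [0.1, 0.2, 0.1, 0.1])
```

A signal is raised when the statistic crosses a control limit; the paper's DPCLs replace the usual fixed limit with a patient-by-patient limit computed by simulation so the conditional false-alarm probability stays constant.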

  10. Estimated Probability of Traumatic Abdominal Injury During an International Space Station Mission

    NASA Technical Reports Server (NTRS)

Lewandowski, Beth E.; Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G., Jr.; McRae, Michael P.

    2013-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to spaceflight mission planners and medical system designers when assessing risks and optimizing medical systems. The IMM project maintains a database of medical conditions that could occur during a spaceflight. The IMM project is in the process of assigning an incidence rate, the associated functional impairment, and a best and a worst case end state for each condition. The purpose of this work was to develop the IMM Abdominal Injury Module (AIM). The AIM calculates an incidence rate of traumatic abdominal injury per person-year of spaceflight on the International Space Station (ISS). The AIM was built so that the probability of traumatic abdominal injury during one year on ISS could be predicted. This result will be incorporated into the IMM Abdominal Injury Clinical Finding Form and used within the parent IMM model.

  11. Comparison of disjunctive kriging to generalized probability kriging in application to the estimation of simulated and real data

    SciTech Connect

Carr, J.R. (Dept. of Geological Sciences); Mao, Nai-hsien

    1992-01-01

Disjunctive kriging has previously been compared to multigaussian kriging and indicator cokriging for the estimation of cumulative distribution functions; it has yet to be compared extensively to probability kriging. Herein, disjunctive kriging and generalized probability kriging are applied to one real and one simulated data set and compared for the estimation of cumulative distribution functions. Generalized probability kriging is an extension, based on generalized cokriging theory, of simple probability kriging for the estimation of the indicator and uniform transforms at each cutoff Z_k. Disjunctive kriging and generalized probability kriging give similar results for the simulated data with a normal distribution, but differ considerably for the real data set with a non-normal distribution.

  12. Reexamination of spent fuel shipment risk estimates

    SciTech Connect

Cook, J.R.; Sprung, Jeremy L.

    2000-04-25

The risks associated with the transport of spent nuclear fuel by truck and rail have been reexamined and compared to results published in NUREG-0170 and the Modal Study. The full reexamination considered transport of PWR and BWR spent fuel by truck and rail in four generic Type B spent fuel casks. Because they are typical, this paper presents results only for transport of PWR spent fuel in steel-lead-steel casks. Cask and spent fuel response to collision impacts and fires were evaluated by performing three-dimensional finite element and one-dimensional heat transport calculations. Accident release fractions were developed by critical review of literature data. Accident severity fractions were developed from Modal Study truck and rail accident event trees, modified to reflect the frequency of occurrence of hard and soft rock wayside route surfaces as determined by analysis of geographic data. Incident-free population doses and the population dose risks associated with the accidents that might occur during transport were calculated using the RADTRAN 5 transportation risk code. The calculated incident-free doses were compared to those published in NUREG-0170. The calculated accident dose risks were compared to dose risks calculated using NUREG-0170 and Modal Study accident source terms. The comparisons demonstrated that both of these studies made a number of very conservative assumptions about spent fuel and cask response to accident conditions, which caused their estimates of accident source terms, accident frequencies, and accident consequences to also be very conservative. The results of this study and the previous studies demonstrate that the risks associated with the shipment of spent fuel by truck or rail are very small.

  13. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions

    USGS Publications Warehouse

    Wenger, S.J.; Freeman, Mary C.

    2008-01-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
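A sketch of the site-level likelihood for such a model: a site is occupied with probability psi; if occupied, abundance is Poisson(lam); each of J replicate surveys detects each individual independently with probability p, giving binomial counts. The parameter names and the truncation bound n_max are illustrative assumptions, not the authors' notation:

```python
import math

def pois(n, lam):
    """Poisson probability mass at n."""
    return math.exp(-lam) * lam ** n / math.factorial(n)

def binom(y, n, p):
    """Binomial probability of y detections out of n individuals."""
    if y > n:
        return 0.0
    return math.comb(n, y) * p ** y * (1 - p) ** (n - y)

def site_likelihood(counts, psi, lam, p, n_max=50):
    """Likelihood of replicated counts at one site under a zero-inflated
    Poisson abundance model with imperfect (binomial) detection.
    The latent abundance N is marginalized out up to n_max."""
    occupied = sum(
        pois(n, lam) * math.prod(binom(y, n, p) for y in counts)
        for n in range(max(counts), n_max + 1)
    )
    zeros = 1.0 if all(y == 0 for y in counts) else 0.0
    return (1.0 - psi) * zeros + psi * occupied
```

Multiplying this quantity across sites gives the full likelihood, which can then be maximized numerically over (psi, lam, p).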

  14. EROS --- automated software system for ephemeris calculation and estimation of probability domain (Abstract)

    NASA Astrophysics Data System (ADS)

    Skripnichenko, P.; Galushina, T.; Loginova, M.

    2015-08-01

This work describes the software EROS (Ephemeris Research and Observation Services), which is being developed jointly by the astronomy departments of Ural Federal University and Tomsk State University. The software provides ephemeris support for positional observations. Its most interesting feature is the automation of the entire observation-preparation process, from determining the duration of the night to calculating ephemerides and forming an observation schedule. The accuracy of an ephemeris depends mostly on the precision of the initial data, which is determined by the errors of the observations used to derive the orbital elements. When an object has only a small number of observations spread over a short arc of its orbit, it is necessary to compute the ephemeris not only for the nominal orbit but also for the probability domain. In this paper, a review ephemeris is understood as a region on the celestial sphere calculated from the probability domain. EROS provides the functionality needed to estimate such review ephemerides. This work contains a description of the software system and results of its use.

  15. Estimating the ground-state probability of a quantum simulation with product-state measurements

    NASA Astrophysics Data System (ADS)

    Yoshimura, Bryce; Freericks, James

    2015-10-01

One of the goals in quantum simulation is to adiabatically generate the ground state of a complicated Hamiltonian by starting with the ground state of a simple Hamiltonian and slowly evolving the system to the complicated one. If the evolution is adiabatic and the initial and final ground states are connected by sharing the same symmetry, then the simulation will be successful. But in most experiments, adiabatic simulation is not possible because it would take too long, and the system acquires some level of diabatic excitation. In this work, we quantify the extent of the diabatic excitation even if we do not know a priori what the complicated ground state is. Since many quantum simulator platforms, like trapped ions, can measure the probabilities to be in a product state, we describe techniques that can employ these simple measurements to estimate the probability of being in the ground state of the system after the diabatic evolution. These techniques do not require one to know any properties of the Hamiltonian itself, nor to calculate its eigenstate properties. All the information is derived by analyzing the product-state measurements as functions of time.

  16. Estimating Landholders’ Probability of Participating in a Stewardship Program, and the Implications for Spatial Conservation Priorities

    PubMed Central

    Adams, Vanessa M.; Pressey, Robert L.; Stoeckl, Natalie

    2014-01-01

    The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements - conservation covenants and management agreements - based on payment level and proportion of properties required to be managed. We then spatially predicted landholders’ probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation. PMID:24892520

  18. Detection probabilities and site occupancy estimates for amphibians at Okefenokee National Wildlife Refuge

    USGS Publications Warehouse

    Smith, L.L.; Barichivich, W.J.; Staiger, J.S.; Smith, Kimberly G.; Dodd, C.K., Jr.

    2006-01-01

    We conducted an amphibian inventory at Okefenokee National Wildlife Refuge from August 2000 to June 2002 as part of the U.S. Department of the Interior's national Amphibian Research and Monitoring Initiative. Nineteen species of amphibians (15 anurans and 4 caudates) were documented within the Refuge, including one protected species, the Gopher Frog Rana capito. We also collected 1 y of monitoring data for amphibian populations and incorporated the results into the inventory. Detection probabilities and site occupancy estimates for four species, the Pinewoods Treefrog (Hyla femoralis), Pig Frog (Rana grylio), Southern Leopard Frog (R. sphenocephala) and Carpenter Frog (R. virgatipes) are presented here. Detection probabilities observed in this study indicate that spring and summer surveys offer the best opportunity to detect these species in the Refuge. Results of the inventory suggest that substantial changes may have occurred in the amphibian fauna within and adjacent to the swamp. However, monitoring the amphibian community of Okefenokee Swamp will prove difficult because of the logistical challenges associated with a rigorous statistical assessment of status and trends.

  19. Measuring and Modeling Fault Density for Plume-Fault Encounter Probability Estimation

    SciTech Connect

    Jordan, P.D.; Oldenburg, C.M.; Nicot, J.-P.

    2011-05-15

Emission of carbon dioxide from fossil-fueled power generation stations contributes to global climate change. Storage of this carbon dioxide within the pores of geologic strata (geologic carbon storage) is one approach to mitigating the climate change that would otherwise occur. The large storage volume needed for this mitigation requires injection into brine-filled pore space in reservoir strata overlain by cap rocks. One of the main concerns of storage in such rocks is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available. This necessitates a method for using available fault data to develop an estimate of the likelihood of injected carbon dioxide encountering and migrating up a fault, primarily due to buoyancy. Fault population statistics provide one of the main inputs for calculating the encounter probability. Previous fault population statistics work is shown to be applicable to areal fault density statistics. This result is applied to a case study in the southern portion of the San Joaquin Basin, with the result that a carbon dioxide plume from a previously planned injection had a 3% chance of encountering a fully seal-offsetting fault.

  20. Relating space radiation environments to risk estimates

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.

    1993-01-01

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes and energy spectra) which the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers which modify the incident particle spectra and mix of particles, and (3) determination of relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines, from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  1. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B.

    1993-12-31

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes and energy spectra) which the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers which modify the incident particle spectra and mix of particles, and (3) determination of relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines, from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  2. Estimation of the failure probability during EGS stimulation based on borehole data

    NASA Astrophysics Data System (ADS)

    Meller, C.; Kohl, Th.; Gaucher, E.

    2012-04-01

    In recent times the search for alternative sources of energy has been fostered by the scarcity of fossil fuels. With its ability to permanently provide electricity or heat with little emission of CO2, geothermal energy will have an important share in the energy mix of the future. Within Europe, scientists have identified many locations with conditions suitable for Enhanced Geothermal System (EGS) projects. In order to provide sufficiently high reservoir permeability, EGS require borehole stimulations prior to installation of power plants (Gérard et al., 2006). Induced seismicity during water injection into EGS reservoirs is a factor that currently can be neither predicted nor controlled. Often, people living near EGS projects are frightened by smaller earthquakes occurring during stimulation or injection. As this fear can lead to widespread disapproval of geothermal power plants, it is desirable to find a way to estimate the probability of fractures shearing when water is injected at a given pressure into a geothermal reservoir. This provides knowledge that enables prediction of the mechanical behavior of a reservoir in response to a change in pore pressure conditions. In the present study, an approach for estimating the shearing probability based on statistical analyses of fracture distribution, orientation and clusters, together with their geological properties, is proposed. Based on geophysical logs of five wells in Soultz-sous-Forêts, France, and with the help of statistical tools, the Mohr criterion, geological and mineralogical properties of the host rock and the fracture fillings, correlations between the wells are analyzed. This is achieved with the custom MATLAB code Fracdens, which enables us to statistically analyze the log files in different ways. With the application of a pore pressure change, the evolution of the critical pressure on the fractures can be determined. A special focus is on the clay fillings of the fractures and how they reduce

  3. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    PubMed Central

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K.; Schork, Andrew; Chen, Chi-Hua; Lo, Min-Tzu; Witoelar, Aree; Werge, Thomas; O'Donovan, Michael; Andreassen, Ole A.; Dale, Anders M.

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics (“z-scores”) for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric disorders, which are understood to have substantial genetic components that arise from very large numbers of SNPs. The complexity of the datasets, however, poses a significant challenge to maximizing their utility. This is reflected in a need for better understanding the landscape of z-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing for direct empirical validation. We show that modeling z-scores as a mixture of Gaussians is conceptually appropriate, in particular taking into account ubiquitous non-null effects that are likely in the datasets due to weak linkage disequilibrium with causal SNPs. The four-parameter model allows for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately 9.3 million SNP z-scores in both cases. We show that, over a broad range of z-scores and sample sizes, the model accurately predicts expectation estimates of true effect sizes and replication probabilities in multistage GWAS designs. We assess the degree to which effect sizes are over-estimated when based on linear-regression association coefficients. We estimate the polygenicity of schizophrenia to be 0.037 and the putamen to be 0.001, while the respective sample sizes

  4. A Method to Estimate the Probability that any Individual Cloud-to-Ground Lightning Stroke was Within any Radius of any Point

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
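The integration described above can be approximated numerically. The sketch below uses Monte Carlo sampling of the bivariate Gaussian rather than the paper's adapted debris-collision integral; coordinates are in kilometers and all numbers are hypothetical.

```python
import numpy as np

def prob_within_radius(mu, cov, center, radius, n=200_000, seed=0):
    """Monte Carlo estimate of P(||X - center|| <= radius) for X ~ N(mu, cov),
    where mu and cov describe the lightning location error ellipse and
    center is the point of interest (e.g. a launch pad)."""
    rng = np.random.default_rng(seed)
    strokes = rng.multivariate_normal(mu, cov, size=n)
    dist = np.hypot(strokes[:, 0] - center[0], strokes[:, 1] - center[1])
    return float(np.mean(dist <= radius))

# Hypothetical ellipse centered 1 km east of the point of interest (km units)
p = prob_within_radius(mu=[1.0, 0.0], cov=[[0.25, 0.0], [0.0, 0.09]],
                       center=[0.0, 0.0], radius=1.0)
```

As in the technique described, the point of interest need not be centered on, or even within, the error ellipse; the distance computation handles any offset.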

  5. A Method to Estimate the Probability That Any Individual Cloud-to-Ground Lightning Stroke Was Within Any Radius of Any Point

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2010-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station.

  6. Launch Risk Acceptability Considering Uncertainty in Risk Estimates

    NASA Astrophysics Data System (ADS)

    Collins, J. D.; Carbon, S. L.

    2010-09-01

    Quantification of launch risk is difficult and uncertain due to the assumptions made in the modeling process and the difficulty in developing the supporting data. This means that estimates of the risks are uncertain and the decision maker must decide on the acceptability of the launch under uncertainty. This paper describes how to quantify that uncertainty and, in doing so, describes the separate roles of aleatory and epistemic uncertainty in obtaining the point estimate of the casualty expectation and, ultimately, the distribution of the uncertainty in the computed casualty expectation. Tables are included of the significant sources and the nature of the contributing uncertainties. In addition, general procedures and an example are included to describe the computational procedure. The second part of the paper discusses how the quantified uncertainty should be applied to the decision-making process. This discussion describes the procedure proposed and adopted by the Risk Committee of the Range Commanders Council Range Safety Group, which will be published in RCC 321-10 [1].

  7. Estimated probabilities and volumes of postwildfire debris flows, a prewildfire evaluation for the upper Blue River watershed, Summit County, Colorado

    USGS Publications Warehouse

    Elliott, John G.; Flynn, Jennifer L.; Bossong, Clifford R.; Char, Stephen J.

    2011-01-01

    The subwatersheds with the greatest potential postwildfire and postprecipitation hazards are those with both high probabilities of debris-flow occurrence and large estimated volumes of debris-flow material. The high probabilities of postwildfire debris flows, the associated large estimated debris-flow volumes, and the densely populated areas along the creeks and near the outlets of the primary watersheds indicate that Indiana, Pennsylvania, and Spruce Creeks are associated with a relatively high combined debris-flow hazard.

  8. Stress testing for risk stratification of patients with low to moderate probability of acute cardiac ischemia.

    PubMed

    Chandra, A; Rudraiah, L; Zalenski, R J

    2001-02-01

    In summary, this article focused on the use of stress testing to risk-stratify patients at the conclusion of their emergency evaluation for ACI. As discussed, those patients in the probably-not-ACI category require additional risk stratification prior to discharge. It should be kept in mind that patients in this category are heterogeneous, containing subgroups at both higher and lower risk of ACI and cardiac events. Patients with a lower pretest probability for ACI may only need exercise testing in the ED. Patients with a higher pretest probability should undergo myocardial perfusion or echocardiographic stress testing to maximize diagnostic and prognostic information. Prognostic information is the key to provocative testing in the ED: it will help emergency physicians identify the patients who may be discharged home safely without having to worry about a 6% annual cardiac death rate and a 10% overall death rate over the next 30 months. Stress testing provides this key prognostic information, and it can be obtained in short-stay chest pain observation units in a safe, timely, and cost-effective fashion. PMID:11214405

  9. Estimating site occupancy rates for aquatic plants using spatial sub-sampling designs when detection probabilities are less than one

    USGS Publications Warehouse

    Nielson, Ryan M.; Gray, Brian R.; McDonald, Lyman L.; Heglund, Patricia J.

    2011-01-01

    Estimation of site occupancy rates when detection probabilities are <1 is well established in wildlife science. Data from multiple visits to a sample of sites are used to estimate detection probabilities and the proportion of sites occupied by focal species. In this article we describe how site occupancy methods can be applied to estimate occupancy rates of plants and other sessile organisms. We illustrate this approach and the pitfalls of ignoring incomplete detection using spatial data for 2 aquatic vascular plants collected under the Upper Mississippi River's Long Term Resource Monitoring Program (LTRMP). Site occupancy models considered include: a naïve model that ignores incomplete detection, a simple site occupancy model assuming a constant occupancy rate and a constant probability of detection across sites, several models that allow site occupancy rates and probabilities of detection to vary with habitat characteristics, and mixture models that allow for unexplained variation in detection probabilities. We used information theoretic methods to rank competing models and bootstrapping to evaluate the goodness-of-fit of the final models. Results of our analysis confirm that ignoring incomplete detection can result in biased estimates of occupancy rates. Estimates of site occupancy rates for 2 aquatic plant species were 19–36% higher compared to naive estimates that ignored probabilities of detection <1. Simulations indicate that final models have little bias when 50 or more sites are sampled, and little gains in precision could be expected for sample sizes >300. We recommend applying site occupancy methods for monitoring presence of aquatic species.
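The constant-rate site occupancy model described above can be sketched with simulated data. This is a minimal grid-search MLE for constant psi (occupancy) and p (detection), not the authors' information-theoretic model set; all parameter values are illustrative.

```python
import numpy as np
from math import comb

def occupancy_mle(detections, n_visits):
    """Grid-search MLE of occupancy rate (psi) and detection probability (p)
    for the constant-psi, constant-p site occupancy model. `detections` is
    the number of visits (out of n_visits) on which the species was seen."""
    counts = np.bincount(detections, minlength=n_visits + 1)
    grid = np.linspace(0.01, 0.99, 99)
    best = (0.5, 0.5, -np.inf)
    for psi in grid:
        for p in grid:
            ll = 0.0
            for d, c in enumerate(counts):
                if c == 0:
                    continue
                if d > 0:  # occupied and detected d times
                    lik = psi * comb(n_visits, d) * p**d * (1 - p)**(n_visits - d)
                else:      # never detected: occupied-but-missed, or unoccupied
                    lik = psi * (1 - p)**n_visits + (1 - psi)
                ll += c * np.log(lik)
            if ll > best[2]:
                best = (psi, p, ll)
    return best[0], best[1]

# Simulated survey: 500 sites, 4 visits, true psi = 0.6, true p = 0.4
rng = np.random.default_rng(1)
occupied = rng.random(500) < 0.6
detections = np.where(occupied, rng.binomial(4, 0.4, size=500), 0)
psi_hat, p_hat = occupancy_mle(detections, 4)
```

The naive estimate (the fraction of sites with at least one detection) is biased low whenever p < 1, which is the effect the abstract quantifies at 19-36% for the two aquatic plants.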

  10. Estimating Risk of Alcohol Dependence Using Alcohol Screening Scores*

    PubMed Central

    Rubinsky, Anna D.; Kivlahan, Daniel R.; Volk, Robert J.; Maynard, Charles; Bradley, Katharine A.

    2010-01-01

    Brief alcohol counseling interventions can reduce alcohol consumption and related morbidity among non-dependent risky drinkers, but more intensive alcohol treatment is recommended for persons with alcohol dependence. This study evaluated whether scores on common alcohol screening tests could identify patients likely to have current alcohol dependence so that more appropriate follow-up assessment and/or intervention could be offered. This cross-sectional study used secondary data from 392 male and 927 female adult family medicine outpatients (1993–1994). Likelihood ratios were used to empirically identify and evaluate ranges of scores of the AUDIT, the AUDIT-C, two single-item questions about frequency of binge drinking, and the CAGE questionnaire for detecting DSM-IV past-year alcohol dependence. Based on the prevalence of past-year alcohol dependence in this sample (men: 12.2%; women: 5.8%), zones of the AUDIT and AUDIT-C identified wide variability in the post-screening risk of alcohol dependence in men and women, even among those who screened positive for alcohol misuse. Among men, AUDIT zones 5–10, 11–14 and 15–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 18–87%, and AUDIT-C zones 5–6, 7–9 and 10–12 were associated with probabilities ranging from 22–75%. Among women, AUDIT zones 3–4, 5–8, 9–12 and 13–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 6–94%, and AUDIT-C zones 3, 4–6, 7–9 and 10–12 were associated with probabilities ranging from 9–88%. AUDIT or AUDIT-C scores could be used to estimate the probability of past-year alcohol dependence among patients who screen positive for alcohol misuse and inform clinical decision-making. PMID:20042299
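Likelihood ratios for a screening-score zone can be converted into post-screening probabilities of the kind reported above via Bayes' theorem on the odds scale. Only the male prevalence (12.2%) comes from the abstract; the likelihood-ratio value in the example is hypothetical.

```python
def post_screening_probability(prevalence, likelihood_ratio):
    """Convert a pre-test prevalence and a score-zone likelihood ratio into
    a post-screening probability: posterior odds = prior odds * LR."""
    prior_odds = prevalence / (1.0 - prevalence)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Male prevalence of past-year dependence from the study (12.2%);
# the likelihood ratio of 12 for a high score zone is hypothetical
p = post_screening_probability(0.122, 12.0)
```

Because the conversion is driven by prevalence, the same score zone implies a different post-screening probability for men and women, which is why the abstract reports sex-specific zones.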

  11. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model through probabilistic-based estimates, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.

  12. A logistic regression equation for estimating the probability of a stream flowing perennially in Massachusetts

    USGS Publications Warehouse

    Bent, Gardner C.; Archfield, Stacey A.

    2002-01-01

    A logistic regression equation was developed for estimating the probability of a stream flowing perennially at a specific site in Massachusetts. The equation provides city and town conservation commissions and the Massachusetts Department of Environmental Protection with an additional method for assessing whether streams are perennial or intermittent at a specific site in Massachusetts. This information is needed to assist these environmental agencies, who administer the Commonwealth of Massachusetts Rivers Protection Act of 1996, which establishes a 200-foot-wide protected riverfront area extending along the length of each side of the stream from the mean annual high-water line along each side of perennial streams, with exceptions in some urban areas. The equation was developed by relating the verified perennial or intermittent status of a stream site to selected basin characteristics of naturally flowing streams (no regulation by dams, surface-water withdrawals, ground-water withdrawals, diversion, waste-water discharge, and so forth) in Massachusetts. Stream sites used in the analysis were identified as perennial or intermittent on the basis of review of measured streamflow at sites throughout Massachusetts and on visual observation at sites in the South Coastal Basin, southeastern Massachusetts. Measured or observed zero flow(s) during months of extended drought as defined by the 310 Code of Massachusetts Regulations (CMR) 10.58(2)(a) were not considered when designating the perennial or intermittent status of a stream site. The database used to develop the equation included a total of 305 stream sites (84 intermittent- and 89 perennial-stream sites in the State, and 50 intermittent- and 82 perennial-stream sites in the South Coastal Basin). Stream sites included in the database had drainage areas that ranged from 0.14 to 8.94 square miles in the State and from 0.02 to 7.00 square miles in the South Coastal Basin. Results of the logistic regression analysis
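A logistic regression equation of this kind turns basin characteristics into a probability via the logistic function. The intercept, coefficients, and predictors below are illustrative placeholders, not the published Massachusetts equation.

```python
import math

def prob_perennial(intercept, coefficients, basin_characteristics):
    """Logistic regression: probability that a stream site flows perennially,
    given a linear score in the basin characteristics."""
    z = intercept + sum(b * x for b, x in zip(coefficients, basin_characteristics))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical: intercept plus coefficients on log drainage area (mi^2)
# and a surficial-geology fraction
p = prob_perennial(-0.5, [1.2, 2.0], [math.log(3.5), 0.25])
```

A site would then be classified as probably perennial or probably intermittent by comparing the fitted probability to a chosen cutoff.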

  13. A methodology to estimate probability of occurrence of floods using principal component analysis

    NASA Astrophysics Data System (ADS)

    Castro Heredia, L. M.; Gironas, J. A.

    2014-12-01

    Flood events and debris flows are characterized by a very rapid response of basins to precipitation, often resulting in loss of life and property damage. Complex topography with steep slopes and narrow valleys increases the likelihood of these events. An early warning system (EWS) is a tool that allows a hazardous event to be anticipated, which in turn provides time for an early response to reduce negative impacts. These EWSs can rely on very powerful and computationally demanding models to predict flow discharges and inundation zones, which require data that are typically unavailable. Instead, simpler EWSs based on a statistical analysis of observed hydro-meteorological data could be a good alternative. In this work we propose a methodology for estimating the probability of exceedance of maximum flow discharges using principal component analysis (PCA). In the method we first perform a spatio-temporal cross-correlation analysis between extreme flow data and daily meteorological records for the last 15 days prior to the day of the flood event. We then use PCA to create synthetic variables that are representative of the meteorological variables associated with the flood event (i.e. cumulative rainfall and minimum temperature). Finally, we developed a model to explain the probability of exceedance using the principal components. The methodology was applied to a basin in the foothill area of Santiago, Chile, for which all the extreme events between 1970 and 2013 were analyzed. Results show that elevation, rather than distance or location within the contributing basin, is what mainly explains the statistical correlation between meteorological records and flood events. Two principal components were found that explain more than 90% of the total variance of the accumulated rainfalls and minimum temperatures. One component was formed with cumulative rainfall from 3 to 15 days prior to the event, whereas the other one was formed with the minimum temperatures for the last 2 days preceding
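The PCA step described above, reducing correlated meteorological predictors to a few components that explain more than 90% of the variance, can be sketched as follows. The synthetic data stand in for the rainfall and temperature records and are not the study's data.

```python
import numpy as np

def pca_scores(X, var_target=0.90):
    """Standardize X and return principal-component scores for the fewest
    components whose cumulative explained-variance ratio reaches var_target."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    ratio = s**2 / np.sum(s**2)
    k = int(np.searchsorted(np.cumsum(ratio), var_target)) + 1
    return Xs @ Vt[:k].T, ratio[:k]

# Hypothetical records: 200 events x 14 predictors (cumulative rainfalls over
# several windows, minimum temperatures) driven by two latent factors
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 14))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 14))
scores, explained = pca_scores(X)
```

The component scores would then serve as the synthetic explanatory variables in the exceedance-probability model.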

  14. Assessing Categorization Performance at the Individual Level: A Comparison of Monte Carlo Simulation and Probability Estimate Model Procedures

    PubMed Central

    Arterberry, Martha E.; Bornstein, Marc H.; Haynes, O. Maurice

    2012-01-01

    Two analytical procedures for identifying young children as categorizers, the Monte Carlo Simulation and the Probability Estimate Model, were compared. Using a sequential touching method, children age 12, 18, 24, and 30 months were given seven object sets representing different levels of categorical classification. From their touching performance, the probability that children were categorizing was then determined independently using Monte Carlo Simulation and the Probability Estimate Model. The two analytical procedures resulted in different percentages of children being classified as categorizers. Results using the Monte Carlo Simulation were more consistent with group-level analyses than results using the Probability Estimate Model. These findings recommend using the Monte Carlo Simulation for determining individual categorizer classification. PMID:21402410

  15. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for protecting society and for the sustainable economic development of cities, as it is an essential part of seismic hazard reduction. Estimating seismic risk and losses is a complicated task. There is always a deficiency of knowledge about the real seismic hazard, local site effects, the inventory of elements at risk, and infrastructure vulnerability, especially in developing countries. Lately, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, whose work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5, "City Scenario", additional work in this direction and a detailed investigation of local site conditions and of the active fault (3D) beneath Tbilisi were carried out. For estimating economic losses, an algorithm was prepared taking into account the obtained inventory. The long-term usage of buildings is very complex. It relates to the reliability and durability of buildings. The long-term usage and durability of a building is determined by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of individual construction units' depreciation rates and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's (construction's) useful life. We used this methodology to create a matrix that provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding value. Finally, losses were estimated resulting from shaking with 10%, 5% and 2% exceedance probability in 50 years. Losses resulting from a scenario earthquake (an earthquake with the possible maximum magnitude) were also estimated.
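The depreciation calculation described above, summing each construction unit's depreciation rate weighted by that unit's share of the building's value, is simple to express. The unit names and numbers below are hypothetical.

```python
def building_depreciation(unit_rates, unit_value_shares):
    """Depreciation of an entire building: sum over construction units of
    (unit depreciation rate) x (unit's share of the building's value)."""
    assert abs(sum(unit_value_shares) - 1.0) < 1e-9  # shares must sum to 1
    return sum(r * v for r, v in zip(unit_rates, unit_value_shares))

# Hypothetical units: foundation, walls, roof, finishes
d = building_depreciation([0.30, 0.40, 0.55, 0.70], [0.2, 0.5, 0.15, 0.15])
```

A matrix of such rates by building type and construction period then feeds the loss estimation for each shaking level.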

  16. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    SciTech Connect

    Benson, Janet M.; Springer, David L.

    1999-12-31

    Carbon tetrachloride has been used extensively within the DOE nuclear weapons facilities. Rocky Flats was formerly the largest volume consumer of CCl4 in the United States using 5000 gallons in 1977 alone (Ripple, 1992). At the Hanford site, several hundred thousand gallons of CCl4 were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl4 in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude (Illman, 1993). High levels of CCl4 at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl4 cleanup required at these sites and associated costs are driven by current human health risk estimates, which assume that CCl4 is a genotoxic carcinogen. The overall purpose of these studies was to improve the scientific basis for assessing the health risk associated with human exposure to CCl4. Specific research objectives of this project were to: (1) compare the rates of CCl4 metabolism by rats, mice and hamsters in vivo and extrapolate those rates to man based on parallel studies on the metabolism of CCl4 by rat, mouse, hamster and human hepatic microsomes in vitro; (2) using hepatic microsome preparations, determine the role of specific cytochrome P450 isoforms in CCl4-mediated toxicity and the effects of repeated inhalation and ingestion of CCl4 on these isoforms; and (3) evaluate the toxicokinetics of inhaled CCl4 in rats, mice and hamsters. This information has been used to improve the physiologically based pharmacokinetic (PBPK) model for CCl4 originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). Another major objective of the project was to provide scientific evidence that CCl4, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing and regenerative proliferation. In

  17. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    PubMed

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care against ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs. PMID:25147970

  18. Estimation of peak discharge quantiles for selected annual exceedance probabilities in northeastern Illinois

    USGS Publications Warehouse

    Over, Thomas; Saito, Riki J.; Veilleux, Andrea; Sharpe, Jennifer B.; Soong, David; Ishii, Audrey

    2016-01-01

    This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively) for watersheds in Illinois based on annual maximum peak discharge data from 117 watersheds in and near northeastern Illinois. One set of equations was developed through a temporal analysis with a two-step least squares-quantile regression technique that measures the average effect of changes in the urbanization of the watersheds used in the study. The resulting equations can be used to adjust rural peak discharge quantiles for the effect of urbanization, and in this study the equations also were used to adjust the annual maximum peak discharges from the study watersheds to 2010 urbanization conditions. The other set of equations was developed by a spatial analysis. This analysis used generalized least-squares regression to fit the peak discharge quantiles computed from the urbanization-adjusted annual maximum peak discharges from the study watersheds to drainage-basin characteristics. The peak discharge quantiles were computed by using the Expected Moments Algorithm following the removal of potentially influential low floods defined by a multiple Grubbs-Beck test. To improve the quantile estimates, generalized skew coefficients were obtained from a newly developed regional skew model in which the skew increases with the urbanized land use fraction. The drainage-basin characteristics used as explanatory variables in the spatial analysis include drainage area, the fraction of developed land, the fraction of land with poorly drained soils or likely water, and the basin slope estimated as the ratio of the basin relief to basin perimeter. This report also provides the following: (1) examples to illustrate the use of the spatial and urbanization-adjustment equations for estimating peak discharge quantiles at

  19. Estimation of the Probable Maximum Flood for a Small Lowland River in Poland

    NASA Astrophysics Data System (ADS)

    Banasik, K.; Hejduk, L.

    2009-04-01

    The planning, design and use of hydrotechnical structures often requires the assessment of maximum flood potentials. The most common term applied to this upper limit of flooding is the probable maximum flood (PMF). The PMP/UH (probable maximum precipitation/unit hydrograph) method has been used in the study to predict the PMF for a small agricultural lowland river basin of the Zagozdzonka (a left tributary of the Vistula river) in Poland. The river basin, located about 100 km south of Warsaw, with an area - upstream of the gauge at Plachty - of 82 km2, has been investigated by the Department of Water Engineering and Environmental Restoration of Warsaw University of Life Sciences - SGGW since 1962. An over 40-year flow record was used in a previous investigation for predicting the T-year flood discharge (Banasik et al., 2003). The objective here was to estimate the PMF using the PMP/UH method and to compare the results with the 100-year flood. A new relation for the depth-duration curve of PMP for the local climatic conditions has been developed based on Polish maximum observed rainfall data (Ozga-Zielinska & Ozga-Zielinski, 2003). An exponential formula, with an exponent value of 0.47, i.e. close to the exponent in the formula for world PMP and also in the formula of PMP for Great Britain (Wilson, 1993), gives a rainfall depth about 40% lower than Wilson's. The effective rainfall (runoff volume) has been estimated from the PMP of various durations using the CN-method (USDA-SCS, 1986). The CN value as well as the parameters of the IUH model (Nash, 1957) have been established from 27 rainfall-runoff events recorded in the river basin in the period 1980-2004. Variability of the parameter values with the size of the events will be discussed in the paper. The results of the analysis have shown that the peak discharge of the PMF is 4.5 times larger than the 100-year flood, and the volume ratio of the respective direct hydrographs caused by rainfall events of critical duration is 4.0. References 1.Banasik K
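The CN-method runoff step mentioned above follows the standard SCS Curve Number relation (USDA-SCS, 1986). The storm depth and CN value in the example are illustrative, not the basin's calibrated values.

```python
def scs_cn_runoff(rainfall_mm, cn):
    """SCS Curve Number runoff depth (mm): Q = (P - 0.2*S)^2 / (P + 0.8*S),
    with potential maximum retention S = 25400/CN - 254 (SI units) and
    initial abstraction Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0
    initial_abstraction = 0.2 * s
    if rainfall_mm <= initial_abstraction:
        return 0.0  # no runoff until the initial abstraction is satisfied
    return (rainfall_mm - initial_abstraction)**2 / (rainfall_mm + 0.8 * s)

# e.g. a hypothetical 100 mm design storm on a basin with CN = 75
q = scs_cn_runoff(100.0, 75)
```

Applied to the PMP of each duration, this yields the effective rainfall that drives the unit hydrograph convolution for the PMF.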

  20. Estimation of the Probable Maximum Flood for a Small Lowland River in Poland

    NASA Astrophysics Data System (ADS)

    Banasik, K.; Hejduk, L.

    2009-04-01

    The planning, design and use of hydrotechnical structures often requires the assessment of maximum flood potentials. The most common term applied to this upper limit of flooding is the probable maximum flood (PMF). The PMP/UH (probable maximum precipitation/unit hydrograph) method has been used in the study to predict the PMF for a small agricultural lowland river basin of Zagozdzonka (left tributary of the Vistula river) in Poland. The river basin, located about 100 km south of Warsaw, with an area - upstream of the gauge of Plachty - of 82 km2, has been investigated by the Department of Water Engineering and Environmental Restoration of Warsaw University of Life Sciences - SGGW since 1962. An over 40-year flow record was used in a previous investigation for predicting the T-year flood discharge (Banasik et al., 2003). The objective here was to estimate the PMF using the PMP/UH method and to compare the results with the 100-year flood. A new depth-duration relation of PMP for the local climatic conditions has been developed based on Polish maximum observed rainfall data (Ozga-Zielinska & Ozga-Zielinski, 2003). The exponential formula, with an exponent value of 0.47, i.e. close to the exponent in the formula for world PMP and also in the formula of PMP for Great Britain (Wilson, 1993), gives a rainfall depth about 40% lower than Wilson's. The effective rainfall (runoff volume) has been estimated from the PMP of various durations using the CN-method (USDA-SCS, 1986). The CN value as well as the parameters of the IUH model (Nash, 1957) have been established from 27 rainfall-runoff events recorded in the river basin in the period 1980-2004. Variability of the parameter values with the size of the events will be discussed in the paper. The results of the analysis show that the peak discharge of the PMF is 4.5 times larger than the 100-year flood, and the volume ratio of the respective direct hydrographs caused by rainfall events of critical duration is 4.0. References 1.Banasik K
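
    The CN step in the abstract converts PMP depths into effective rainfall (direct runoff) with the standard SCS curve-number relations. A minimal sketch, assuming SI units; the CN value used below is illustrative, not the one fitted for the Zagozdzonka basin:

```python
def scs_runoff_mm(rainfall_mm: float, cn: float) -> float:
    """Direct runoff Q (mm) from event rainfall P (mm) via the SCS-CN method."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm)
    ia = 0.2 * s                  # initial abstraction, conventional 0.2*S
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Illustrative: a 150 mm storm on a basin with CN = 82
print(round(scs_runoff_mm(150.0, 82.0), 1))  # → 99.1
```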

  1. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    PubMed

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by random noise in retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions of the model suggested by Costello and Watts (2014). PMID:26709414
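
    The disputed predictions concern identities such as E[P(A) + P(B) - P(A or B) - P(A and B)] = 0. A seeded toy simulation (an assumed sample-and-misread judgment model, not either paper's exact formulation) shows that such an identity survives substantial judgment noise, which is why satisfying it cannot by itself adjudicate between the accounts:

```python
import random

random.seed(1)
D = 0.25   # assumed probability that a memory sample is misread
N = 200    # memory samples per judgment

def judge(p):
    """Frequency-based probability judgment from N noisy memory samples."""
    hits = 0
    for _ in range(N):
        x = random.random() < p          # true sample of the event
        if random.random() < D:          # symmetric read error
            x = not x
        hits += x
    return hits / N

diffs = []
for _ in range(2000):
    pa, pb = random.random(), random.random()
    p_and = pa * pb                      # independent events for simplicity
    p_or = pa + pb - p_and
    diffs.append(judge(pa) + judge(pb) - judge(p_or) - judge(p_and))

# Each individual judgment is biased toward 0.5, yet the sum averages to ~0
print(round(sum(diffs) / len(diffs), 3))
```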

  2. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    SciTech Connect

    MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model (exponential or Weibull) is fit.
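
    The tail-fitting step can be sketched as follows for the shifted exponential case: estimate the scale from the mean exceedance above the threshold, then convert to an annual-maximum quantile. The Poisson-arrival step and the helper names are assumptions for illustration, not the routine's documented internals:

```python
import math

def fit_shifted_exponential(data, x_low):
    """MLE scale beta of the tail model F(x) = 1 - exp(-(x - x_low)/beta),
    fit to the exceedances above the user-supplied threshold x_low."""
    exceed = [x - x_low for x in data if x > x_low]
    return sum(exceed) / len(exceed)

def annual_max_quantile(beta, x_low, n_exceed, t_years, p):
    """Hypothetical helper: p-quantile of the annual maximum, assuming the
    threshold crossings arrive as a Poisson process (rate = n_exceed/t_years)."""
    rate = n_exceed / t_years
    # Solve exp(-rate * exp(-(x - x_low)/beta)) = p for x
    return x_low - beta * math.log(-math.log(p) / rate)
```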

  3. A New Approach to Estimating the Probability for β-delayed Neutron Emission

    SciTech Connect

    McCutchan, E.A.; Sonzogni, A.A.; Johnson, T.D.; Abriola, D.; Birch, M.; Singh, B.

    2014-06-15

    The probability for neutron emission following β decay, Pn, is a crucial property for a wide range of physics and applications including nuclear structure, r-process nucleosynthesis, the control of nuclear reactors, and the post-processing of nuclear fuel. Despite much experimental effort, knowledge of Pn values is still lacking in very neutron-rich nuclei, requiring predictions from either systematics or theoretical models. Traditionally, systematic predictions were made by investigating the Pn value as a function of the decay Q value and the neutron separation energy in the daughter nucleus. A new approach to Pn systematics is presented which incorporates the half-life of the decay and the Q value for β-delayed neutron emission. This prescription correlates the known data better, and thus improves the estimation of Pn values for neutron-rich nuclei. Such an approach can be applied to generate input values for r-process network calculations or in the modeling of advanced fuel cycles.

  4. Peers Increase Adolescent Risk Taking Even When the Probabilities of Negative Outcomes Are Known

    PubMed Central

    Smith, Ashley R.; Chein, Jason; Steinberg, Laurence

    2015-01-01

    The majority of adolescent risk taking occurs in the presence of peers, and recent research suggests that the presence of peers may alter how the potential rewards and costs of a decision are valuated or perceived. The current study further explores this notion by investigating how peer observation affects adolescent risk taking when the information necessary to make an informed decision is explicitly provided. We used a novel probabilistic gambling task in which participants decided whether to play or pass on a series of offers for which the reward and loss outcome probabilities were made explicit. Adolescent participants completed the task either alone or under the belief that they were being observed by an unknown peer in a neighboring room. Participants who believed a peer was observing them chose to gamble more often than participants who completed the task alone, and this effect was most evident for decisions with a greater probability of loss. These results suggest that the presence of peers can increase risk taking among adolescents even when specific information regarding the likelihood of positive and negative outcomes is provided. The findings expand our understanding of how peers influence adolescent decision making and have important implications regarding the value of educational programs aimed at reducing risky behaviors during adolescence. PMID:24447118

  5. Covariate adjustment of cumulative incidence functions for competing risks data using inverse probability of treatment weighting.

    PubMed

    Neumann, Anke; Billionnet, Cécile

    2016-06-01

    In observational studies without random assignment of the treatment, the unadjusted comparison between treatment groups may be misleading due to confounding. One method to adjust for measured confounders is inverse probability of treatment weighting. This method can also be used in the analysis of time to event data with competing risks. Competing risks arise if for some individuals the event of interest is precluded by a different type of event occurring before, or if only the earliest of several times to event, corresponding to different event types, is observed or is of interest. In the presence of competing risks, time to event data are often characterized by cumulative incidence functions, one for each event type of interest. We describe the use of inverse probability of treatment weighting to create adjusted cumulative incidence functions. This method is equivalent to direct standardization when the weight model is saturated. No assumptions about the form of the cumulative incidence functions are required. The method allows studying associations between treatment and the different types of event under study, while focusing on the earliest event only. We present a SAS macro implementing this method and we provide a worked example. PMID:27084321
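
    A minimal sketch of the weighting idea in the simplest setting (complete follow-up, i.e. no censoring, and a single binary confounder); the field names are illustrative, and the SAS macro described in the paper handles the general censored case:

```python
from collections import Counter

def iptw_weights(rows):
    """w_i = 1 / P(own treatment | confounder), estimated by stratum proportions."""
    n_conf = Counter(r["conf"] for r in rows)
    n_conf_trt = Counter((r["conf"], r["trt"]) for r in rows)
    return [n_conf[r["conf"]] / n_conf_trt[(r["conf"], r["trt"])] for r in rows]

def weighted_cif(rows, weights, trt, event_type, t):
    """Weighted cumulative incidence of event_type by time t in arm trt,
    treating competing events as simply 'not the event of interest'."""
    num = sum(w for r, w in zip(rows, weights)
              if r["trt"] == trt and r["event"] == event_type and r["time"] <= t)
    den = sum(w for r, w in zip(rows, weights) if r["trt"] == trt)
    return num / den
```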

  6. The Human Bathtub: Safety and Risk Predictions Including the Dynamic Probability of Operator Errors

    SciTech Connect

    Duffey, Romney B.; Saull, John W.

    2006-07-01

    Reactor safety and risk are dominated by the potential and major contribution of human error in the design, operation, control, management, regulation and maintenance of the plant, and hence to all accidents. Given the possibility of accidents and errors, we now need to determine the outcome (error) probability, or the chance of failure. Conventionally, reliability engineering is associated with the failure rate of components, or systems, or mechanisms, not of human beings in, and interacting with, a technological system. The probability of failure requires a prior knowledge of the total number of outcomes, which for any predictive purposes we do not know or have. Analysis of failure rates due to human error and the rate of learning allow a new determination of the dynamic human error rate in technological systems, consistent with and derived from the available world data. The basis for the analysis is the 'learning hypothesis' that humans learn from experience, and consequently the accumulated experience defines the failure rate. A new 'best' equation has been derived for the human error, outcome or failure rate, which allows for calculation and prediction of the probability of human error. We also provide comparisons to the empirical Weibull parameter fitting used in and by conventional reliability engineering and probabilistic safety analysis methods. These new analyses show that arbitrary Weibull fitting parameters and typical empirical hazard function techniques cannot be used to predict the dynamics of human errors and outcomes in the presence of learning. Comparisons of these new insights show agreement with human error data from the world's commercial airlines, the two shuttle failures, and from nuclear plant operator actions and transient control behavior observed in transients in both plants and simulators.
The results demonstrate that the human error probability (HEP) is dynamic, and that it may be predicted using the learning hypothesis and the minimum

  7. Probability Mapping to Determine the Spatial Risk Pattern of Acute Gastroenteritis in Coimbatore District, India, Using Geographic Information Systems (GIS)

    PubMed Central

    Joseph, Pawlin Vasanthi; Balan, Brindha; Rajendran, Vidhyalakshmi; Prashanthi, Devi Marimuthu; Somnathan, Balasubramanian

    2015-01-01

    Background: Maps show well the spatial configuration of information. Considerable effort is devoted to the development of geographical information systems (GIS) that increase understanding of public health problems, and in particular to collaborative efforts among clinicians, epidemiologists, ecologists, and geographers to map and forecast disease risk. Objectives: Small populations tend to give rise to the most extreme disease rates, even if the actual rates are similar across the areas. Such situations will focus the decision-maker's attention on these areas when they scrutinize the map for decision making or resource allocation. As an alternative, maps can be prepared using P-values (probabilistic values). Materials and Methods: The statistical significance of the rates, rather than the rates themselves, is used to map the results. The incidence rates calculated for each village from 2000 to 2009 are used to estimate λ, the expected number of cases in the study area. The obtained results are mapped using ArcGIS 10.0. Results: The likelihood of infections from low to high is depicted in the map, and it is observed that five villages, namely Odanthurai, Coimbatore Corporation, Ikkaraiboluvampatti, Puliakulam, and Pollachi Corporation, are more likely to have significantly high incidences. Conclusion: In the probability map, some of the areas with exceptionally high or low rates disappear. These are typically small-population areas, whose rates are unstable due to the small numbers problem. The probability map shows more specific regions of relative risks and expected outcomes. PMID:26170544
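
    The probability map replaces each village's raw rate with the Poisson tail probability of its observed case count under the expected count λ derived from the district-wide rate. A stdlib sketch of that tail computation (the cutoff for "significantly high" is the analyst's choice):

```python
import math

def poisson_sf(k_obs, lam):
    """P(X >= k_obs) for X ~ Poisson(lam): the upper-tail probability used
    to flag areas whose case count is unlikely under the expected count."""
    # accumulate the lower tail P(X <= k_obs - 1) and subtract from 1
    p, term = 0.0, math.exp(-lam)
    for k in range(k_obs):
        p += term
        term *= lam / (k + 1)
    return 1.0 - p

# A village with 9 observed cases where the district rate predicts lambda = 3
print(round(poisson_sf(9, 3.0), 4))  # → 0.0038
```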

  8. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    NASA Technical Reports Server (NTRS)

    Frigm, Ryan C.; Hejduk, Matthew D.; Johnson, Lauren C.; Plakalovic, Dragan

    2015-01-01

    On-orbit collision risk is becoming an increasing mission risk to all operational satellites in Earth orbit. Managing this risk can be disruptive to mission and operations, present challenges for decision-makers, and is time-consuming for all parties involved. With the planned capability improvements to detecting and tracking smaller orbital debris and capacity improvements to routinely predict on-orbit conjunctions, this mission risk will continue to grow in terms of likelihood and effort. It is a very real possibility that the future space environment will not allow collision risk management and mission operations to be conducted in the same manner as they are today. This paper presents the concept of a finite conjunction assessment: one where each discrete conjunction is not treated separately but, rather, as a continuous event that must be managed concurrently. The paper also introduces the Total Probability of Collision as an analogous metric for finite conjunction assessment operations and provides several options for its usage in a Concept of Operations.
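
    Assuming the discrete conjunction events are independent, a total probability of collision over a set of conjunctions reduces to the complement of the joint miss probability, e.g.:

```python
def total_pc(event_pcs):
    """Total probability of collision across events, assuming independence."""
    p_miss = 1.0
    for p in event_pcs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# Three conjunctions with individual Pc values
print(total_pc([1e-4, 5e-5, 2e-4]))
```

    For small individual Pc values this is approximately their sum, which is why the aggregate (finite) risk can exceed any single event's risk by a wide margin.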

  9. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include the time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated by using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate analyses of incidence-free survival proportions were performed for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS, and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in this model showed significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS is dependent on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
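
    The incidence-free survival proportions referred to above are standard product-limit (Kaplan-Meier) estimates, which accommodate censored exposures such as symptom-free completions or test aborts. A compact sketch:

```python
def kaplan_meier(times, observed):
    """Product-limit survival curve. times: time to event onset or to end of
    follow-up; observed: True if the event occurred, False if censored."""
    s, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ev in zip(times, observed) if ti == t and ev)
        n = sum(1 for ti in times if ti >= t)   # number still at risk at t
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve
```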

  10. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    PubMed

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

    Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contain the support of the lifetime distribution, which is often not the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless of the Markov condition and the aforementioned assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed. PMID:25735883

  11. From default probabilities to credit spreads: credit risk models explain market prices (Keynote Address)

    NASA Astrophysics Data System (ADS)

    Denzler, Stefan M.; Dacorogna, Michel M.; Muller, Ulrich A.; McNeil, Alexander J.

    2005-05-01

    Credit risk models like Moody's KMV are now well established in the market and give bond managers reliable default probabilities for individual firms. Until now it has been hard to relate those probabilities to the actual credit spreads observed on the market for corporate bonds. Inspired by the existence of scaling laws in financial markets deviating from Gaussian behavior (Dacorogna et al. 2001; DiMatteo et al. 2005), we develop a model that quantitatively links those default probabilities to credit spreads (market prices). The main input quantities to this study are merely industry yield data of different times to maturity and expected default frequencies (EDFs) of Moody's KMV. The empirical results of this paper clearly indicate that the model can be used to calculate approximate credit spreads (market prices) from EDFs, independent of the time to maturity and the industry sector under consideration. Moreover, the model is effective in an out-of-sample setting; it produces consistent results on the European bond market, where data are scarce, and can be adequately used to approximate credit spreads on the corporate level.

  12. Uncertainty of Calculated Risk Estimates for Secondary Malignancies After Radiotherapy

    SciTech Connect

    Kry, Stephen F. . E-mail: sfkry@mdanderson.org; Followill, David; White, R. Allen; Stovall, Marilyn; Kuban, Deborah A.; Salehpour, Mohammad

    2007-07-15

    Purpose: The significance of risk estimates for fatal secondary malignancies caused by out-of-field radiation exposure remains unresolved because the uncertainty in calculated risk estimates has not been established. This work examines the uncertainty in absolute risk estimates and in the ratio of risk estimates between different treatment modalities. Methods and Materials: Clinically reasonable out-of-field doses and calculated risk estimates were taken from the literature for several prostate treatment modalities, including intensity-modulated radiotherapy (IMRT), and were recalculated using the most recent risk model. The uncertainties in this risk model and uncertainties in the linearity of the dose-response model were considered in generating 90% confidence intervals for the uncertainty in the absolute risk estimates and in the ratio of the risk estimates. Results: The absolute risk estimates of fatal secondary malignancy were associated with very large uncertainties, which precluded distinctions between the risks associated with the different treatment modalities considered. However, a much smaller confidence interval exists for the ratio of risk estimates, and this ratio between different treatment modalities may be statistically significant when there is an effective dose equivalent difference of at least 50%. Such a difference may exist between clinically reasonable treatment options, including 6-MV IMRT versus 18-MV IMRT for prostate therapy. Conclusion: The ratio of the risk between different treatment modalities may be significantly different. Consequently risk models and associated risk estimates may be useful and meaningful for evaluating different treatment options. The calculated risk of secondary malignancy should be considered in the selection of an optimal treatment plan.

  13. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    NASA Technical Reports Server (NTRS)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. Expert judgments were used to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj of Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the grant entitled "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.

  14. Probable Health Risks Due to Exposure to Outdoor PM2.5 in India

    NASA Astrophysics Data System (ADS)

    Dey, S.; Chowdhury, S.

    2014-12-01

    Particulate matter of size <2.5 μm (commonly referred to as PM2.5) is considered to be the best indicator of health risks due to exposure to particulate pollution. Unlike the decreasing trends in the developed countries, aerosol loading has continued to increase over the Indian subcontinent in the recent past, putting a population of ~1.6 billion at risk. The lack of direct measurements prompted us to utilize satellite data in establishing a robust long-term database of surface PM2.5 at high spatial resolution. The hybrid approach utilizes a chemical transport model to constrain the relation between columnar aerosol optical depth (AOD) and surface PM2.5 and establish mean monthly conversion factors. Satellite-derived daily AODs for the period 2000-2012 are then converted to PM2.5 using the conversion factors. The dataset (after validation against coincident in-situ measurements and bias correction) was used to carry out the exposure assessment. 51% of the population is exposed to PM2.5 concentrations exceeding the WHO air quality interim target-3 threshold (35 μg m-3). The health impacts are categorized in terms of four diseases - chronic obstructive pulmonary disease (COPD), stroke, ischemic heart disease (IHD) and lung cancer (LC). In the absence of any region-specific cohort study, published studies are consulted to estimate risk. The risks relative to the background concentration of 10 μg m-3 are estimated by logarithmic fitting of the individual cohort studies against the corresponding PM2.5 concentration. This approach considers multiple (>100) cohort studies across a wide variety of adult populations from various socio-economic backgrounds. Therefore, the calculated risks are considered to be better estimates relative to any one particular type of risk function model (e.g. linear 50 or linear 70 or exponential). The risk values are used to calculate the additional mortality due to exposure to PM2.5 in each of the administrative districts in India to identify the vulnerable regions.

  15. [Estimation of the radiation risk of determined effects of human exposure in space].

    PubMed

    Petrov, V M; Vasina, Iu I; Vlasov, A G; Shurshakov, V A

    2001-01-01

    The subject of the paper is the possibility of estimating the radiation risk of determined consequences of exposure to solar space rays as the probability of violating established dose limits. Analysis of the specifics of space crew exposure to solar space rays in a long-term, particularly interplanetary, mission suggests that immediate introduction of the principle into radiation health policy can result in serious errors, mainly exaggeration, in the determination of radiation risk. Approaches to radiation risk estimation are proposed that take into consideration the specifics of human exposure in space. PMID:11915752

  16. Structured Coupling of Probability Loss Distributions: Assessing Joint Flood Risk in Multiple River Basins.

    PubMed

    Timonina, Anna; Hochrainer-Stigler, Stefan; Pflug, Georg; Jongman, Brenden; Rojas, Rodrigo

    2015-11-01

    Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards. PMID:26010101
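
    One mechanical way to couple two basins' loss distributions is to draw dependent uniforms from a copula and push them through each basin's marginal loss quantile function. The sketch below uses a Clayton copula via conditional sampling purely as an illustration (the marginals are hypothetical); note that Clayton is lower-tail dependent, so for joint flood extremes an upper-tail-dependent family such as Gumbel, or the survival Clayton, would typically be the relevant choice:

```python
import random

def clayton_pair(theta, rng):
    """One (u, v) draw from a Clayton copula, theta > 0, by conditional sampling."""
    u, w = rng.random(), rng.random()
    v = (u ** -theta * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

rng = random.Random(0)
# Couple two basins: map the dependent uniforms through illustrative
# Pareto-like marginal quantile functions (scales are made up)
u, v = clayton_pair(5.0, rng)
loss_basin_a = 100.0 * (1.0 - u) ** -0.5
loss_basin_b = 80.0 * (1.0 - v) ** -0.5
```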

  17. Inferring rare disease risk variants based on exact probabilities of sharing by multiple affected relatives

    PubMed Central

    Bureau, Alexandre; Younkin, Samuel G.; Parker, Margaret M.; Bailey-Wilson, Joan E.; Marazita, Mary L.; Murray, Jeffrey C.; Mangold, Elisabeth; Albacha-Hejazi, Hasan; Beaty, Terri H.; Ruczinski, Ingo

    2014-01-01

    Motivation: Family-based designs are regaining popularity for genomic sequencing studies because they provide a way to test cosegregation with disease of variants that are too rare in the population to be tested individually in a conventional case–control study. Results: Where only a few affected subjects per family are sequenced, the probability that any variant would be shared by all affected relatives—given it occurred in any one family member—provides evidence against the null hypothesis of a complete absence of linkage and association. A P-value can be obtained as the sum of the probabilities of sharing events as (or more) extreme in one or more families. We generalize an existing closed-form expression for exact sharing probabilities to more than two relatives per family. When pedigree founders are related, we show that an approximation of sharing probabilities based on empirical estimates of kinship among founders obtained from genome-wide marker data is accurate for low levels of kinship. We also propose a more generally applicable approach based on Monte Carlo simulations. We applied this method to a study of 55 multiplex families with apparent non-syndromic forms of oral clefts from four distinct populations, with whole exome sequences available for two or three affected members per family. The rare single nucleotide variant rs149253049 in ADAMTS9 shared by affected relatives in three Indian families achieved significance after correcting for multiple comparisons (p=2×10−6). Availability and implementation: Source code and binaries of the R package RVsharing are freely available for download at http://cran.r-project.org/web/packages/RVsharing/index.html. Contact: alexandre.bureau@msp.ulaval.ca or ingo@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24740360

  18. Probability weighted moments compared with some traditional techniques in estimating Gumbel parameters and quantiles.

    USGS Publications Warehouse

    Landwehr, J.M.; Matalas, N.C.; Wallis, J.R.

    1979-01-01

    Results were derived from Monte Carlo experiments by using both independent and serially correlated Gumbel numbers. The method of probability weighted moments was seen to compare favourably with two other techniques. -Authors

  19. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2010-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.

  20. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford C.

    2011-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting gives a visual confirmation of the relationship between causes and their controls.
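
    A toy encoding of the three axes described above; the multiplicative index is an assumption made for illustration, since the papers present the Z-axis visually (the "tall pole") rather than as a formula:

```python
# Hazard Reduction Precedence Sequence as the Z-axis (1 = best controlled)
CONTROL_LEVELS = {
    1: "design elimination",
    2: "material substitution",
    3: "safety devices",
    4: "caution and warning devices",
    5: "administrative controls",
    6: "protective clothing/equipment",
}

def risk_index(likelihood, severity, control):
    """Illustrative scalar index over the three axes; higher = worse."""
    return likelihood * severity * control

# The most likely, most severe, least-well-controlled cell is the tall pole
worst = max(((l, s, c) for l in range(1, 6)
             for s in range(1, 6)
             for c in CONTROL_LEVELS),
            key=lambda cell: risk_index(*cell))
print(worst)  # → (5, 5, 6)
```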

  1. The over-estimation of risk in pregnancy.

    PubMed

    Robinson, Monique; Pennell, Craig E; McLean, Neil J; Oddy, Wendy H; Newnham, John P

    2011-06-01

    The concept of risk is especially salient to obstetric care. Unknown factors can still be responsible for peri-natal morbidity and mortality in circumstances that appeared to present little risk, while perfectly healthy infants are born in high-risk circumstances: a contradiction that patients and providers struggle with on a daily basis. With such contradictions comes the potential for the over-estimation of risk during pregnancy in order to assure a positive outcome. Understanding and addressing the estimation of risk during pregnancy requires acknowledging the history of obstetric risk in addition to understanding risk-related psychological theory. A relationship of trust between provider and patient is vital in addressing risk over-estimation, as is encouraging the development of self-efficacy in patients. Ultimately obstetric care is complex and efforts to avoid pre-natal risk exposure based on heightened perceptions of threat may do more harm than the perceived threat itself. PMID:21480770

  2. Statistics of Natural Populations. II. Estimating an Allele Probability in Families Descended from Cryptic Mothers

    PubMed Central

    Arnold, Jonathan; Morrison, Melvin L.

    1985-01-01

    In population studies, adults are frequently difficult or inconvenient to identify for genotype, but a family profile of genotypes can be obtained from an unidentified female crossed with a single unidentified male. The problem is to estimate an allele frequency in the cryptic parental gene pool from the observed family profiles. For example, a worker may wish to estimate inversion frequencies in Drosophila; inversion karyotypes are cryptic in adults but visible in salivary gland squashes from larvae. A simple mixture model, which assumes the Hardy-Weinberg law, Mendelian laws and a single randomly chosen mate per female, provides the vehicle for studying three competing estimators of an allele frequency. A simple, heuristically appealing estimator called the Dobzhansky estimator is compared with the maximum likelihood estimator and a close relative called the grouped profiles estimator. The Dobzhansky estimator is computationally simple, consistent and highly efficient and is recommended in practice over its competitors. PMID:17246258

  3. Objective estimates improve risk stratification for primary graft dysfunction after lung transplantation

    PubMed Central

    Shah, Rupal J.; Diamond, Joshua M.; Cantu, Edward; Flesch, Judd; Lee, James C.; Lederer, David J.; Lama, Vibha N.; Orens, Jonathon; Weinacker, Ann; Wilkes, David S.; Roe, David; Bhorade, Sangeeta; Wille, Keith M.; Ware, Lorraine B.; Palmer, Scott M.; Crespo, Maria; Demissie, Ejigayehu; Sonnet, Joshua; Shah, Ashish; Kawut, Steven M.; Bellamy, Scarlett L.; Localio, A. Russell; Christie, Jason D.

    2016-01-01

    Primary graft dysfunction (PGD) is a major cause of early mortality after lung transplant. We aimed to define objective estimates of PGD risk based on readily available clinical variables, using a prospective study of 11 centers in the Lung Transplant Outcomes Group (LTOG). Derivation included 1255 subjects from 2002–2010, with separate validation in 382 subjects accrued from 2011–2012. We used logistic regression to identify predictors of grade 3 PGD at 48/72 hours, and decision curve methods to assess impact on clinical decisions. 211/1255 subjects in the derivation and 56/382 subjects in the validation developed PGD. We developed 3 prediction models, in which low-risk recipients had a normal BMI (18.5–25 kg/m2), COPD/CF, and absent or mild PH (mPAP < 40 mmHg); all others were considered higher-risk. Low-risk recipients had a predicted PGD risk of 4–7%, and higher-risk recipients a predicted PGD risk of 15–18%. Adding a lung from a donor with a smoking history to a higher-risk recipient significantly increased PGD risk, although risk did not change in low-risk recipients. Validation demonstrated that probability estimates were generally accurate and that the models worked best at baseline PGD incidences between 5–25%. We conclude that valid estimates of PGD risk can be produced using readily available clinical variables. PMID:25877792

  4. Estimated radiation risks associated with endodontic radiography.

    PubMed

    Danforth, R A; Torabinejad, M

    1990-02-01

    Endodontic patients are sometimes concerned about the risks of tumors or cataracts from radiation exposure during root canal therapy. By using established dose and risk information, we calculated the extent of these risks. The chance of getting leukemia from an endodontic x-ray survey using 90 kVp was found to be 1 in 7.69 million, the same as the risk of dying from cancer from smoking 0.94 cigarettes or from an auto accident when driving 3.7 km. Risk of thyroid gland neoplasia was 1 in 667,000 (smoking 11.6 cigarettes, driving 45 km) and risk of salivary gland neoplasia 1 in 1.35 million (smoking 5.4 cigarettes, driving 21.1 km). Use of 70 kVp radiography reduced these risks only slightly. To receive the threshold dose to eyes to produce cataract changes, a patient would have to undergo 10,900 endodontic surveys. PMID:2390963

  5. Risk-taking in disorders of natural and drug rewards: neural correlates and effects of probability, valence, and magnitude.

    PubMed

    Voon, Valerie; Morris, Laurel S; Irvine, Michael A; Ruck, Christian; Worbe, Yulia; Derbyshire, Katherine; Rankov, Vladan; Schreiber, Liana Rn; Odlaug, Brian L; Harrison, Neil A; Wood, Jonathan; Robbins, Trevor W; Bullmore, Edward T; Grant, Jon E

    2015-03-01

    Pathological behaviors toward drugs and food rewards have underlying commonalities. Risk-taking has a fourfold pattern varying as a function of probability and valence leading to the nonlinearity of probability weighting with overweighting of small probabilities and underweighting of large probabilities. Here we assess these influences on risk-taking in patients with pathological behaviors toward drug and food rewards and examine structural neural correlates of nonlinearity of probability weighting in healthy volunteers. In the anticipation of rewards, subjects with binge eating disorder show greater risk-taking, similar to substance-use disorders. Methamphetamine-dependent subjects had greater nonlinearity of probability weighting along with impaired subjective discrimination of probability and reward magnitude. Ex-smokers also had lower risk-taking to rewards compared with non-smokers. In the anticipation of losses, obesity without binge eating had a similar pattern to other substance-use disorders. Obese subjects with binge eating also have impaired discrimination of subjective value similar to that of the methamphetamine-dependent subjects. Nonlinearity of probability weighting was associated with lower gray matter volume in dorsolateral and ventromedial prefrontal cortex and orbitofrontal cortex in healthy volunteers. Our findings support a distinct subtype of binge eating disorder in obesity with similarities in risk-taking in the reward domain to substance use disorders. The results dovetail with the current approach of defining mechanistically based dimensional approaches rather than categorical approaches to psychiatric disorders. The relationship to risk probability and valence may underlie the propensity toward pathological behaviors toward different types of rewards. PMID:25270821
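    The nonlinearity of probability weighting described above (overweighting small probabilities, underweighting large ones) is commonly captured by a one-parameter weighting function. A minimal sketch using the Prelec form; the parameter value is illustrative, not taken from this study:

```python
import math

def prelec_weight(p, alpha=0.65):
    """Prelec probability-weighting function w(p) = exp(-(-ln p) ** alpha).
    For alpha < 1 it overweights small probabilities and underweights large
    ones; alpha = 0.65 is an illustrative value, not an estimate from the study."""
    return math.exp(-((-math.log(p)) ** alpha))

# A rare outcome is weighted as if it were far more likely...
w_small = prelec_weight(0.01)
# ...while a near-certain outcome is discounted.
w_large = prelec_weight(0.99)
```

The function has a fixed point at p = 1/e, so weights pivot around that probability rather than around 0.5.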

  6. Risk-Taking in Disorders of Natural and Drug Rewards: Neural Correlates and Effects of Probability, Valence, and Magnitude

    PubMed Central

    Voon, Valerie; Morris, Laurel S; Irvine, Michael A; Ruck, Christian; Worbe, Yulia; Derbyshire, Katherine; Rankov, Vladan; Schreiber, Liana RN; Odlaug, Brian L; Harrison, Neil A; Wood, Jonathan; Robbins, Trevor W; Bullmore, Edward T; Grant, Jon E

    2015-01-01

    Pathological behaviors toward drugs and food rewards have underlying commonalities. Risk-taking has a fourfold pattern varying as a function of probability and valence leading to the nonlinearity of probability weighting with overweighting of small probabilities and underweighting of large probabilities. Here we assess these influences on risk-taking in patients with pathological behaviors toward drug and food rewards and examine structural neural correlates of nonlinearity of probability weighting in healthy volunteers. In the anticipation of rewards, subjects with binge eating disorder show greater risk-taking, similar to substance-use disorders. Methamphetamine-dependent subjects had greater nonlinearity of probability weighting along with impaired subjective discrimination of probability and reward magnitude. Ex-smokers also had lower risk-taking to rewards compared with non-smokers. In the anticipation of losses, obesity without binge eating had a similar pattern to other substance-use disorders. Obese subjects with binge eating also have impaired discrimination of subjective value similar to that of the methamphetamine-dependent subjects. Nonlinearity of probability weighting was associated with lower gray matter volume in dorsolateral and ventromedial prefrontal cortex and orbitofrontal cortex in healthy volunteers. Our findings support a distinct subtype of binge eating disorder in obesity with similarities in risk-taking in the reward domain to substance use disorders. The results dovetail with the current approach of defining mechanistically based dimensional approaches rather than categorical approaches to psychiatric disorders. The relationship to risk probability and valence may underlie the propensity toward pathological behaviors toward different types of rewards. PMID:25270821

  7. A model selection algorithm for a posteriori probability estimation with neural networks.

    PubMed

    Arribas, Juan Ignacio; Cid-Sueiro, Jesús

    2005-07-01

    This paper proposes a novel algorithm to jointly determine the structure and the parameters of a posteriori probability model based on neural networks (NNs). It makes use of well-known ideas of pruning, splitting, and merging neural components and takes advantage of the probabilistic interpretation of these components. The algorithm, so called a posteriori probability model selection (PPMS), is applied to an NN architecture called the generalized softmax perceptron (GSP) whose outputs can be understood as probabilities although results shown can be extended to more general network architectures. Learning rules are derived from the application of the expectation-maximization algorithm to the GSP-PPMS structure. Simulation results show the advantages of the proposed algorithm with respect to other schemes. PMID:16121722
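    As a minimal illustration of why such network outputs can be read as probabilities (this is a generic softmax layer, not the GSP/PPMS algorithm itself), a softmax maps raw scores to class probabilities that are nonnegative and sum to one:

```python
import math

def softmax(scores):
    """Map raw network outputs to a posteriori class probabilities."""
    m = max(scores)                      # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
```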

  8. Combining earthquakes and GPS data to estimate the probability of future earthquakes with magnitude Mw ≥ 6.0

    NASA Astrophysics Data System (ADS)

    Chen, K.-P.; Tsai, Y.-B.; Chang, W.-Y.

    2013-10-01

    Wyss et al. (2000) indicated that future main earthquakes can be expected along zones characterized by low b values. In this study we combine Benioff strain with global positioning system (GPS) data to estimate the probability of future Mw ≥ 6.0 earthquakes for a grid covering Taiwan. An approach similar to the maximum likelihood method was used to estimate the Gutenberg-Richter parameters a and b. The two parameters were then used to estimate the probability of future earthquakes of Mw ≥ 6.0 for each of the 391 grid cells (grid interval = 0.1°) covering Taiwan. The method shows a high probability of earthquakes in western Taiwan along a zone that extends from Taichung southward to Nantou, Chiayi, Tainan and Kaohsiung. In eastern Taiwan, there is also a high-probability zone extending from Ilan southward to Hualian and Taitung. These zones are characterized by high earthquake entropy, high maximum shear strain rates, and paths of low b values. A relation between entropy and maximum shear strain rate is also obtained; it indicates that the maximum shear strain rate is about 4.0 times the entropy. The results of this study should be of interest to city planners, especially those concerned with earthquake preparedness, and should help earthquake insurers draw up basic premiums.
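    The final step of such an analysis can be sketched as follows, assuming a Poisson occurrence model on top of the Gutenberg-Richter relation; the parameter values below are illustrative, not taken from this study:

```python
import math

def gr_annual_rate(a, b, mag):
    """Annual rate of events with magnitude >= mag, from the Gutenberg-Richter
    relation log10 N(M) = a - b*M."""
    return 10.0 ** (a - b * mag)

def prob_at_least_one(rate, years):
    """Poisson probability of at least one event in the given time window."""
    return 1.0 - math.exp(-rate * years)

# Illustrative values only: a = 4.0, b = 1.0 give 0.01 events/yr at Mw >= 6.0
rate = gr_annual_rate(4.0, 1.0, 6.0)
p_50yr = prob_at_least_one(rate, 50.0)
```

Lower b values raise the rate at large magnitudes, which is why low-b zones carry higher estimated probabilities.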

  9. How does new evidence change our estimates of probabilities? Carnap's formula revisited

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik; Quintana, Chris

    1992-01-01

    The formula originally proposed by R. Carnap in his analysis of induction is reviewed and its natural generalization is presented. A situation is considered where the probability of a certain event is determined without using standard statistical methods due to the lack of observation.

  10. Comparing the probability of stroke by the Framingham risk score in hypertensive Korean patients visiting private clinics and tertiary hospitals

    PubMed Central

    2010-01-01

    Background The purpose of this study was to investigate the pattern of distribution of risk factors for stroke and the 10-year probability of stroke by the Framingham risk score in hypertensive patients visiting private clinics vs. tertiary hospitals. Methods A total of 2,490 hypertensive patients who attended 61 private clinics (1,088 patients) and 37 tertiary hospitals (1,402 patients) were enrolled. The risk factors for stroke were evaluated using a series of laboratory tests and physical examinations, and the 10-year probability of stroke was determined by applying the Framingham stroke risk equation. Results The proportion of patients who had uncontrolled hypertension despite the use of antihypertensive agents was 49% (66 and 36% of patients cared for at private clinics and tertiary hospitals, respectively; p < 0.001). The average 10-year probability of stroke by the Framingham risk score in hypertensive patients was 21% (approximately 2.2 times the risk of stroke in the Korean Cancer Prevention Study [KCPS] cohort) and was higher in patients attending tertiary hospitals than in those attending private clinics (16 and 24%, respectively; p < 0.001). Conclusions Because the 10-year probability of stroke by the Framingham risk score was higher in hypertensive patients attending tertiary hospitals than in those attending private clinics, we suggest that more aggressive interventions are needed to prevent stroke, and to detect it early, in hypertensive patients attending tertiary hospitals. PMID:20822544
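    Framingham-type risk equations share a common proportional-hazards form. A generic sketch; the baseline survival and coefficient below are placeholders, not the published stroke equation:

```python
import math

def ten_year_risk(baseline_survival, coefs, values, means):
    """Generic Framingham-style risk: 1 - S0 ** exp(sum(b_i * (x_i - xbar_i))).
    baseline_survival (S0) is the 10-year event-free survival at the cohort
    mean; coefs/means here are hypothetical, not the published coefficients."""
    lp = sum(b * (x - m) for b, x, m in zip(coefs, values, means))
    return 1.0 - baseline_survival ** math.exp(lp)

# At the cohort mean the linear predictor is 0, so risk is simply 1 - S0
baseline_risk = ten_year_risk(0.9, [0.5], [10.0], [10.0])
```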

  11. Model approach to estimate the probability of accepting a lot of heterogeneously contaminated powdered food using different sampling strategies.

    PubMed

    Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo

    2014-08-01

    Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low levels of contamination, raises concern about the efficiency of current sampling plans for the detection of food-borne pathogens like Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (Habraken approach) was used to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot, and proportion of the lot contaminated. The simulated results suggest that a sample size of 100 g or more corresponds to the lowest number of samples to be tested in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested decreases greatly if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g, or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not affect the number of samples. The Habraken approach represents a useful tool for risk management in designing a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. However, it must be noted that although effective in detecting pathogens, these sampling plans are difficult to apply given the huge number of samples that need to be tested. Sampling does not seem an effective measure to control pathogens in powdered food. PMID:24462218
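    Under the assumptions described above (a Poisson-distributed cell count in the contaminated fraction of the lot), the probability of falsely accepting a contaminated lot can be sketched as follows; this is a generic acceptance-sampling calculation, not the paper's exact simulation:

```python
import math

def prob_accept(n_samples, sample_mass_g, conc_cfu_per_g, frac_contaminated):
    """Probability that all n samples test negative, i.e. that a contaminated
    lot is falsely accepted. A sample drawn from the contaminated fraction is
    negative if its Poisson-distributed cell count is zero."""
    p_negative = (1.0 - frac_contaminated) + frac_contaminated * math.exp(
        -conc_cfu_per_g * sample_mass_g)
    return p_negative ** n_samples

# Larger samples make false acceptance less likely at a fixed number of samples
p_10g = prob_accept(30, 10.0, 0.1, 0.01)
p_100g = prob_accept(30, 100.0, 0.1, 0.01)
```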

  12. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy. PMID:26605696
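    For context, the most widely used nearest-neighbor entropy estimator is the Kozachenko-Leonenko form. A minimal sketch of that generic estimator (not the exact rotational-translational method of the paper):

```python
import math
import random

def kl_entropy(points):
    """Kozachenko-Leonenko nearest-neighbor entropy estimate, in nats."""
    n, d = len(points), len(points[0])
    gamma = 0.5772156649015329                        # Euler-Mascheroni constant
    vd = math.pi ** (d / 2) / math.gamma(d / 2 + 1)   # volume of the unit d-ball
    log_r_sum = 0.0
    for i, p in enumerate(points):
        # distance to the nearest other sample
        r = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        log_r_sum += math.log(r)
    return d * log_r_sum / n + math.log(vd) + math.log(n - 1) + gamma

# Uniform samples on [0, 1] have differential entropy 0, so the estimate
# should be close to zero for a reasonably large sample
rng = random.Random(0)
sample = [[rng.random()] for _ in range(1000)]
estimate = kl_entropy(sample)
```

The resolution problem the abstract mentions is visible here: the estimate depends only on nearest-neighbor distances, not on a binning of the six-dimensional space.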

  13. Probabilities and statistics for backscatter estimates obtained by a scatterometer with applications to new scatterometer design data

    NASA Technical Reports Server (NTRS)

    Pierson, Willard J., Jr.

    1989-01-01

    The values of the Normalized Radar Backscattering Cross Section (NRCS), sigma (o), obtained by a scatterometer are random variables whose variance is a known function of the expected value. The probability density function can be obtained from the normal distribution. Models for the expected value obtain it as a function of the properties of the waves on the ocean and the winds that generated the waves. Point estimates of the expected value were found from various statistics given the parameters that define the probability density function for each value. Random intervals were derived with a preassigned probability of containing that value. A statistical test to determine whether or not successive values of sigma (o) are truly independent was derived. The maximum likelihood estimates for wind speed and direction were found, given a model for backscatter as a function of the properties of the waves on the ocean. These estimates are biased as a result of the terms in the equation that involve natural logarithms, and calculations of the point estimates of the maximum likelihood values are used to show that the contributions of the logarithmic terms are negligible and that the terms can be omitted.

  14. Use of portable antennas to estimate abundance of PIT-tagged fish in small streams: Factors affecting detection probability

    USGS Publications Warehouse

    O'Donnell, Matthew J.; Horton, Gregg E.; Letcher, Benjamin H.

    2010-01-01

    Portable passive integrated transponder (PIT) tag antenna systems can be valuable in providing reliable estimates of the abundance of tagged Atlantic salmon Salmo salar in small streams under a wide range of conditions. We developed and employed PIT tag antenna wand techniques in two controlled experiments and an additional case study to examine the factors that influenced our ability to estimate population size. We used Pollock's robust-design capture–mark–recapture model to obtain estimates of the probability of first detection (p), the probability of redetection (c), and abundance (N) in the two controlled experiments. First, we conducted an experiment in which tags were hidden in fixed locations. Although p and c varied among the three observers and among the three passes that each observer conducted, the estimates of N were identical to the true values and did not vary among observers. In the second experiment using free-swimming tagged fish, p and c varied among passes and time of day. Additionally, estimates of N varied between day and night and among age-classes but were within 10% of the true population size. In the case study, we used the Cormack–Jolly–Seber model to examine the variation in p, and we compared counts of tagged fish found with the antenna wand with counts collected via electrofishing. In that study, we found that although p varied for age-classes, sample dates, and time of day, antenna and electrofishing estimates of N were similar, indicating that population size can be reliably estimated via PIT tag antenna wands. However, factors such as the observer, time of day, age of fish, and stream discharge can influence the initial and subsequent detection probabilities.
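    A much simpler relative of the capture-mark-recapture models used in the study is the two-pass Chapman-modified Lincoln-Petersen estimator; a sketch for illustration only (the study itself used Pollock's robust design and Cormack-Jolly-Seber models):

```python
def chapman_estimate(n1, n2, m2):
    """Chapman-modified Lincoln-Petersen abundance estimator for a two-pass
    survey: n1 animals detected on pass 1, n2 on pass 2, m2 on both passes."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical counts: half of the tags detected on each pass
n_hat = chapman_estimate(50, 50, 25)
```

The multi-pass models in the paper generalize this idea by letting the detection probabilities p and c vary with observer, pass, and time of day.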

  15. Comparing Two Different Methods to Evaluate the Covariance Matrix of Debris Orbit State in Collision Probability Estimation

    NASA Astrophysics Data System (ADS)

    Cheng, Haowen; Liu, Jing; Xu, Yang

    The evaluation of the covariance matrix is an inevitable step when estimating collision probability. Generally, there are two different methods to compute the covariance matrix. One is the so-called Tracking-Delta-Fitting method, first introduced for estimating collision probability using TLE catalogue data, in which the covariance matrix is evaluated by fitting series of differences between propagated orbits from former data and updated orbit data. In the second method, the covariance matrix is evaluated in the process of orbit determination. Both methods have their difficulties when applied to collision probability estimation. In the first method, the covariance matrix is evaluated based only on historical orbit data, ignoring information from the latest orbit determination; as a result, the accuracy of the method depends strongly on the stability of the covariance matrix of the latest updated orbit. In the second method, the evaluation of the covariance matrix is acceptable when the determined orbit satisfies the weighted-least-squares estimation, which depends on the accuracy of the observation error covariance; this is hard to obtain in real applications, and in our research it is evaluated by analyzing the residuals of orbit determination. In this paper we provide numerical tests to compare these two methods. A simulation of cataloguing objects in the LEO, MEO and GEO regions has been carried out for a time span of 3 months. The influence of orbit maneuvers has been included in the GEO cataloguing simulation. For LEO cataloguing, the effect of atmospheric density variation has also been considered. At the end of the paper the accuracies of the evaluated covariance matrices and the estimated collision probabilities are tested and compared.

  16. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes

    NASA Astrophysics Data System (ADS)

    de Gregorio, Sofia; Camarda, Marco

    2016-07-01

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger eruptive activity. The greater the surplus of magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally carries the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing temporal series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.

  17. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes

    PubMed Central

    De Gregorio, Sofia; Camarda, Marco

    2016-01-01

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger eruptive activity. The greater the surplus of magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally carries the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing temporal series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years. PMID:27456812

  18. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    PubMed

    De Gregorio, Sofia; Camarda, Marco

    2016-01-01

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger eruptive activity. The greater the surplus of magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally carries the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing temporal series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years. PMID:27456812

  19. Estimating the probability distribution of von Mises stress for structures undergoing random excitation. Part 1: Derivation

    SciTech Connect

    Segalman, D.; Reese, G.

    1998-09-01

    The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. For deterministic loads, both static and dynamic, the calculation of von Mises stress is straightforward, as is the resulting calculation of reliability. For loads modeled as random processes, the task is different; the response to such loads is itself a random process and its properties must be determined in terms of those of both the loads and the system. This has been done in the past by Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the problem. Here, the authors present a method that provides analytic expressions for the probability distributions of von Mises stress which can be evaluated efficiently and with good precision numerically. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution.
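    For reference, the deterministic von Mises stress itself is a simple function of the six Cauchy stress components; the random-vibration problem the paper addresses arises when these components are themselves random processes:

```python
import math

def von_mises(sxx, syy, szz, sxy, syz, szx):
    """Von Mises equivalent stress from the six Cauchy stress components."""
    return math.sqrt(
        0.5 * ((sxx - syy) ** 2 + (syy - szz) ** 2 + (szz - sxx) ** 2)
        + 3.0 * (sxy ** 2 + syz ** 2 + szx ** 2))
```

Under uniaxial loading the von Mises stress equals the applied stress, and under pure shear it is sqrt(3) times the shear stress.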

  20. Preoperative Evaluation: Estimation of Pulmonary Risk.

    PubMed

    Lakshminarasimhachar, Anand; Smetana, Gerald W

    2016-03-01

    Postoperative pulmonary complications (PPCs) are common after major non-thoracic surgery and associated with significant morbidity and high cost of care. A number of risk factors are strong predictors of PPCs. The overall goal of the preoperative pulmonary evaluation is to identify these potential, patient and procedure-related risks and optimize the health of the patients before surgery. A thorough clinical examination supported by appropriate laboratory tests will help guide the clinician to provide optimal perioperative care. PMID:26927740

  1. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Part 2, Human Error Probability (HEP) estimates: Data manual

    SciTech Connect

    Gertman, D.I.; Gilbert, B.G.; Gilmore, W.E.; Galyean, W.J.

    1988-06-01

    This volume of a five-volume series summarizes those data currently resident in the first releases of the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) data base. The raw human error probability (HEP) contained herein is accompanied by a glossary of terms and the HEP and hardware taxonomies used to structure the data. Instructions are presented on how the user may navigate through the NUCLARR data management system to find anchor values to assist in solving risk-related problems.

  2. INCLUDING TRANSITION PROBABILITIES IN NEST SURVIVAL ESTIMATION: A MAYFIELD MARKOV CHAIN

    EPA Science Inventory

    This manuscript is primarily an exploration of the statistical properties of nest-survival estimates for terrestrial songbirds. The Mayfield formulation described herein should allow researchers to test for complicated effects of stressors on daily survival and overall success, i...

  3. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first ranked flood event 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 microphysics, atmospheric boundary layer, and cumulus parameterization schemes combinations. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. 
Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the

  4. A H-infinity Fault Detection and Diagnosis Scheme for Discrete Nonlinear System Using Output Probability Density Estimation

    SciTech Connect

    Zhang Yumin; Lum, Kai-Yew; Wang Qingguo

    2009-03-05

    In this paper, an H-infinity fault detection and diagnosis (FDD) scheme for a class of discrete nonlinear systems, based on output probability density estimation, is presented. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process, and its square-root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model that includes nonlinearities and uncertainties. A weighted mean value is defined as an integral of the square-root PDF along the space direction, which yields a function of time only that can be used to construct the residual signal. Thus, classical nonlinear filter approaches can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is then developed to estimate the fault. A simulation example demonstrates the effectiveness of the proposed approaches.

  5. A Method to Estimate the Probability that Any Individual Cloud-to-Ground Lightning Stroke was Within Any Radius of Any Point

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa; Roeder, William P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. The process uses the bivariate Gaussian probability density provided by the lightning location error ellipse for the most likely location of a stroke and integrates it to determine the probability that the stroke was inside any specified radius of any location, even if that location is not centered on, or even within, the error ellipse. The technique is adapted from a method of calculating the probability of debris collision with spacecraft. It is important in spaceport processing activities because it allows engineers to quantify the risk of induced-current damage to critical electronics from nearby lightning strokes. The technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
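
    The core computation described above — integrating a bivariate Gaussian density over a circle that need not be centered on the error ellipse — can be sketched with a simple Monte Carlo estimate. The function name and all numerical values below are illustrative, not drawn from the reference.

```python
import math
import random

def prob_within_radius(mu, sigma, rho, point, radius, n=200_000, seed=42):
    """Monte Carlo estimate of P(stroke within `radius` of `point`),
    where the stroke location follows a bivariate Gaussian with mean
    `mu`, standard deviations `sigma`, and correlation `rho`
    (i.e. the lightning location error ellipse)."""
    rng = random.Random(seed)
    (mx, my), (sx, sy) = mu, sigma
    hits = 0
    for _ in range(n):
        # Sample a correlated Gaussian pair via Cholesky factorization.
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x = mx + sx * z1
        y = my + sy * (rho * z1 + math.sqrt(1 - rho**2) * z2)
        if (x - point[0])**2 + (y - point[1])**2 <= radius**2:
            hits += 1
    return hits / n

# Stroke ellipse centered at the origin; point of interest 0.5 km away,
# with a 1 km radius of concern (units and values hypothetical).
p = prob_within_radius((0.0, 0.0), (0.5, 0.3), 0.2, (0.5, 0.0), 1.0)
```

    Note that the point of interest here lies off the ellipse center, which is exactly the case the closed-form circular-normal tables do not cover.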

  6. Estimation of the Optimal Statistical Quality Control Sampling Time Intervals Using a Residual Risk Measure

    PubMed Central

    Hatjimihail, Aristides T.

    2009-01-01

    Background An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system and estimation of the risk caused by measurement error. Methodology/Principal Findings Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes, and their combinations, risk functions can be defined based on the mean time spent in each state, the associated measurement error, and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure, and the state mean time matrices. The optimal sampling time intervals can then be defined as those that minimize a QC-related cost measure while keeping the rr acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals. Conclusions/Significance It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system to sustain an acceptable residual risk with the minimum QC-related cost. The optimization requires reliability analysis of the analytical system and risk analysis of the measurement error. PMID:19513124
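
    The dependence of the residual risk on state probability vectors and transition matrices can be illustrated with a toy two-state Markov sketch. All states, risk weights, and transition probabilities below are hypothetical placeholders, not values from the paper.

```python
def propagate(p, P):
    """One-step Markov update: p' = p @ P (pure-Python matrix-vector)."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

def residual_risk(p0, P, risk_weight, steps):
    """Accumulate risk over `steps` intervals between QC applications:
    each step contributes the state probabilities weighted by a
    per-state risk (e.g. the chance of a reportable error)."""
    p, rr = list(p0), 0.0
    for _ in range(steps):
        p = propagate(p, P)
        rr += sum(pi * wi for pi, wi in zip(p, risk_weight))
    return rr

# Two states: 0 = reliable, 1 = critical-failure mode (hypothetical numbers);
# an undetected failure persists until the next QC event.
P = [[0.99, 0.01],
     [0.00, 1.00]]
rr_short = residual_risk([1.0, 0.0], P, [0.0, 1.0], steps=5)
rr_long = residual_risk([1.0, 0.0], P, [0.0, 1.0], steps=20)
```

    Longer intervals between QC applications accumulate more residual risk, which is the trade-off the paper's optimization balances against QC cost.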

  7. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751

  8. Estimated soil ingestion rates for use in risk assessment

    SciTech Connect

    LaGoy, P.K.

    1987-09-01

    Assessing the risks to human health posed by contaminants present in soil requires an estimate of likely soil ingestion rates. In the past, direct measurements of soil ingestion were not available and risk assessors were forced to estimate soil ingestion rates based on observations of mouthing behavior and measurements of soil on hands. Recently, empirical data on soil ingestion rates have become available from two sources. Although preliminary, these data can be used to derive better estimates of soil ingestion rates for use in risk assessments. Estimates of average soil ingestion rates derived in this paper range from 25 to 100 mg/day, depending on the age of the individual at risk. Maximum soil ingestion rates that are unlikely to underestimate exposure range from 100 to 500 mg. A value of 5000 mg/day is considered a reasonable estimate of a maximum single-day exposure for a child with habitual pica. 12 references.

  9. Known and probable risk factors for hepatitis C infection: A case series in north-eastern Poland

    PubMed Central

    Chlabicz, Sławomir; Flisiak, Robert; Grzeszczuk, Anna; Kovalchuk, Oksana; Prokopowicz, Danuta; Chyczewski, Lech

    2006-01-01

    AIM: To describe the risk profile of patients in hospital with hepatitis C virus (HCV) infection in Poland. METHOD: Using a structured questionnaire, all patients with confirmed HCV infection were interviewed about risk factors. RESULTS: Among the 250 patients studied, transfusion before 1993 was the primary risk factor in 26%, intravenous drug use in 9%, and occupational exposure in health care in 9%. Women were more likely to have a history of occupational exposure or transfusion before 1993 and less likely to have undergone minor surgery. Known nosocomial risk factors (transfusion before 1993, dialysis) were responsible for 27% of infections and probable nosocomial factors (transfusions after 1992, minor surgery) for 14%, and a further 9% were occupationally acquired infections. CONCLUSION: Careful history-taking can identify a known or probable risk factor for HCV acquisition in 59% of patients with HCV infection. Preventive activities in Poland should focus on infection control measures in health-care settings. PMID:16440435

  10. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2015-06-01

    Riverbank erosion affects river morphology and local habitat and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of vulnerable areas in order to predict river changes and assist stream management/restoration. One approach to predicting areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a combined deterministic and statistical methodology is proposed to predict the probability of the presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for evaluating the performance of the statistical tool. The statistical tool is based on a series of independent local variables and employs the logistic regression methodology. It is developed in two forms, logistic regression and locally weighted logistic regression, both of which deliver useful and accurate results. The second form, though, provides the most accurate results, as it validates the presence or absence of erosion at all validation locations. The proposed methodology is easy to use and accurate and can be applied to any region and river.

  11. Development of a statistical tool for the estimation of riverbank erosion probability

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Giannakis, G. V.; Lilli, M. A.; Ioannidou, E.; Nikolaidis, N. P.; Karatzas, G. P.

    2016-01-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, a statistical methodology is proposed to predict the probability of the presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the logistic regression methodology. It is developed in two forms, logistic regression and locally weighted logistic regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use and accurate and can be applied to any region and river.
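
    The logistic regression step described in both versions of this work can be illustrated with a minimal pure-Python fit. The single predictor, the toy data, and the learning-rate settings are invented for illustration; the authors' tool uses a series of geomorphological and hydrological variables.

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit P(erosion = 1 | x) = 1 / (1 + exp(-(b0 + b.x)))
    by batch gradient descent on the log-loss."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)                      # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, x):
    """Predicted erosion probability for a feature vector x."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: one standardized predictor (e.g. bank shear stress);
# erosion (y = 1) observed only where the predictor is high.
X = [[-1.5], [-1.0], [-0.5], [0.0], [0.5], [1.0], [1.5]]
y = [0, 0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
```

    The locally weighted variant the authors favour refits this model around each prediction point with distance-based weights, which is what lets it adapt to local conditions along the river.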

  12. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for the cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.

  13. EVALUATING PROBABILITY SAMPLING STRATEGIES FOR ESTIMATING REDD COUNTS: AN EXAMPLE WITH CHINOOK SALMON (Oncorhynchus tshawytscha)

    EPA Science Inventory

    Precise, unbiased estimates of population size are an essential tool for fisheries management. For a wide variety of salmonid fishes, redd counts from a sample of reaches are commonly used to monitor annual trends in abundance. Using a 9-year time series of georeferenced censuses...

  14. [Epidemiological data and radiation risk estimates].

    PubMed

    Cardis, E

    2002-01-01

    The results of several major epidemiology studies on populations with particular exposure to ionizing radiation should become available during the first years of the 21st century. These studies are expected to provide answers to a number of questions concerning public health and radiation protection. Most of the populations concerned were accidentally exposed to radiation, in the former USSR or elsewhere, or were exposed in a nuclear industrial context. The results will complete and test risk information derived from studies of survivors of the Hiroshima and Nagasaki atomic bombs, particularly regarding the effects of low-dose and prolonged low-dose exposure, of different types of radiation, and of environmental and host-related factors that could modify the risk of radiation-induced effects. These studies are thus important for assessing the currently accepted scientific basis of radiation protection for workers and the general population. In addition, supplementary information on radiation protection could be provided by formal comparisons and analyses combining data from populations with different types of exposure. Finally, in order to provide pertinent information for public health and radiation protection, future epidemiology studies should be targeted and designed to answer specific questions concerning, for example, the risk for specific populations (children, patients, people with genetic predisposition). An integrated approach, combining epidemiology and studies of the mechanisms of radiation induction, should provide particularly pertinent information. PMID:11938114

  15. Estimation of the probability of exposure to metalworking fluids in a population-based case-control study

    PubMed Central

    Park, Dong-Uk; Colt, Joanne S.; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R.; Armenti, Karla R.; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe here an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subjects' reports on the frequency of machining and the use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports were also used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates were developed using both cases and controls, and using controls only. The prevalence of machining varied substantially across job groups (10-90%), with the greatest percentage of machining jobs reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference between reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and US production levels by decade found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to the three types of MWFs than approaches applied in the past. PMID:25256317

  16. Soil-ecological risks for soil degradation estimation

    NASA Astrophysics Data System (ADS)

    Trifonova, Tatiana; Shirkin, Leonid; Kust, German; Andreeva, Olga

    2016-04-01

    Soil degradation comprises the worsening of soil properties and quality, primarily from the point of view of productivity and the decline in the quality of ecosystem services. Complete destruction of the soil cover and/or termination of the functioning of soil-dwelling organic life are considered extreme stages of soil degradation, and for fragile ecosystems they are normally considered within the framework of the desertification, land degradation and drought (DLDD) concept. A block model of the ecotoxic effects generating soil and ecosystem degradation has been developed as a result of long-term field and laboratory research on sod-podzolic soils contaminated with waste containing heavy metals. The model highlights soil degradation mechanisms caused by the direct and indirect impact of ecotoxicants on the phytocenosis-soil system and by their combination, which frequently produces a synergistic effect. The sequence of changes can be formalized as a theory of change (a succession of interrelated events). Several stages are distinguished, from the leaching (release) of heavy metals from waste and their migration down the soil profile to decreased phytoproductivity and changes in phytocenosis composition. Decreased phytoproductivity reduces the amount of cellulose introduced into the soil. The described feedback mechanism acts as a factor in the self-purification and stability of sod-podzolic soil. It has been shown that, using a phytomass productivity index that integrally reflects the worsening of the complex of soil properties, it is possible to construct dose-response relationships and determine critical load levels for the phytocenosis and the corresponding soil-ecological risks. Soil-ecological risk in the phytocenosis-soil system means probable negative changes and the loss of some ecosystem functions during the transformation of dead organic matter energy into new biomass. Soil-ecological risk estimation is

  17. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including ISS, a lunar station, a deep space outpost, and Mars missions of duration 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of the factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction from the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
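
    The Monte Carlo propagation of subjective uncertainties through a product of factors can be sketched as follows. The point estimate and the geometric standard deviations below are invented placeholders, not the actual factor uncertainties used in this work.

```python
import math
import random

def simulate_risk(n=50_000, seed=1):
    """Risk = point estimate times a product of uncertain factors;
    each factor carries a subjective lognormal uncertainty described
    by a geometric standard deviation (GSD). Values are illustrative."""
    rng = random.Random(seed)
    point_estimate = 0.03          # hypothetical point projection of risk
    gsds = [1.5, 1.8, 1.3, 2.0]    # hypothetical GSDs of the factors
    risks = []
    for _ in range(n):
        r = point_estimate
        for gsd in gsds:
            # Lognormal multiplier with median 1 and the given GSD.
            r *= math.exp(rng.gauss(0.0, math.log(gsd)))
        risks.append(r)
    risks.sort()
    median = risks[n // 2]
    upper95 = risks[int(0.95 * n)]
    return median, upper95

median, upper95 = simulate_risk()
```

    Because each multiplier has median 1, the median of the simulated distribution stays near the point estimate, while the upper confidence limit reveals how much the combined uncertainties widen the projection — the effect that masks shielding-optimization gains for GCR.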

  18. Three-dimensional heart dose reconstruction to estimate normal tissue complication probability after breast irradiation using portal dosimetry

    SciTech Connect

    Louwe, R. J. W.; Wendling, M.; Herk, M. B. van; Mijnheer, B. J.

    2007-04-15

    Irradiation of the heart is one of the major concerns during radiotherapy of breast cancer. Three-dimensional (3D) treatment planning would therefore be useful but cannot always be performed for left-sided breast treatments, because CT data may not be available. However, even if 3D dose calculations are available and an estimate of the normal tissue damage can be made, uncertainties in patient positioning may significantly influence the heart dose during treatment. Therefore, 3D reconstruction of the actual heart dose during breast cancer treatment using electronic portal imaging device (EPID) dosimetry has been investigated. A previously described method to reconstruct the dose in the patient from treatment portal images at the radiological midsurface was used in combination with a simple geometrical model of the irradiated heart volume to enable calculation of dose-volume histograms (DVHs) and thereby independently verify this aspect of the treatment without using 3D data from a planning CT scan. To investigate the accuracy of the method, the DVHs obtained with full 3D treatment planning system (TPS) calculations were compared with those obtained after resampling the TPS dose at the radiological midsurface for fifteen breast cancer patients for whom CT data were available. In addition, EPID dosimetry as well as 3D dose calculations using our TPS, film dosimetry, and ionization chamber measurements were performed in an anthropomorphic phantom. It was found that the dose reconstructed using EPID dosimetry and the dose calculated with the TPS agreed within 1.5% in the lung/heart region. The dose-volume histograms obtained with EPID dosimetry were used to estimate the normal tissue complication probability (NTCP) for late excess cardiac mortality. Although the accuracy of these NTCP calculations might be limited by the uncertainty in the NTCP model, in combination with our portal dosimetry approach it allows incorporation of the actual heart dose. 
For the anthropomorphic

  19. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point-process model can be described by the steady rise of a state variable from the ground state to a failure threshold, as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ~2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
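
    The BPT density and its hazard function can be evaluated directly, using the fact that the BPT distribution coincides with an inverse Gaussian distribution with shape parameter λ = μ/α². The 25-year mean recurrence below is an arbitrary example, but the quoted properties of the hazard (exceeding the mean rate 1/μ beyond μ/2, and leveling near 1/(2μα²) = 2/μ for α = 0.5) can be checked numerically.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_pdf(t, mu, alpha):
    """BPT density with mean mu and aperiodicity alpha."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
           math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha):
    """BPT CDF via the inverse Gaussian form with lam = mu / alpha**2."""
    lam = mu / alpha**2
    a = math.sqrt(lam / t)
    return phi(a * (t / mu - 1.0)) + \
           math.exp(2.0 * lam / mu) * phi(-a * (t / mu + 1.0))

def bpt_hazard(t, mu, alpha):
    """Instantaneous failure rate of survivors, h(t) = f(t) / (1 - F(t))."""
    return bpt_pdf(t, mu, alpha) / (1.0 - bpt_cdf(t, mu, alpha))

mu, alpha = 25.0, 0.5   # e.g. a 25-year mean recurrence, generic aperiodicity
```

    With these definitions, `bpt_hazard(15.0, mu, alpha)` already exceeds the mean rate 1/μ = 0.04, and at t = 4μ the hazard sits near the asymptotic level 2/μ = 0.08.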

  20. Student Estimates of Probability and Uncertainty in Advanced Laboratory and Statistical Physics Courses

    NASA Astrophysics Data System (ADS)

    Mountcastle, Donald B.; Bucy, Brandon R.; Thompson, John R.

    2007-11-01

    Equilibrium properties of macroscopic systems are highly predictable as n, the number of particles, approaches and exceeds Avogadro's number; theories of statistical physics depend on these results. Typical pedagogical devices used in statistical physics textbooks to introduce entropy (S) and multiplicity (ω) (where S = k ln(ω)) include flipping coins and/or other equivalent binary events, repeated n times. Prior to instruction, our statistical mechanics students usually gave reasonable answers about the probabilities, but not the relative uncertainties, of the predicted outcomes of such events. However, they reliably predicted that the uncertainty in a measured continuous quantity (e.g., the amount of rainfall) decreases as the number of measurements increases. Typical textbook presentations assume that students understand that the relative uncertainty of binary outcomes will similarly decrease as the number of events increases. This is at odds with our findings, even though most of our students had previously completed mathematics courses in statistics, as well as an advanced electronics laboratory course that included statistical analysis of distributions of dart scores as n increased.

  1. Estimating the probability of demonstrating vaccine efficacy in the declining Ebola epidemic: a Bayesian modelling approach

    PubMed Central

    Funk, Sebastian; Watson, Conall H; Kucharski, Adam J; Edmunds, W John

    2015-01-01

    Objectives We investigate the chance of demonstrating Ebola vaccine efficacy in an individually randomised controlled trial implemented in the declining epidemic of Forécariah prefecture, Guinea. Methods We extend a previously published dynamic transmission model to include a simulated individually randomised controlled trial of 100 000 participants. Using Bayesian methods, we fit the model to Ebola case incidence before a trial and forecast the expected dynamics until disease elimination. We simulate trials under these forecasts and test potential start dates and rollout schemes to assess power to detect efficacy, and bias in vaccine efficacy estimates that may be introduced. Results Under realistic assumptions, we found that a trial of 100 000 participants starting after 1 August had less than 5% chance of having enough cases to detect vaccine efficacy. In particular, gradual recruitment precludes detection of vaccine efficacy because the epidemic is likely to go extinct before enough participants are recruited. Exclusion of early cases in either arm of the trial creates bias in vaccine efficacy estimates. Conclusions The very low Ebola virus disease incidence in Forécariah prefecture means any individually randomised controlled trial implemented there is unlikely to be successful, unless there is a substantial increase in the number of cases. PMID:26671958
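
    The qualitative conclusion — that accrued case counts, not enrollment, limit the power of a trial in a declining epidemic — can be reproduced with a highly simplified simulation. The attack rates, the minimum-case threshold, and the 90% efficacy figure are illustrative only; the paper's analysis rests on a fitted dynamic transmission model, not this sketch.

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def trial_case_power(n_per_arm, attack_rate, efficacy,
                     min_cases=20, sims=2000, seed=7):
    """Fraction of simulated trials that accrue at least `min_cases`
    total cases -- a crude proxy for being able to evaluate efficacy."""
    rng = random.Random(seed)
    enough = 0
    for _ in range(sims):
        control = poisson(rng, n_per_arm * attack_rate)
        vaccinated = poisson(rng, n_per_arm * attack_rate * (1.0 - efficacy))
        if control + vaccinated >= min_cases:
            enough += 1
    return enough / sims

low = trial_case_power(50_000, 1e-4, 0.9)    # declining epidemic: ~5 expected cases
high = trial_case_power(50_000, 1e-3, 0.9)   # tenfold higher incidence: ~55 cases
```

    Even with 50 000 participants per arm, the low-incidence scenario almost never yields enough cases, mirroring the paper's finding that the trial is unlikely to succeed without a substantial increase in incidence.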

  2. Probabilistic methodology for estimating radiation-induced cancer risk

    SciTech Connect

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code accepts as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
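
    One elementary piece of such a calculation — converting an expected number of induced cancer deaths into the probability of one or more deaths — can be sketched as follows. RICRAC itself is far more detailed (age structure, competing risks); the linear risk coefficient here is a placeholder, not a value from the code.

```python
import math

def expected_deaths(population, dose_sv, risk_per_sv=0.05):
    """Expected radiation-induced cancer deaths under a simple linear
    model (the risk coefficient per sievert is illustrative only)."""
    return population * dose_sv * risk_per_sv

def prob_at_least_one(expected):
    """If induced cancers in the cohort are Poisson with mean E[N],
    then P(one or more deaths) = 1 - exp(-E[N])."""
    return 1.0 - math.exp(-expected)

e = expected_deaths(10_000, 0.01)   # 10,000 people at 10 mSv each -> E[N] = 5
p = prob_at_least_one(e)
```

    The same Poisson assumption also yields the full frequency distribution of the number of cancers that the abstract mentions, via P(N = k) = e^(-E[N]) E[N]^k / k!.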

  3. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    PubMed

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although an MRL, like the EPA reference values (RfD and RfC), is defined as a level that corresponds to "negligible risk," these values represent daily exposure doses or concentrations, not risks. We present a new approach to calculating the risk of exposure at specific doses for chemical mixtures; its central assumption is to assign a de minimis risk at the MRL. The assigned risk enables the estimation of the parameters of an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure down to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows calculation of the risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable to point-of-departure data derived from quantal data, such as benchmark dose analyses, or from data that can be transformed into probabilities, such as the lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies. PMID:25667015
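
    A minimal sketch of the anchoring idea, assuming a one-parameter exponential dose-response 1 − exp(−b·d) pinned to a de minimis risk at the MRL, and independent-action combination across chemicals. The MRLs and doses below are invented, and the paper's exponential model with shape parameter k is richer than this single-parameter form.

```python
import math

def hazard_coefficient(mrl, risk_at_mrl=1e-6):
    """Solve 1 - exp(-b * d) = risk_at_mrl at d = MRL for b."""
    return -math.log(1.0 - risk_at_mrl) / mrl

def risk(dose, b):
    """Exponential dose-response anchored by the coefficient b."""
    return 1.0 - math.exp(-b * dose)

def combined_risk(doses, bs):
    """Independent-action combination: 1 - prod(1 - R_i)."""
    prod = 1.0
    for d, b in zip(doses, bs):
        prod *= 1.0 - risk(d, b)
    return 1.0 - prod

# Two hypothetical chemicals with MRLs of 0.3 and 2.0 mg/kg/day.
b1, b2 = hazard_coefficient(0.3), hazard_coefficient(2.0)
r_mix = combined_risk([0.6, 1.0], [b1, b2])          # community exposure
ratio = r_mix / combined_risk([0.3, 2.0], [b1, b2])  # pseudo risk ratio vs. MRL-level exposure
```

    Dividing the mixture risk by the risk at MRL-level exposure gives a unitless pseudo risk ratio of the kind the abstract proposes for comparing contamination control strategies.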

  4. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.

  5. Uncertainties in estimates of the risks of late effects from space radiation.

    PubMed

    Cucinotta, F A; Schimmerling, W; Wilson, J W; Peterson, L E; Saganti, P B; Dicello, J F

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. PMID:15881779

  6. ITER- International Toxicity Estimates for Risk, new TOXNET database.

    PubMed

    Tomasulo, Patricia

    2005-01-01

    ITER, the International Toxicity Estimates for Risk database, joined the TOXNET system in the winter of 2004. ITER features international comparisons of environmental health risk assessment information and contains over 620 chemical records. ITER includes data from the EPA, Health Canada, the National Institute of Public Health and the Environment of the Netherlands, and other organizations that provide risk values that have been peer-reviewed. PMID:15760833

  7. Local Sea Level Changes: Assessing and Accounting for the Risk Associated With the Low-Probability, High-Risk Tail of the Risk Spectrum

    NASA Astrophysics Data System (ADS)

    Plag, H. P.

    2014-12-01

    Stakeholders in the coastal zone, particularly the urban coasts, are turning to science to get information on future Local Sea Level (LSL) rise. Many scientists and scientific committees respond to this request with a range of plausible trajectories (RPT) defined by a number of possible trajectories, each corresponding to a certain scenario. Often these assessments take as their starting point the small number of global sea level trajectories provided by the IPCC. This approach is inherently deterministic. The resulting RPT, which can be quite large, is considered to reflect "uncertainty in LSL projections." Non-scientists often use the RPT to select a preferred and much narrower sub-RPT, for which they plan, or they use the "large uncertainty" to justify not taking any measures. In response to societal needs, science focuses on a reduction of the uncertainties through improved deterministic models. This approach has a number of problems: (1) The complexity of LSL as the outcome of many local, regional and global earth system processes, including anthropogenic processes, renders a deterministic approach to prediction invalid. (2) Most assessments of the RPT account for an incomplete set of relevant earth system processes, and for each process make assumptions that (often arbitrarily) constrain the contribution from this process. (3) LSL is an inherently probabilistic variable that has a broad probability density function (PDF), with a complex dependency of this PDF on the PDFs of the many contributing processes. In particular, the contribution from the large ice sheets has a PDF with low-probability, high-impact tails that are generally neglected in deterministic LSL projections and in the sub-RPT used for coastal planning. A fully probabilistic assessment of the risk associated with LSL rise indicates that the standard deterministic assessment neglects not only most of the low-probability, high-risk tail of the PDF but also medium-probability, high-risk parts. This…

  8. Probability Discounting of Gains and Losses: Implications for Risk Attitudes and Impulsivity

    ERIC Educational Resources Information Center

    Shead, N. Will; Hodgins, David C.

    2009-01-01

    Sixty college students performed three discounting tasks: probability discounting of gains, probability discounting of losses, and delay discounting of gains. Each task used an adjusting-amount procedure, and participants' choices affected the amount and timing of their remuneration for participating. Both group and individual discounting…

  9. Sensitivity of health risk estimates to air quality adjustment procedure

    SciTech Connect

    Whitfield, R.G.

    1997-06-30

    This letter is a summary of risk results associated with exposure estimates using two-parameter Weibull and quadratic air quality adjustment procedures (AQAPs). New exposure estimates were developed for children and child-occurrences, six urban areas, and five alternative air quality scenarios. In all cases, the Weibull and quadratic results are compared to previous results, which are based on a proportional AQAP.

  10. The Risk of Reduced Physical Activity in Children with Probable Developmental Coordination Disorder: A Prospective Longitudinal Study

    ERIC Educational Resources Information Center

    Green, Dido; Lingam, Raghu; Mattocks, Calum; Riddoch, Chris; Ness, Andy; Emond, Alan

    2011-01-01

    The aim of the current study was to test the hypothesis that children with probable Developmental Coordination Disorder have an increased risk of reduced moderate to vigorous physical activity (MVPA), using data from a large population based study. Prospectively collected data from 4331 children (boys = 2065, girls = 2266) who had completed motor…

  11. Overall risk estimation for nonreactor nuclear facilities and implementation of safety goals

    SciTech Connect

    Kim, K.S.; Bradley, R.F.

    1993-06-01

    A typical safety analysis report (SAR) contains estimated frequencies and consequences of various design basis accidents (DBA). However, the results are organized and presented in such a way that they are not conducive to summing up with mathematical rigor to express total or overall risk. This paper describes a mathematical formalism for deriving total risk indicators. The mathematical formalism is based on the complementary cumulative distribution function (CCDF), or exceedance probability, of radioactivity release fraction and individual radiation dose. A simple protocol is presented for establishing exceedance probabilities from the results of DBA analyses typically available from an SAR. The exceedance probability of release fraction can be a useful indicator for gaining insights into the capability of confinement barriers, characteristics of source terms, and scope of the SAR. Fatality risks comparable to the DOE Safety Goals can be derived from the exceedance probability of individual doses. Example case analyses are presented to illustrate the use of the proposed protocol and mathematical formalism. The methodology is finally applied to proposed risk guidelines for individual accident events to show that these guidelines would be within the DOE Safety Goals.
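
    An exceedance-probability curve of this kind can be assembled directly from the frequency-consequence pairs of a SAR's DBA analyses. The accident set below is hypothetical, and the paper's protocol involves more steps; this is a sketch of the CCDF idea only.

```python
def exceedance_curve(events):
    """Return [(consequence, annual frequency of equal-or-greater
    consequence), ...] -- the complementary cumulative distribution."""
    curve = []
    for _, c in sorted(events, key=lambda e: e[1]):
        f_exc = sum(f for f, cc in events if cc >= c)
        curve.append((c, f_exc))
    return curve

# Hypothetical DBA set: (annual frequency, dose consequence in rem)
dbas = [(1e-2, 0.1), (1e-3, 1.0), (1e-4, 5.0), (1e-6, 25.0)]
curve = exceedance_curve(dbas)
# the frequency of exceedance decreases as the consequence grows
```

    Comparing such a curve against a safety-goal line (dose versus allowable exceedance frequency) gives the overall-risk check that individual DBA results by themselves cannot.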

  12. A hierarchical model combining distance sampling and time removal to estimate detection probability during avian point counts

    USGS Publications Warehouse

    Amundson, Courtney L.; Royle, J. Andrew; Handel, Colleen M.

    2014-01-01

    Imperfect detection during animal surveys biases estimates of abundance and can lead to improper conclusions regarding distribution and population trends. Farnsworth et al. (2005) developed a combined distance-sampling and time-removal model for point-transect surveys that addresses both availability (the probability that an animal is available for detection; e.g., that a bird sings) and perceptibility (the probability that an observer detects an animal, given that it is available for detection). We developed a hierarchical extension of the combined model that provides an integrated analysis framework for a collection of survey points at which both distance from the observer and time of initial detection are recorded. Implemented in a Bayesian framework, this extension facilitates evaluating covariates on abundance and detection probability, incorporating excess zero counts (i.e. zero-inflation), accounting for spatial autocorrelation, and estimating population density. Species-specific characteristics, such as behavioral displays and territorial dispersion, may lead to different patterns of availability and perceptibility, which may, in turn, influence the performance of such hierarchical models. Therefore, we first test our proposed model using simulated data under different scenarios of availability and perceptibility. We then illustrate its performance with empirical point-transect data for a songbird that consistently produces loud, frequent, primarily auditory signals, the Golden-crowned Sparrow (Zonotrichia atricapilla); and for 2 ptarmigan species (Lagopus spp.) that produce more intermittent, subtle, and primarily visual cues. Data were collected by multiple observers along point transects across a broad landscape in southwest Alaska, so we evaluated point-level covariates on perceptibility (observer and habitat), availability (date within season and time of day), and abundance (habitat, elevation, and slope), and included a nested point
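
    The availability/perceptibility decomposition at the heart of the combined model can be sketched numerically. The half-normal detection function and Poisson cue-production model are standard choices in this literature, but the parameter values here are assumptions, not estimates from the Alaska data.

```python
import math

def availability(cue_rate, interval):
    """P(animal gives a cue during the count interval) under a Poisson
    cue-production model, as in time-removal sampling."""
    return 1.0 - math.exp(-cue_rate * interval)

def perceptibility(distance, sigma):
    """Half-normal distance-sampling detection function g(r)."""
    return math.exp(-(distance ** 2) / (2.0 * sigma ** 2))

def detection_prob(distance, cue_rate, interval, sigma):
    # Combined model: detection requires the animal to be available
    # AND to be perceived, assumed independent.
    return availability(cue_rate, interval) * perceptibility(distance, sigma)

# Assumed values: cue rate 0.5/min, 5-min count, scale sigma = 100 m
p = detection_prob(distance=80.0, cue_rate=0.5, interval=5.0, sigma=100.0)
```

    A loud, frequent singer (high `cue_rate`) is limited mainly by perceptibility; an intermittent, visual-cue species is limited mainly by availability, which is why the two components must be separated before comparing species.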

  13. On Modeling Human Leukocyte Antigen-Identical Sibling Match Probability for Allogeneic Hematopoietic Cell Transplantation: Estimating the Need for an Unrelated Donor Source.

    PubMed

    Besse, Kelsey; Maiers, Martin; Confer, Dennis; Albrecht, Mark

    2016-03-01

    Prior studies of allogeneic hematopoietic cell transplantation (HCT) therapy for the treatment of malignant or nonmalignant blood disorders assume a 30% likelihood that a patient will find a match among siblings and, therefore, a 70% likelihood of needing an unrelated donor source. This study utilizes birth data and statistical modeling to assess the adequacy of these estimates to describe the probability among US population cohorts segmented by race/ethnicity and age, including the ages of greatest HCT utilization. Considerable variation in the likelihood of an HLA-identical sibling was found, ranging from 13% to 51%, depending upon patient age and race/ethnicity. Low sibling match probability, compounded with increased genetic diversity and lower availability among unrelated donors, puts the youngest minority patients at the greatest risk of not finding a suitable related or unrelated HCT donor. Furthermore, the present 40-year decline in birth rates is expected to lead to a 1.5-fold decrease in access to a matched sibling for today's young adults (18 to 44 years of age) when they reach peak HCT utilization years (near age 61 years) versus their contemporary adult counterparts (44 to 64 years). Understanding the sibling match probability by race/ethnicity and age cohort allows forecasting of the demand for unrelated HCT sources. PMID:26403513
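
    The genetics behind the sibling match probability is simple Mendelian bookkeeping: each full sibling inherits the same two parental HLA haplotypes as the patient with probability 1/4, so the classic 30% figure corresponds to a particular sibship-size distribution. The cohort distributions below are invented solely to show the direction of the birth-rate effect.

```python
def match_prob(n_siblings):
    """P(at least one HLA-identical sibling): each full sibling matches
    both haplotypes with probability 1/4."""
    return 1.0 - 0.75 ** n_siblings

def cohort_match_prob(sibship_dist):
    """Average match probability over a distribution of sibling counts.

    sibship_dist -- dict {number_of_siblings: fraction_of_patients}
    """
    return sum(frac * match_prob(n) for n, frac in sibship_dist.items())

# Hypothetical cohorts: falling birth rates shift mass toward 0-1 siblings
young = {0: 0.30, 1: 0.45, 2: 0.20, 3: 0.05}
older = {0: 0.10, 1: 0.30, 2: 0.35, 3: 0.25}
p_young = cohort_match_prob(young)
p_older = cohort_match_prob(older)   # larger sibships, higher match odds
```

    With these made-up distributions the younger cohort's match probability falls well below the older cohort's, which is the mechanism behind the forecast rise in demand for unrelated donors.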

  14. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing-risk formalism where age or time-after-exposure estimates of increased risks for cancer and circulatory diseases are folded with a probability to survive to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between exposed populations and a second population where risks are to be estimated. Approaches used to transfer risks are based on: 1) multiplicative risk transfer models, in which risks are proportional to background disease rates; and 2) additive risk transfer models, in which risks are independent of background rates. In addition, a mixture model is often considered, where the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the multiplicative transfer model and the mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced between 30% and 60%, dependent on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible, and would be dependent on improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g., initiation, promotion, and progression).
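
    The transfer models reduce to one line each. The sketch below folds them into the mixture form; the ERR/EAR coefficients, background rates, and mixture weight are illustrative stand-ins, not the values used in the study.

```python
def transferred_rate(err, ear, background_rate, v):
    """Mixture risk-transfer model: weight v on the multiplicative
    (ERR x background) term, weight 1-v on the additive (EAR) term."""
    return v * err * background_rate + (1.0 - v) * ear

# Illustrative numbers only: excess relative risk, excess absolute risk,
# and background disease rates (per 10^4 person-years) for an average
# U.S. population vs never-smokers; lung cancer dominates the difference.
err, ear = 0.6, 5.0
rate_avg, rate_ns = 60.0, 30.0
v = 0.7                            # assumed mixture weight
r_avg = transferred_rate(err, ear, rate_avg, v)
r_ns = transferred_rate(err, ear, rate_ns, v)
reduction = 1.0 - r_ns / r_avg     # never-smoker risk reduction
```

    Note that under a purely additive transfer (v = 0) the never-smoker reduction vanishes entirely, which is why the estimated 30-60% reduction is so sensitive to the transfer-model assumption.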

  15. Estimating the re-identification risk of clinical data sets

    PubMed Central

    2012-01-01

    Background De-identification is a common way to protect patient privacy when disclosing clinical data for secondary purposes, such as research. One type of attack that de-identification protects against is linking the disclosed patient data with public and semi-public registries. Uniqueness is a commonly used measure of re-identification risk under this attack. If uniqueness can be measured accurately then the risk from this kind of attack can be managed. In practice, it is often not possible to measure uniqueness directly, therefore it must be estimated. Methods We evaluated the accuracy of uniqueness estimators on clinically relevant data sets. Four candidate estimators were identified because they were evaluated in the past and found to have good accuracy or because they were new and not evaluated comparatively before: the Zayatz estimator, slide negative binomial estimator, Pitman’s estimator, and mu-argus. A Monte Carlo simulation was performed to evaluate the uniqueness estimators on six clinically relevant data sets. We varied the sampling fraction and the uniqueness in the population (the value being estimated). The median relative error and inter-quartile range of the uniqueness estimates was measured across 1000 runs. Results There was no single estimator that performed well across all of the conditions. We developed a decision rule which selected between the Pitman, slide negative binomial and Zayatz estimators depending on the sampling fraction and the difference between estimates. This decision rule had the best consistent median relative error across multiple conditions and data sets. Conclusion This study identified an accurate decision rule that can be used by health privacy researchers and disclosure control professionals to estimate uniqueness in clinical data sets. The decision rule provides a reliable way to measure re-identification risk. PMID:22776564
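
    All four candidate estimators start from the same raw quantity: the count of sample uniques on the quasi-identifiers. A minimal sketch of that step follows (the estimators themselves, e.g. Pitman's, involve considerably more machinery than this):

```python
from collections import Counter

def sample_uniques(records, quasi_identifiers):
    """Count records whose quasi-identifier combination is unique in the
    sample -- the raw input to any population-uniqueness estimator."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    sizes = Counter(keys)
    return sum(1 for k in keys if sizes[k] == 1)

# Toy clinical extract: (age band, sex, 3-digit ZIP) -- invented values
records = [
    {"age": "30-39", "sex": "F", "zip3": "104"},
    {"age": "30-39", "sex": "F", "zip3": "104"},
    {"age": "60-69", "sex": "M", "zip3": "104"},
    {"age": "80-89", "sex": "F", "zip3": "873"},
]
u = sample_uniques(records, ["age", "sex", "zip3"])
```

    The hard part, and the subject of the paper, is inferring population uniqueness from such sample counts when the sampling fraction is small.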

  16. [Application of spatial relative risk estimation in communicable disease risk evaluation].

    PubMed

    Zhang, Yewu; Guo, Qing; Wang, Xiaofeng; Yu, Meng; Su, Xuemei; Dong, Yan; Zhang, Chunxi

    2015-05-01

    This paper summarizes the application of an adaptive kernel density algorithm to the spatial relative risk estimation of communicable diseases, using the reported data of infectious diarrhea (other than cholera, dysentery, typhoid and paratyphoid) in Ludian county and the surrounding area in Yunnan province in 2013. Statistically significant fluctuations in the estimated risk function were identified through the use of asymptotic tolerance contours, and these data were finally visualized through disease mapping. The results of spatial relative risk estimation and disease mapping showed that high risk areas were in southeastern Shaoyang, next to Ludian. Therefore, spatial relative risk estimation of disease using an adaptive kernel density algorithm, combined with disease mapping, is a powerful method for identifying high risk populations and areas. PMID:26080648
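
    The spatial relative risk idea is the (log) ratio of two kernel density estimates, cases over population. The sketch below is one-dimensional with a fixed bandwidth for brevity (the paper's method uses an adaptive bandwidth on two-dimensional case locations), and the data are toy values.

```python
import math

def kde(points, bandwidth):
    """Fixed-bandwidth Gaussian kernel density estimate. (An adaptive
    estimator varies the bandwidth with local data density.)"""
    norm = 1.0 / (len(points) * bandwidth * math.sqrt(2 * math.pi))
    def f(x):
        return norm * sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2)
                          for p in points)
    return f

def log_relative_risk(cases, controls, bandwidth):
    """log of case density over population density -- positive values
    flag areas with excess disease risk relative to the population."""
    fc, fk = kde(cases, bandwidth), kde(controls, bandwidth)
    return lambda x: math.log(fc(x) / fk(x))

# 1-D toy geography: cases cluster near x = 10, population spread 0-20
cases = [9.5, 10.0, 10.2, 10.8, 11.0, 3.0]
controls = [i * 0.5 for i in range(41)]          # uniform background
rr = log_relative_risk(cases, controls, bandwidth=1.5)
```

    Tolerance contours, as used in the paper, then mark where the estimated surface departs from zero by more than its sampling variability.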

  17. Spinodal Decomposition for the Cahn-Hilliard Equation in Higher Dimensions.Part I: Probability and Wavelength Estimate

    NASA Astrophysics Data System (ADS)

    Maier-Paape, Stanislaus; Wanner, Thomas

    This paper is the first in a series of two papers addressing the phenomenon of spinodal decomposition for the Cahn-Hilliard equation u_t = -Δ(ε²Δu + f(u)) in Ω, where Ω ⊂ ℝⁿ is a bounded domain with sufficiently smooth boundary, and f is cubic-like, for example f(u) = u - u³. We will present the main ideas of our approach and explain in what way our method differs from known results in one space dimension due to Grant [26]. Furthermore, we derive certain probability and wavelength estimates. The probability estimate is needed to understand why, in a neighborhood of a homogeneous equilibrium u₀ ≡ μ of the Cahn-Hilliard equation with mass μ in the spinodal region, a strongly unstable manifold has dominating effects. This is demonstrated for the linearized equation, but will be essential for the nonlinear setting in the second paper [37] as well. Moreover, we introduce the notion of a characteristic wavelength for the strongly unstable directions.

  18. Probable Post Traumatic Stress Disorder in Kenya and Its Associated Risk Factors: A Cross-Sectional Household Survey

    PubMed Central

    Jenkins, Rachel; Othieno, Caleb; Omollo, Raymond; Ongeri, Linnet; Sifuna, Peter; Mboroki, James Kingora; Kiima, David; Ogutu, Bernhards

    2015-01-01

    This study aimed to assess the prevalence of probable post-traumatic stress disorder (PTSD), and its associated risk factors in a general household population in Kenya. Data were drawn from a cross-sectional household survey of mental disorders and their associated risk factors. The participants received a structured epidemiological assessment of common mental disorders, and symptoms of PTSD, accompanied by additional sections on socio-demographic data, life events, social networks, social supports, disability/activities of daily living, quality of life, use of health services, and service use. The study found that 48% had experienced a severe trauma, and an overall prevalence rate of 10.6% of probable PTSD, defined as a score of six or more on the trauma screening questionnaire (TSQ). The conditional probability of PTSD was 0.26. Risk factors include being female, single, self-employed, having experienced recent life events, having a common mental disorder (CMD), and living in an institution before age 16. The study indicates that probable PTSD is prevalent in this rural area of Kenya. The findings are relevant for the training of front line health workers, their support and supervision, for health management information systems, and for mental health promotion in state boarding schools. PMID:26516877

  19. Probable Post Traumatic Stress Disorder in Kenya and Its Associated Risk Factors: A Cross-Sectional Household Survey.

    PubMed

    Jenkins, Rachel; Othieno, Caleb; Omollo, Raymond; Ongeri, Linnet; Sifuna, Peter; Mboroki, James Kingora; Kiima, David; Ogutu, Bernhards

    2015-10-01

    This study aimed to assess the prevalence of probable post-traumatic stress disorder (PTSD), and its associated risk factors in a general household population in Kenya. Data were drawn from a cross-sectional household survey of mental disorders and their associated risk factors. The participants received a structured epidemiological assessment of common mental disorders, and symptoms of PTSD, accompanied by additional sections on socio-demographic data, life events, social networks, social supports, disability/activities of daily living, quality of life, use of health services, and service use. The study found that 48% had experienced a severe trauma, and an overall prevalence rate of 10.6% of probable PTSD, defined as a score of six or more on the trauma screening questionnaire (TSQ). The conditional probability of PTSD was 0.26. Risk factors include being female, single, self-employed, having experienced recent life events, having a common mental disorder (CMD), and living in an institution before age 16. The study indicates that probable PTSD is prevalent in this rural area of Kenya. The findings are relevant for the training of front line health workers, their support and supervision, for health management information systems, and for mental health promotion in state boarding schools. PMID:26516877

  20. Estimation of reliability and dynamic property for polymeric material at high strain rate using SHPB technique and probability theory

    NASA Astrophysics Data System (ADS)

    Kim, Dong Hyeok; Lee, Ouk Sub; Kim, Hong Min; Choi, Hye Bin

    2008-11-01

    A modified Split Hopkinson Pressure Bar (SHPB) technique with aluminum pressure bars and a pulse shaper was used to achieve a closer impedance match between the pressure bars and specimen materials such as hot-temperature-degraded POM (polyoxymethylene) and PP (polypropylene). More distinguishable experimental signals were obtained, allowing a more accurate evaluation of the dynamic deformation behavior of the materials under high-strain-rate loading. The pulse shaping technique reduces non-equilibrium effects in the dynamic material response by modulating the incident wave during the short test period, which increases the rise time of the incident pulse in the SHPB experiment. For the dynamic stress-strain curve obtained from the SHPB experiment, the Johnson-Cook model is applied as a constitutive equation, and its applicability is verified using a probabilistic reliability estimation method. Two reliability methodologies, the FORM (first-order reliability method) and the SORM (second-order reliability method), are proposed. The limit state function (LSF) includes the Johnson-Cook model and the applied stresses, and allows more statistical flexibility on the yield stress than a previously published formulation. The failure probability estimated by the SORM is found to be more reliable than that of the FORM, and the failure probability increases with the applied stress. Moreover, a sensitivity analysis shows that the Johnson-Cook parameters A and n and the applied stress affect the failure probability more severely than the other random variables.

  1. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems

    NASA Astrophysics Data System (ADS)

    Vio, R.; Andreani, P.

    2016-05-01

    The reliable detection of weak signals is a critical issue in many astronomical contexts and may have severe consequences for determining number counts and luminosity functions, but also for optimizing the use of telescope time in follow-up observations. Because of its optimal properties, one of the most popular and widely used detection techniques is the matched filter (MF). This is a linear filter designed to maximise the detectability of a signal of known structure that is buried in additive Gaussian random noise. In this work we show that in the very common situation where the number and position of the searched signals within a data sequence (e.g. an emission line in a spectrum) or an image (e.g. a point-source in an interferometric map) are unknown, this technique, when applied in its standard form, may severely underestimate the probability of false detection. This is because the correct use of the MF relies upon a priori knowledge of the position of the signal of interest. In the absence of this information, the statistical significance of features that are actually noise is overestimated, and detections are claimed that are actually spurious. For this reason, we present an alternative method of computing the probability of false detection that is based on the probability density function (PDF) of the peaks of a random field. It is able to provide a correct estimate of the probability of false detection for the one-, two- and three-dimensional cases. We apply this technique to a real two-dimensional interferometric map obtained with ALMA.
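
    The size of the effect is easy to demonstrate: with n resolution elements and no prior knowledge of position, the false-detection probability is that of the *maximum* of n noise values, not of a single value. A Monte Carlo sketch, assuming pure Gaussian white noise (real maps have correlated noise, which is where the peak-PDF machinery of the paper comes in):

```python
import math
import random

def tail(t):
    """Single-sample Gaussian upper-tail probability P(Z > t)."""
    return 0.5 * math.erfc(t / math.sqrt(2))

def false_detection_prob(n, t, trials=2000, seed=7):
    """Monte Carlo P(max of n i.i.d. N(0,1) samples exceeds t): the
    relevant false-alarm rate when the signal position is unknown."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if max(rng.gauss(0.0, 1.0) for _ in range(n)) > t:
            hits += 1
    return hits / trials

t = 3.0
p_single = tail(t)                          # the naive per-pixel estimate
p_anywhere = false_detection_prob(1000, t)  # with 1000 independent pixels
```

    At a "3-sigma" threshold the per-pixel tail probability is about 0.13%, yet with 1000 independent pixels a spurious peak above the threshold appears somewhere in most realizations.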

  2. Bayesian Estimates of Transition Probabilities in Seven Small Lithophytic Orchid Populations: Maximizing Data Availability from Many Small Samples

    PubMed Central

    Tremblay, Raymond L.; McCarthy, Michael A.

    2014-01-01

    Predicting population dynamics for rare species is of paramount importance in order to evaluate the likelihood of extinction and to plan conservation strategies. However, evaluating and predicting population viability can be hindered by a lack of data. Rare species frequently have small populations, so estimates of vital rates are often very uncertain. We evaluated the vital rates of seven small populations from two watersheds with varying light environments of a common epiphytic orchid, using Bayesian methods of parameter estimation. From the Lefkovitch matrices we predicted the deterministic population growth rates, elasticities, stable stage distributions, and the credible intervals of these statistics. Populations were surveyed monthly for between 18 and 34 months. In some of the populations few or no transitions in some of the vital rates were observed throughout the sampling period; however, we were able to predict the most likely vital rates using a Bayesian model that incorporated the transition rates from the other populations. Asymptotic population growth rate varied among the seven orchid populations. There was little difference in population growth rate among watersheds, even though a difference was expected because of physical differences resulting from differing canopy cover and watershed width. Elasticity analyses of Lepanthes rupestris suggest that growth rate is most sensitive to survival, followed by growth, shrinkage, and the reproductive rates. The Bayesian approach helped to estimate transition probabilities that were uncommon or variable in some populations. Moreover, it increased the precision of the parameter estimates compared to traditional approaches. PMID:25068598
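
    The borrowing-of-strength effect of the hierarchical model can be imitated with a simple empirical-Bayes shrinkage: a small population with no observed deaths is pulled toward the pooled rate instead of returning an implausible survival estimate of exactly 1.0. The counts and prior weight below are invented, and this is only a crude stand-in for the paper's full Bayesian model.

```python
def shrunk_rates(successes, trials, prior_weight=10.0):
    """Per-population transition-probability estimates shrunk toward the
    pooled rate via a Beta prior centered on the pooled estimate.

    successes, trials -- parallel lists, one entry per population
    """
    pooled = sum(successes) / sum(trials)
    a0, b0 = prior_weight * pooled, prior_weight * (1.0 - pooled)
    # posterior mean of a Beta(a0 + s, b0 + n - s) for each population
    return [(a0 + s) / (a0 + b0 + n) for s, n in zip(successes, trials)]

# Monthly survival transitions in 7 small populations; one population
# (index 2) recorded no deaths at all during the survey
surv = [18, 22, 9, 30, 12, 7, 25]
obs  = [20, 25, 9, 33, 14, 7, 27]
est = shrunk_rates(surv, obs)
```

    The zero-death population gets an estimate below 1.0 but above the pooled rate, which is exactly the qualitative behavior the authors report for sparsely observed transitions.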

  3. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    PubMed

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subjects' reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls, and controls only, were developed. The prevalence of machining varied substantially across job groups (0.1 to >0.9), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  4. Estimating the Transitional Probabilities of Smoking Stages with Cross-sectional Data and 10-Year Projection for Smoking Behavior in Iranian Adolescents

    PubMed Central

    Khosravi, Ahmad; Mansournia, Mohammad Ali; Mahmoodi, Mahmood; Pouyan, Ali Akbar; Holakouie-Naieni, Kourosh

    2016-01-01

    Background: Cigarette smoking is one of the most important health-related risk factors in terms of morbidity and mortality. In this study, we introduce a new method for deriving the transitional probabilities of smoking stages from a cross-sectional study and simulate long-term smoking behavior for adolescents. Methods: In this study, in 2010, a total of 4853 high school students were randomly selected and completed a self-administered questionnaire about cigarette smoking. We used smoothed age- and sex-specific prevalences of smoking stages in a probabilistic discrete event system to estimate the transitional probabilities. A nonhomogeneous discrete-time Markov chain analysis was used to model the progression of smoking over the following 10 years in the same population. The mean age of the students was 15.69 ± 0.73 years (range: 14–19). Results: The smoothed prevalence proportion of current smoking varied between 3.58 and 26.14%. The age-adjusted odds of initiation in boys were 8.9 (95% confidence interval [CI]: 7.9–10.0) times the odds of initiation of smoking in girls. Our study predicted that the prevalence proportion of current smokers would increase from 7.55% in 2010 to 20.31% (95% CI: 19.44–21.37) by 2019. Conclusions: The present study showed a moderate but concerning prevalence of current smoking in Iranian adolescents and introduced a novel method for estimating transitional probabilities from a cross-sectional study. The increasing trend of cigarette use among adolescents indicates the necessity of paying more attention to this group. PMID:27625766
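
    Projecting a cohort through a discrete-time Markov chain of smoking stages looks like the following. The three-state structure and the transition numbers are assumptions made for illustration (the study's chain is age- and sex-specific and nonhomogeneous over calendar time, and its probabilities are fitted, not chosen).

```python
def project(state, transition_for_year, years):
    """Push a state distribution through a (possibly nonhomogeneous)
    discrete-time Markov chain, one transition matrix per year.

    state               -- dict {state_name: probability}
    transition_for_year -- function year -> {from_state: {to_state: prob}}
    """
    for year in range(years):
        P = transition_for_year(year)
        state = {
            to: sum(state[frm] * P[frm].get(to, 0.0) for frm in state)
            for to in state
        }
    return state

# Assumed initiation/cessation/relapse probabilities; homogeneous here
# for brevity, though the framework accepts a different P each year.
def P(year):
    return {
        "never":   {"never": 0.96, "current": 0.04},
        "current": {"current": 0.90, "former": 0.10},
        "former":  {"former": 0.92, "current": 0.08},
    }

start = {"never": 0.9245, "current": 0.0755, "former": 0.0}
end = project(start, P, 10)   # distribution a decade later
```

    With any initiation probability exceeding the net quit rate, the current-smoker share grows over the decade, which is the qualitative pattern the study projects.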

  5. A FRAX Experience in Korea: Fracture Risk Probabilities with a Country-specific Versus a Surrogate Model

    PubMed Central

    Min, Yong-Ki; Lee, Dong-Yun; Park, Youn-Soo; Moon, Young-Wan; Lim, Seung-Jae; Lee, Young-Kyun; Choi, DooSeok

    2015-01-01

    Background Recently, a Korean fracture-risk assessment tool (FRAX) model has become available, but large prospective cohort studies, which are needed to validate the model, are still lacking, and there has been little effort to evaluate its usefulness. This study evaluated the clinical usefulness in Korea of the FRAX model, a fracture-risk assessment tool developed by the World Health Organization. Methods In 405 postmenopausal women and 139 men with a proximal femoral fracture, 10-year predicted fracture probabilities calculated by the Korean FRAX model (a country-specific model) were compared with the probabilities calculated with the FRAX model for Japan, which has a similar ethnic background (a surrogate model). Results The 10-year probabilities of major osteoporotic and hip fractures calculated by the Korean model were significantly lower than those calculated by the Japanese model in both women and men. The fracture probabilities calculated by each model increased significantly with age in both sexes. In patients aged 70 or older, however, there was a significant difference between the two models. In addition, the Korean model led to lower probabilities for major osteoporotic fracture and hip fracture in women when BMD was excluded from the model than when it was included. Conclusions The 10-year fracture probabilities calculated with FRAX models might differ between country-specific and surrogate models, and caution is needed when applying a surrogate model to a new population. A large prospective study is warranted to validate the country-specific Korean model in the general population. PMID:26389086

  6. On cancer risk estimation of urban air pollution.

    PubMed Central

    Törnqvist, M; Ehrenberg, L

    1994-01-01

    The usefulness of data from various sources for cancer risk estimation of urban air pollution is discussed. Considering the irreversibility of initiations, a multiplicative model is preferred for solid tumors. As has been concluded for exposure to ionizing radiation, the multiplicative model, in comparison with the additive model, predicts a relatively larger number of cases at high ages, with enhanced underestimation of risks by short follow-up times in disease-epidemiological studies. For related reasons, extrapolating risk from animal tests on the basis of daily absorbed dose per kilogram body weight, or per square meter of surface area, without considering differences in life span may lead to an underestimation, and agreement with epidemiologically determined values may be fortuitous. With these possibilities in mind, the most likely lifetime risks of cancer death at the average exposure levels in Sweden were estimated for certain pollution fractions or indicator compounds in urban air. The risks amount to approximately 50 deaths per 100,000 for inhaled particulate organic material (POM), with a contribution from ingested POM about three times larger; alkenes and butadiene are estimated to cause on the order of 20 deaths per 100,000 individuals. Benzene and formaldehyde are also expected to be associated with considerable risk increments. Comparative potency methods were applied for POM and alkenes. Owing to the incompleteness of the list of compounds considered and the uncertainties of the above estimates, a calculation of the total risk from urban air was not attempted here. PMID:7821292
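
The contrast between the additive and multiplicative projection models argued above can be illustrated with a toy calculation; the baseline rates and excess-risk parameters below are hypothetical:

```python
# Toy comparison of multiplicative vs additive risk projection.  The
# baseline mortality rates and excess-risk parameters are hypothetical.
baseline = {50: 0.002, 60: 0.006, 70: 0.015}  # annual baseline rate by age

ERR = 0.5     # multiplicative model: excess relative risk
EAR = 0.001   # additive model: excess absolute rate, matched at age 50

for age, rate in baseline.items():
    mult_excess = rate * ERR  # grows with the steeply rising baseline
    add_excess = EAR          # constant regardless of age
    print(age, mult_excess, add_excess)
# A short follow-up that misses the oldest ages therefore underestimates
# lifetime risk more severely under the multiplicative model.
```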

  7. Methods for estimating annual exceedance-probability discharges for streams in Iowa, based on data through water year 2010

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; Veilleux, Andrea G.

    2013-01-01

    A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3. The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97
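
The percent annual exceedance probabilities and recurrence intervals listed above are reciprocals of one another; a short sketch of the conversion, plus the chance of seeing at least one exceedance over a planning horizon:

```python
# Annual exceedance probability (AEP) and recurrence interval are reciprocals:
# an AEP of p percent corresponds to a (100 / p)-year flood.
def recurrence_interval(aep_percent):
    return 100.0 / aep_percent

def aep_percent(recurrence_years):
    return 100.0 / recurrence_years

# The eight statistics used in the Iowa study:
for p in (50, 20, 10, 4, 2, 1, 0.5, 0.2):
    print(f"{p}% AEP  ->  {recurrence_interval(p):g}-year flood")

# Probability of at least one exceedance of the 1% AEP (100-year) flood
# during a 30-year period:
p30 = 1 - (1 - 0.01) ** 30
print(f"{p30:.3f}")  # ~0.260
```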

  8. Estimation of the standardized risk difference and ratio in a competing risks framework: application to injection drug use and progression to AIDS after initiation of antiretroviral therapy.

    PubMed

    Cole, Stephen R; Lau, Bryan; Eron, Joseph J; Brookhart, M Alan; Kitahata, Mari M; Martin, Jeffrey N; Mathews, William C; Mugavero, Michael J

    2015-02-15

    There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS. PMID:24966220
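
The estimator described above, which combines inverse probability weights with the cumulative incidence function under competing risks, can be sketched as a weighted Aalen-Johansen estimator. This is a minimal illustration on toy data, not the authors' code; in the study the weights would be inverse probabilities of exposure used for standardization:

```python
def cumulative_incidence(times, events, weights=None, cause=1):
    """Aalen-Johansen cumulative incidence for `cause`, with optional
    inverse-probability weights.  Event codes: 0 = censored, 1 = event
    of interest (AIDS), 2 = competing event (death before AIDS)."""
    w = weights or [1.0] * len(times)
    data = sorted(zip(times, events, w))
    at_risk = sum(w)
    surv = 1.0   # overall (all-cause) survival just before t
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = w_removed = 0.0
        while i < len(data) and data[i][0] == t:   # handle tied times
            _, e, wi = data[i]
            if e == cause:
                d_cause += wi
            if e != 0:
                d_all += wi
            w_removed += wi
            i += 1
        if at_risk > 0:
            cif += surv * d_cause / at_risk
            surv *= 1 - d_all / at_risk
        at_risk -= w_removed
    return cif

# Toy data: four subjects, one censored, one competing event.
print(cumulative_incidence([1, 2, 3, 4], [1, 2, 0, 1]))  # 0.75
```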

  9. Population-Attributable Risk Estimates for Risk Factors Associated with Campylobacter Infection, Australia

    PubMed Central

    Schluter, Philip J.; Wilson, Andrew J.; Kirk, Martyn D.; Hall, Gillian; Unicomb, Leanne

    2008-01-01

    In 2001–2002, a multicenter, prospective case-control study involving 1,714 participants >5 years of age was conducted in Australia to identify risk factors for Campylobacter infection. Adjusted population-attributable risks (PARs) were derived for each independent risk factor contained within the final multivariable logistic regression model. Estimated PARs were combined with adjusted (for the >5 years of age eligibility criterion) notifiable disease surveillance data to estimate annual Australian Campylobacter case numbers attributable to each risk factor. Simulated distributions of “credible values” were then generated to model the uncertainty associated with each case number estimate. Among foodborne risk factors, an estimated 50,500 (95% credible interval 10,000–105,500) cases of Campylobacter infection in persons >5 years of age could be directly attributed each year to consumption of chicken in Australia. Our statistical technique could be applied more widely to other communicable diseases that are subject to routine surveillance. PMID:18507899
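
A common way to derive an attributable risk from exposure prevalence and relative risk is Levin's formula; the sketch below uses purely illustrative inputs, not the study's adjusted PAR estimates:

```python
def levin_par(prevalence, relative_risk):
    """Levin's population-attributable risk:
    PAR = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Illustrative values only (not the study's adjusted estimates): if 70%
# of the population eats chicken and the adjusted RR is 1.5,
par = levin_par(0.70, 1.5)
attributable = par * 150_000    # hypothetical annual notified case count
print(f"PAR = {par:.1%}, attributable cases ~ {attributable:.0f}")
```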

  10. Interpretation of laboratory tests using a programmable hand-held calculator: calculation of posterior probability and relative risk on the basis of prior probability and sensitivity, specificity and result of the test.

    PubMed

    van de Merwe, J P

    1984-01-01

    A program is presented for the Hewlett-Packard HP- 41C programmable hand-held calculator that calculates the posterior probability and relative risk of an event (e.g. a disease) on the basis of prior probability (prevalence) and sensitivity, specificity and result of the test. PMID:6370577
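
The calculation the calculator program performs is an application of Bayes' theorem; a modern sketch of the same computation, assuming the usual definitions of sensitivity and specificity:

```python
def posterior(prior, sensitivity, specificity, positive=True):
    """Posterior probability of disease given a test result, via Bayes:
    P(D|T) = P(T|D)P(D) / [P(T|D)P(D) + P(T|~D)P(~D)]."""
    if positive:
        p_t_d, p_t_nd = sensitivity, 1 - specificity
    else:
        p_t_d, p_t_nd = 1 - sensitivity, specificity
    num = prior * p_t_d
    return num / (num + (1 - prior) * p_t_nd)

def relative_risk(prior, sensitivity, specificity, positive=True):
    """Posterior probability relative to the prior (prevalence)."""
    return posterior(prior, sensitivity, specificity, positive) / prior

# A test with 90% sensitivity and 95% specificity at 10% prevalence:
print(round(posterior(0.10, 0.90, 0.95), 3))  # 0.667
```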

  11. Studies on the extended Techa river cohort: cancer risk estimation

    SciTech Connect

    Kossenko, M M.; Preston, D L.; Krestinina, L Y.; Degteva, M O.; Startsev, N V.; Thomas, T; Vyushkova, O V.; Anspaugh, L R.; Napier, Bruce A. ); Kozheurov, V P.; Ron, E; Akleyev, A V.

    2001-12-01

    Initial population-based studies of riverside residents were begun in the late 1950s and in 1967 a systematic effort was undertaken to develop a well-defined fixed cohort of Techa river residents, to carry out ongoing mortality and (limited) clinical follow-up of this cohort, and to provide individualized dose estimates for cohort members. Over the past decade, extensive efforts have been made to refine the cohort definition and improve both the follow-up and dosimetry data. Analyses of the Techa river cohort can provide useful quantitative estimates of the effects of low dose rate, chronic external and internal exposures on cancer mortality and incidence and non-cancer mortality rates. These risk estimates complement quantitative risk estimates for acute exposures based on the atomic bomb survivors and chronic exposure risk estimates from worker studies, including Mayak workers and other groups with occupational radiation exposures. As the dosimetry and follow-up are refined it may also be possible to gain useful insights into risks associated with 90Sr exposures.

  12. Depression, substance use and HIV risk in a probability sample of men who have sex with men.

    PubMed

    Fendrich, Michael; Avci, Ozgur; Johnson, Timothy P; Mackesy-Amiti, Mary Ellen

    2013-03-01

    The persistent HIV epidemic among men who have sex with men (MSM) suggests that continued research on factors associated with risky sexual behavior is necessary. Prior literature on the role of depression and substance use in HIV risk is inconclusive. Generalizability of past findings may also be limited to the extent that research has not employed probability samples. Here we report on one of the few probability samples of MSM used to examine the role of depressive symptoms and substance use in risky sexual behavior (RSB). Multinomial logistic regression analysis suggested that depression and substance use are independently linked to our risk measure: those reporting high levels of depressive symptoms or substance use were more likely to report both unprotected receptive and unprotected insertive anal intercourse, as well as sex with a risky partner. Implications for prevention and treatment are discussed. PMID:23254224

  13. Peers Increase Adolescent Risk Taking Even When the Probabilities of Negative Outcomes Are Known

    ERIC Educational Resources Information Center

    Smith, Ashley R.; Chein, Jason; Steinberg, Laurence

    2014-01-01

    The majority of adolescent risk taking occurs in the presence of peers, and recent research suggests that the presence of peers may alter how the potential rewards and costs of a decision are valuated or perceived. The current study further explores this notion by investigating how peer observation affects adolescent risk taking when the…

  14. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    PubMed Central

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  15. The Asymptotics of Recovery Probability in the Dual Renewal Risk Model with Constant Interest and Debit Force

    PubMed Central

    2015-01-01

    The asymptotic behavior of the recovery probability for the dual renewal risk model with constant interest and debit force is studied. Using the idea of the Markov skeleton method, we study the times at which the random premium incomes occur and transform the continuous-time model into a discrete-time model. By investigating the fluctuations of this discrete-time model, we obtain the asymptotic behavior of the recovery probability when the random premium income belongs to a class of heavy-tailed distributions.

  16. Context-adaptive binary arithmetic coding with precise probability estimation and complexity scalability for high-efficiency video coding

    NASA Astrophysics Data System (ADS)

    Karwowski, Damian; Domański, Marek

    2016-01-01

    An improved context-based adaptive binary arithmetic coding (CABAC) is presented. The idea for the improvement is to use a more accurate mechanism for estimation of symbol probabilities in the standard CABAC algorithm. The authors' proposal of such a mechanism is based on the context-tree weighting technique. In the framework of a high-efficiency video coding (HEVC) video encoder, the improved CABAC allows 0.7% to 4.5% bitrate saving compared to the original CABAC algorithm. The application of the proposed algorithm marginally affects the complexity of HEVC video encoder, but the complexity of video decoder increases by 32% to 38%. In order to decrease the complexity of video decoding, a new tool has been proposed for the improved CABAC that enables scaling of the decoder complexity. Experiments show that this tool gives 5% to 7.5% reduction of the decoding time while still maintaining high efficiency in the data compression.
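
The role of a probability estimator in binary arithmetic coding can be illustrated with a simple exponentially weighted update; this toy stand-in mimics the adaptive behavior of the standard CABAC state machine and is not the context-tree weighting mechanism proposed in the paper:

```python
class AdaptiveBinaryEstimator:
    """Exponentially weighted estimate of P(bit = 1): a toy stand-in for
    the state-machine probability estimator in standard CABAC (not the
    context-tree weighting scheme proposed in the paper)."""
    def __init__(self, alpha=0.05):
        self.p_one = 0.5    # start with no preference
        self.alpha = alpha  # adaptation rate (effective window ~ 1/alpha)

    def update(self, bit):
        # Move the estimate a fraction alpha toward the observed bit.
        self.p_one += self.alpha * (bit - self.p_one)
        return self.p_one

est = AdaptiveBinaryEstimator()
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    est.update(bit)
print(round(est.p_one, 3))  # 0.582
```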

  17. Sensitivity of risk estimates to wildlife bioaccumulation factors in ecological risk assessment

    SciTech Connect

    Karustis, C.G.; Brewer, R.A.

    1995-12-31

    The concept of conservatism in risk assessment is well established. However, overly conservative assumptions may result in risk estimates that incorrectly predict remediation goals. Therefore, realistic assumptions should be applied in risk assessment whenever possible. A sensitivity analysis was performed on conservative (i.e., bioaccumulation factor = 1) and scientifically derived wildlife bioaccumulation factors (BAFs) used to calculate risks during a terrestrial ecological risk assessment (ERA). In the first approach, 100% bioaccumulation of contaminants was assumed to estimate the transfer of contaminants through the terrestrial food chain. In the second approach, scientifically derived BAFs were selected from the literature. For one of the measurement species selected, total risks calculated with the first approach were higher than those calculated with the second approach by two orders of magnitude. However, potential risks due to individual contaminants were not necessarily higher using the conservative approach: potential risks due to contaminants with low actual bioaccumulation were exaggerated, while potential risks due to contaminants with greater than 100% bioaccumulation were underestimated. Therefore, using a default of 100% bioaccumulation (BAF = 1) for all contaminants encountered during an ERA could result in contaminants being incorrectly identified as risk drivers and in the calculation of incorrect ecological risk-based cleanup goals. The authors suggest using site-specific or literature-derived BAFs whenever possible, and realistic BAF estimates based upon factors such as log K{sub ow} when BAFs are unavailable.
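
How the choice of BAF propagates into a screening-level risk estimate can be sketched with generic food-chain equations; all parameter values below are hypothetical:

```python
def hazard_quotient(soil_conc, baf, intake_rate, body_weight, trv):
    """Screening-level wildlife hazard quotient: dietary concentration is
    soil concentration x BAF; daily dose is concentration x intake rate /
    body weight; HQ is dose / toxicity reference value (TRV)."""
    diet_conc = soil_conc * baf
    dose = diet_conc * intake_rate / body_weight
    return dose / trv

# Hypothetical contaminant and receptor parameters:
conservative = hazard_quotient(10.0, 1.0, 0.05, 0.25, 1.0)   # default BAF = 1
literature = hazard_quotient(10.0, 0.01, 0.05, 0.25, 1.0)    # measured BAF
print(conservative / literature)  # the default inflates the HQ 100-fold here
```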

  18. Relation of probability of causation to relative risk and doubling dose: a methodologic error that has become a social problem.

    PubMed Central

    Greenland, S

    1999-01-01

    Epidemiologists, biostatisticians, and health physicists frequently serve as expert consultants to lawyers, courts, and administrators. One of the most common errors committed by experts is to equate, without qualification, the attributable fraction estimated from epidemiologic data to the probability of causation requested by courts and administrators. This error has become so pervasive that it has been incorporated into judicial precedents and legislation. This commentary provides a brief overview of the error and the context in which it arises. PMID:10432900

  19. Reconstruction of financial networks for robust estimation of systemic risk

    NASA Astrophysics Data System (ADS)

    Mastromatteo, Iacopo; Zarinelli, Elia; Marsili, Matteo

    2012-03-01

    In this paper we estimate the propagation of liquidity shocks through interbank markets when the information about the underlying credit network is incomplete. We show that techniques such as maximum entropy currently used to reconstruct credit networks severely underestimate the risk of contagion by assuming a trivial (fully connected) topology, a type of network structure which can be very different from the one empirically observed. We propose an efficient message-passing algorithm to explore the space of possible network structures and show that a correct estimation of the network degree of connectedness leads to more reliable estimations for systemic risk. Such an algorithm is also able to produce maximally fragile structures, providing a practical upper bound for the risk of contagion when the actual network structure is unknown. We test our algorithm on ensembles of synthetic data encoding some features of real financial networks (sparsity and heterogeneity), finding that more accurate estimations of risk can be achieved. Finally we find that this algorithm can be used to control the amount of information that regulators need to require from banks in order to sufficiently constrain the reconstruction of financial networks.
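
The maximum-entropy benchmark the authors criticize is typically computed by iterative proportional fitting of the banks' interbank assets and liabilities; a minimal sketch with made-up balance sheets, showing that it returns a dense, fully connected exposure matrix:

```python
def max_entropy_network(assets, liabilities, iters=200):
    """Iterative proportional fitting: a dense exposure matrix X with the
    given row sums (interbank assets) and column sums (liabilities).
    This is the fully connected benchmark that, per the paper, tends to
    underestimate contagion risk."""
    n = len(assets)
    # Uniform (maximally dense) starting guess with an empty diagonal:
    X = [[0.0 if i == j else 1.0 for j in range(n)] for i in range(n)]
    for _ in range(iters):
        for i in range(n):                       # match row sums
            s = sum(X[i])
            if s > 0:
                X[i] = [x * assets[i] / s for x in X[i]]
        for j in range(n):                       # match column sums
            s = sum(X[i][j] for i in range(n))
            if s > 0:
                for i in range(n):
                    X[i][j] *= liabilities[j] / s
    return X

# Made-up balance sheets for three banks (totals must agree):
X = max_entropy_network([3.0, 2.0, 1.0], [2.0, 2.0, 2.0])
print(X)  # every off-diagonal entry is positive: a fully connected network
```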

  20. Neoplastic potential of gastric irradiation. IV. Risk estimates

    SciTech Connect

    Griem, M.L.; Justman, J.; Weiss, L.

    1984-12-01

    No significant tumor increase was found in the initial analysis of patients irradiated for peptic ulcer and followed through 1962. A preliminary study was undertaken 22 years later to estimate the risk of cancer due to gastric irradiation for peptic ulcer disease. A population of 2,049 irradiated patients and 763 medically managed patients has been identified. A relative risk of 3.7 was found for stomach cancer and an initial risk estimate of 5.5 x 10(-6) excess stomach cancers per person rad was calculated. A more complete follow-up is in progress to further elucidate this observation and decrease the ascertainment bias; however, preliminary data are in agreement with the Japanese atomic bomb reports.

  1. The use of individual and societal risk criteria within the Dutch flood safety policy--nationwide estimates of societal risk and policy applications.

    PubMed

    Jonkman, Sebastiaan N; Jongejan, Ruben; Maaskant, Bob

    2011-02-01

    The Dutch government is in the process of revising its flood safety policy. The current safety standards for flood defenses in the Netherlands are largely based on the outcomes of cost-benefit analyses. Loss of life has not been considered separately in the choice for current standards. This article presents the results of a research project that evaluated the potential roles of two risk metrics, individual and societal risk, to support decision making about new flood safety standards. These risk metrics are already used in the Dutch major hazards policy for the evaluation of risks to the public. Individual risk concerns the annual probability of death of a person. Societal risk concerns the probability of an event with many fatalities. Technical aspects of the use of individual and societal risk metrics in flood risk assessments as well as policy implications are discussed. Preliminary estimates of nationwide levels of societal risk are presented. Societal risk levels appear relatively high in the southwestern part of the country where densely populated dike rings are threatened by a combination of river and coastal floods. It was found that cumulation, the simultaneous flooding of multiple dike rings during a single flood event, has significant impact on the national level of societal risk. Options for the application of the individual and societal risk in the new flood safety policy are presented and discussed. PMID:20883529
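
Societal risk of the kind discussed above is commonly summarized with an FN curve: for each fatality level N, the annual frequency of events causing N or more fatalities. A minimal sketch with hypothetical flood scenarios:

```python
def fn_curve(scenarios):
    """Societal risk as an FN curve: for each fatality level N, the annual
    frequency of flood scenarios causing N or more fatalities.
    scenarios: list of (annual probability, expected fatalities) pairs."""
    levels = sorted({n for _, n in scenarios})
    return [(n, sum(p for p, f in scenarios if f >= n)) for n in levels]

# Hypothetical dike-ring flood scenarios:
for n, freq in fn_curve([(1e-3, 10), (1e-4, 100), (1e-5, 1000)]):
    print(f"P(>= {n} fatalities) = {freq:.2e} per year")
```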

  2. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry.

    PubMed

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith

    2014-07-21

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children. PMID:24957710

  3. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry

    NASA Astrophysics Data System (ADS)

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith; SEDENTEXCT Project Consortium, The

    2014-07-01

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children.

  4. Estimation of myocardial volume at risk from CT angiography

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Gao, Yi; Mohan, Vandana; Stillman, Arthur; Faber, Tracy; Tannenbaum, Allen

    2011-03-01

    The determination of myocardial volume at risk distal to coronary stenosis provides important information for prognosis and treatment of coronary artery disease. In this paper, we present a novel computational framework for estimating the myocardial volume at risk in computed tomography angiography (CTA) imagery. Initially, epicardial and endocardial surfaces, and coronary arteries are extracted using an active contour method. Then, the extracted coronary arteries are projected onto the epicardial surface, and each point on this surface is associated with its closest coronary artery using the geodesic distance measurement. The likely myocardial region at risk on the epicardial surface caused by a stenosis is approximated by the region in which all its inner points are associated with the sub-branches distal to the stenosis on the coronary artery tree. Finally, the likely myocardial volume at risk is approximated by the volume in between the region at risk on the epicardial surface and its projection on the endocardial surface, which is expected to yield computational savings over risk volume estimation using the entire image volume. Furthermore, we expect increased accuracy since, as compared to prior work using the Euclidean distance, we employ the geodesic distance in this work. The experimental results demonstrate the effectiveness of the proposed approach on pig heart CTA datasets.

  5. CFD modelling of most probable bubble nucleation rate from binary mixture with estimation of components' mole fraction in critical cluster

    NASA Astrophysics Data System (ADS)

    Hong, Ban Zhen; Keong, Lau Kok; Shariff, Azmi Mohd

    2016-05-01

    Employing different mathematical models specifically for the bubble nucleation rates of water vapour and dissolved air molecules is essential, as the physics by which they form bubble nuclei differs. Available methods for calculating the bubble nucleation rate in a binary mixture, such as density functional theory, are complicated to couple with a computational fluid dynamics (CFD) approach. In addition, the effect of dissolved gas concentration was neglected in most studies predicting bubble nucleation rates. In the current work, the most probable bubble nucleation rate for the water vapour and dissolved air mixture in a 2D quasi-stable flow across a cavitating nozzle was estimated via the statistical mean of all possible bubble nucleation rates of the mixture (different mole fractions of water vapour and dissolved air) and the corresponding number of molecules in the critical cluster. Theoretically, the bubble nucleation rate depends strongly on the components' mole fractions in the critical cluster; hence, the effect of dissolved gas concentration was included in the current work. The possible bubble nucleation rates were predicted from the calculated number of molecules required to form a critical cluster. The components' mole fractions in the critical cluster for the water vapour and dissolved air mixture were estimated by coupling the enhanced classical nucleation theory with the CFD approach. In addition, the distribution of bubble nuclei of the water vapour and dissolved air mixture could be predicted via a population balance model.

  6. Potential confounds in estimating trial-to-trial correlations between neuronal response and behavior using choice probabilities

    PubMed Central

    Maunsell, John H. R.

    2012-01-01

    Correlations between trial-to-trial fluctuations in the responses of individual sensory neurons and perceptual reports, commonly quantified with choice probability (CP), have been widely used as an important tool for assessing the contributions of neurons to behavior. These correlations are usually weak and often require a large number of trials for a reliable estimate. Therefore, working with measures such as CP warrants care in data analysis as well as rigorous controls during data collection. Here we identify potential confounds that can arise in data analysis and lead to biased estimates of CP, and suggest methods to avoid the bias. In particular, we show that the common practice of combining neuronal responses across different stimulus conditions with z-score normalization can result in an underestimation of CP when the ratio of the numbers of trials for the two behavioral response categories differs across the stimulus conditions. We also discuss the effects of using variable time intervals for quantifying neuronal response on CP measurements. Finally, we demonstrate that serious artifacts can arise in reaction time tasks that use varying measurement intervals if the mean neuronal response and mean behavioral performance vary over time within trials. To emphasize the importance of addressing these concerns in neurophysiological data, we present a set of data collected from V1 cells in macaque monkeys while the animals performed a detection task. PMID:22993262
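
Choice probability is commonly computed as the area under the ROC curve comparing the response distributions for the two behavioral choices, which is equivalent to the Mann-Whitney statistic. A minimal sketch of that computation (the z-score pooling bias discussed in the abstract arises one step later, when responses are combined across stimulus conditions):

```python
def choice_probability(resp_choice1, resp_choice2):
    """CP as the Mann-Whitney AUC: the probability that a response drawn
    from choice-1 trials exceeds one drawn from choice-2 trials, with
    ties counted as one half."""
    n1, n2 = len(resp_choice1), len(resp_choice2)
    greater = sum(1 for a in resp_choice1 for b in resp_choice2 if a > b)
    ties = sum(1 for a in resp_choice1 for b in resp_choice2 if a == b)
    return (greater + 0.5 * ties) / (n1 * n2)

# Identical response distributions carry no choice-related signal:
print(choice_probability([1, 2, 3], [1, 2, 3]))  # 0.5
```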

  7. Towards a Global Water Scarcity Risk Assessment Framework: Incorporation of Probability Distributions and Hydro-Climatic Variability

    NASA Technical Reports Server (NTRS)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-01-01

    Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, affecting up to more than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels.

  8. Towards a global water scarcity risk assessment framework: incorporation of probability distributions and hydro-climatic variability

    NASA Astrophysics Data System (ADS)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-02-01

    Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR’s definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases given all future scenarios, up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.

  9. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    USGS Publications Warehouse

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. Basin and climatic characteristics were computed using geographic information software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses. Annual exceedance-probability discharge estimates were computed for 278 streamgages by using the expected moments algorithm to fit a log-Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data from water year 1844 to 2012. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized multiple Grubbs-Beck test was used to detect potentially influential low floods. Annual peak flows less than a minimum recordable discharge at a streamgage were incorporated into the at-site station analyses. An updated regional skew coefficient was determined for the State of Missouri using Bayesian weighted least-squares/generalized least squares regression analyses. At-site skew estimates for 108 long-term streamgages with 30 or more years of record and the 35 basin characteristics defined for this study were used to estimate the regional variability in skew. However, a constant generalized-skew value of -0.30 and a mean square error of 0.14 were determined in this study. Previous flood studies indicated that the distinct physical features of the three physiographic provinces have a pronounced effect on the magnitude of flood peaks. Trends in the magnitudes of the residuals from preliminary statewide regression analyses from previous studies confirmed that regional analyses in this study were
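The stated equivalence between annual exceedance probabilities (AEPs) and flood recurrence intervals is simply the reciprocal relationship T = 1/p; a minimal illustration:

```python
def recurrence_interval(aep_percent):
    """Recurrence interval (years) equivalent to an annual exceedance probability (%)."""
    return 100.0 / aep_percent

# The report's eight design discharges: 50% AEP <-> 2-year flood, ..., 0.2% AEP <-> 500-year flood.
equivalents = {aep: recurrence_interval(aep) for aep in (50, 20, 10, 4, 2, 1, 0.5, 0.2)}
```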

  10. Cancer risk estimates from radiation therapy for heterotopic ossification prophylaxis after total hip arthroplasty

    SciTech Connect

    Mazonakis, Michalis; Berris, Theoharris; Damilakis, John; Lyraraki, Efrossyni

    2013-10-15

    Purpose: Heterotopic ossification (HO) is a frequent complication following total hip arthroplasty. This study was conducted to calculate the radiation dose to organs-at-risk and estimate the probability of cancer induction from radiotherapy for HO prophylaxis.Methods: Hip irradiation for HO with a 6 MV photon beam was simulated with the aid of a Monte Carlo model. A realistic humanoid phantom representing an average adult patient was implemented in Monte Carlo environment for dosimetric calculations. The average out-of-field radiation dose to stomach, liver, lung, prostate, bladder, thyroid, breast, uterus, and ovary was calculated. The organ-equivalent-dose to colon, which was partly included within the treatment field, was also determined. Organ dose calculations were carried out using three different field sizes. The dependence of organ doses upon the block insertion into primary beam for shielding colon and prosthesis was investigated. The lifetime attributable risk for cancer development was estimated using organ, age, and gender-specific risk coefficients.Results: For a typical target dose of 7 Gy, organ doses varied from 1.0 to 741.1 mGy depending upon the field dimensions and organ location relative to the field edge. Blocked field irradiations resulted in a dose range of 1.4–146.3 mGy. The most probable detriment from open field treatment of male patients was colon cancer with a high risk of 564.3 × 10{sup −5} to 837.4 × 10{sup −5} depending upon the organ dose magnitude and the patient's age. The corresponding colon cancer risk for female patients was (372.2–541.0) × 10{sup −5}. The probability of bladder cancer development was more than 113.7 × 10{sup −5} and 110.3 × 10{sup −5} for males and females, respectively. The cancer risk range to other individual organs was reduced to (0.003–68.5) × 10{sup −5}.Conclusions: The risk for cancer induction from radiation therapy for HO prophylaxis after total hip arthroplasty varies considerably by the

  11. Derivation of Failure Rates and Probability of Failures for the International Space Station Probabilistic Risk Assessment Study

    NASA Technical Reports Server (NTRS)

    Vitali, Roberto; Lutomski, Michael G.

    2004-01-01

    National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options to determine the lowest risk option and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probability of failures of the basic events employed in the PRA model of the ISS. The paper will show how a Bayesian approach was used with different sources of data including the actual ISS on orbit failures to enhance the confidence in results of the PRA. As time progresses and more meaningful data is gathered from on orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems such as propulsion, thermal control, or power generation into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement units (ORU). The ORU level was chosen consistently with the level of statistically meaningful data that could be obtained from the aerospace industry and from the experts in the field. For example, data was gathered for the solenoid valves present in the propulsion system of the ISS. However valves themselves are composed of parts and the individual failure of these parts was not accounted for in the PRA model. In other words the failure of a spring within a valve was considered a failure of the valve itself.
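A common conjugate-Bayesian sketch of this kind of failure-rate update uses a gamma prior on a Poisson failure rate; the paper does not specify its exact priors, so the model and the numbers below are illustrative assumptions, not the ISS PRA's actual values:

```python
def update_failure_rate(alpha_prior, beta_prior, failures, exposure_hours):
    """Gamma(alpha, beta) prior on a Poisson failure rate (failures/hour).

    The gamma prior is conjugate to Poisson counts, so the posterior is
    Gamma(alpha + failures, beta + exposure_hours); the posterior mean
    is the updated point estimate of the rate."""
    alpha_post = alpha_prior + failures
    beta_post = beta_prior + exposure_hours
    return alpha_post, beta_post, alpha_post / beta_post

# Hypothetical ORU: generic-industry prior of ~1 failure per 10,000 hours,
# updated with 2 observed on-orbit failures over 50,000 hours of operation.
a_post, b_post, rate = update_failure_rate(1.0, 10_000.0, 2, 50_000.0)
```

As more on-orbit failures accumulate, the observed counts dominate the prior, which mirrors the paper's point that the distributions become increasingly accurate over time.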

  12. Development of new risk score for pre-test probability of obstructive coronary artery disease based on coronary CT angiography.

    PubMed

    Fujimoto, Shinichiro; Kondo, Takeshi; Yamamoto, Hideya; Yokoyama, Naoyuki; Tarutani, Yasuhiro; Takamura, Kazuhisa; Urabe, Yoji; Konno, Kumiko; Nishizaki, Yuji; Shinozaki, Tomohiro; Kihara, Yasuki; Daida, Hiroyuki; Isshiki, Takaaki; Takase, Shinichi

    2015-09-01

    Existing methods to calculate pre-test probability of obstructive coronary artery disease (CAD) have been established using selected high-risk patients who were referred to conventional coronary angiography. The purpose of this study is to develop and validate our new method for pre-test probability of obstructive CAD using patients who underwent coronary CT angiography (CTA), which could be applicable to a wider range of patient population. Using consecutive 4137 patients with suspected CAD who underwent coronary CTA at our institution, a multivariate logistic regression model including clinical factors as covariates was used to calculate the pre-test probability (K-score) of obstructive CAD determined by coronary CTA. The K-score was compared with the Duke clinical score using the area under the curve (AUC) for the receiver-operating characteristic curve. External validation was performed by an independent sample of 319 patients. The final model included eight significant predictors: age, gender, coronary risk factors (hypertension, diabetes mellitus, dyslipidemia, smoking), history of cerebral infarction, and chest symptom. The AUC of the K-score was significantly greater than that of the Duke clinical score for both derivation (0.736 vs. 0.699) and validation (0.714 vs. 0.688) data sets. Among patients who underwent coronary CTA, the newly developed K-score had better pre-test prediction ability for obstructive CAD compared to the Duke clinical score in a Japanese population. PMID:24770610

  13. Estimation of wildfire size and risk changes due to fuels treatments

    USGS Publications Warehouse

    Cochrane, M.A.; Moran, C.J.; Wimberly, M.C.; Baer, A.D.; Finney, M.A.; Beckendorf, K.L.; Eidenshink, J.; Zhu, Z.

    2012-01-01

    Human land use practices, altered climates, and shifting forest and fire management policies have increased the frequency of large wildfires several-fold. Mitigation of potential fire behaviour and fire severity has increasingly been attempted through pre-fire alteration of wildland fuels using mechanical treatments and prescribed fires. Despite annual treatment of more than a million hectares of land, quantitative assessments of the effectiveness of existing fuel treatments at reducing the size of actual wildfires or how they might alter the risk of burning across landscapes are currently lacking. Here, we present a method for estimating spatial probabilities of burning as a function of extant fuels treatments for any wildland fire-affected landscape. We examined the landscape effects of more than 72 000 ha of wildland fuel treatments involved in 14 large wildfires that burned 314 000 ha of forests in nine US states between 2002 and 2010. Fuels treatments altered the probability of fire occurrence both positively and negatively across landscapes, effectively redistributing fire risk by changing surface fire spread rates and reducing the likelihood of crowning behaviour. Trade-offs are created between formation of large areas with low probabilities of increased burning and smaller, well-defined regions with reduced fire risk.

  14. At Risk of What? Possibilities over Probabilities in the Study of Young Lives

    ERIC Educational Resources Information Center

    Foster, Karen Rebecca; Spencer, Dale

    2011-01-01

    This paper draws on a series of 45 interviews with recipients of social assistance between the ages of 16 and 24 to offer a critical assessment of the language of "risk" and "resilience." After briefly tracing the development of this vocabulary and approach in youth research, this paper argues in line with existing critiques (Kelly 2000, te Riele…

  15. Numeracy, Ratio Bias, and Denominator Neglect in Judgments of Risk and Probability

    ERIC Educational Resources Information Center

    Reyna, Valerie F.; Brainerd, Charles J.

    2008-01-01

    "Numeracy," so-called on analogy with literacy, is essential for making health and other social judgments in everyday life [Reyna, V. F., & Brainerd, C. J. (in press). The importance of mathematics in health and human judgment: Numeracy, risk communication, and medical decision making. "Learning and Individual Differences."]. Recent research on…

  16. The economic value of reducing environmental health risks: Contingent valuation estimates of the value of information

    SciTech Connect

    Krieger, D.J.; Hoehn, J.P.

    1999-05-01

    Obtaining economically consistent values for changes in low probability health risks continues to be a challenge for contingent valuation (CV) as well as for other valuation methods. One of the cited conditions for economic consistency is that estimated values be sensitive to the scope (differences in quantity or quality) of a good described in a CV application. The alleged limitations of CV pose a particular problem for environmental managers who must often make decisions that affect human health risks. This paper demonstrates that a well-designed CV application can elicit scope sensitive values even for programs that provide conceptually complex goods such as risk reduction. Specifically, it finds that the amount sport anglers are willing to pay for information about chemical residues in fish varies systematically with informativeness--a relationship suggested by the theory of information value.

  17. Estimation of the environmental risk of regulated river flow

    NASA Astrophysics Data System (ADS)

    Latu, Kilisimasi; Malano, Hector M.; Costelloe, Justin F.; Peterson, Tim J.

    2014-09-01

    A commonly accepted paradigm in environmental flow management is that a regulated river flow regime should mimic the natural hydrological regime to sustain the key attributes of freshwater ecosystems. Estimation of the environmental risk arising from flow regulation needs to consider all aspects of the flow regime when applied to water allocation decisions. We present a holistic, dynamic and robust approach that is based on a statistical analysis of the entire flow regime and accounts for flow stress indicators to produce an environmental risk time series based on the consequence of departures from the optimum flow range of a river or reach. When applied to a catchment (Campaspe River, southern Australia), the model produced a dynamic and robust environmental risk time series that clearly showed that when the observed river flow is drawn away from the optimum range of environmental flow demand, the environmental risk increased. In addition, the model produced risk time series showing that the Campaspe River has reversed seasonal patterns of river flow due to water releases during summer periods, which altered the flow nature of the river. Hence, this resulted in higher environmental risk in summer but lower risk in winter periods. Furthermore, we found that the vulnerability and coefficient of variation indices have the highest contributions to consequence in comparison to other indices used to calculate environmental risk.

  18. Estimation of earthquake risk curves of physical building damage

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias; Janouschkowetz, Silke; Fischer, Thomas; Simon, Christian

    2014-05-01

    In this study, a new approach to quantify seismic risks is presented. Here, the earthquake risk curves for the number of buildings with a defined physical damage state are estimated for South Africa. Therein, we define the physical damage states according to the current European macro-seismic intensity scale (EMS-98). The advantage of such kind of risk curve is that its plausibility can be checked more easily than for other types. The earthquake risk curve for physical building damage can be compared with historical damage and their corresponding empirical return periods. The number of damaged buildings from historical events is generally explored and documented in more detail than the corresponding monetary losses. The latter are also influenced by different economic conditions, such as inflation and price hikes. Further on, the monetary risk curve can be derived from the developed risk curve of physical building damage. The earthquake risk curve can also be used for the validation of underlying sub-models such as the hazard and vulnerability modules.

  19. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    USGS Publications Warehouse

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-01-01

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor < 2.5) and statistical significance at the 95-percent confidence level (p ≤ 0.05). Generalized least-squares regression was used to develop the final regression models for each flood region. Average standard errors of prediction of the generalized least-squares models ranged from 32.76 to 59.53 percent, with the largest range in flood region D. Pseudo coefficients of determination of the generalized least-squares models ranged from 90.29 to 97.28 percent, with the largest range also in flood region D. The regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization
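The variance-inflation-factor screen mentioned in the abstract reduces to a one-line formula; a minimal sketch (note that the study's cutoff of VIF < 2.5 corresponds to R² < 0.6 for a predictor regressed on the remaining predictors):

```python
def variance_inflation_factor(r_squared_j):
    """VIF for predictor j, computed from the R-squared of regressing
    predictor j on all the other candidate basin characteristics."""
    return 1.0 / (1.0 - r_squared_j)
```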

  20. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing

    PubMed Central

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C

    2016-01-01

    Background Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Objective Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. Methods We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). Results We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. Conclusions CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk. PMID:26800642

  1. A Review of Expertise and Judgment Processes for Risk Estimation

    SciTech Connect

    R. L. Boring

    2007-06-01

    A major challenge of risk and reliability analysis for human errors or hardware failures is the need to enlist expert opinion in areas for which adequate operational data are not available. Experts enlisted in this capacity provide probabilistic estimates of reliability, typically comprised of a measure of central tendency and uncertainty bounds. While formal guidelines for expert elicitation are readily available, they largely fail to provide a theoretical basis for expertise and judgment. This paper reviews expertise and judgment in the context of risk analysis; overviews judgment biases, the role of training, and multivariate judgments; and provides guidance on the appropriate use of atomistic and holistic judgment processes.

  2. Probability Distribution Estimated From the Minimum, Maximum, and Most Likely Values: Applied to Turbine Inlet Temperature Uncertainty

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    2004-01-01

    Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal
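The conventional method described above can be sketched using the common PERT convention, which takes the standard deviation to be one-sixth of the range and the mean as (min + 4·mode + max)/6, then solves for the shape parameters by method of moments. This is an illustrative assumption, not the NASA Glenn in-house method:

```python
def beta_parameters(minimum, most_likely, maximum):
    """Fit a four-parameter beta on [minimum, maximum] from three estimates.

    PERT convention: mean = (min + 4*mode + max)/6, sd = range/6.
    Shape parameters follow from the method of moments on the
    distribution rescaled to [0, 1]."""
    rng = maximum - minimum
    mean = (minimum + 4.0 * most_likely + maximum) / 6.0
    mu = (mean - minimum) / rng            # mean rescaled to [0, 1]
    var = (rng / 6.0) ** 2 / rng ** 2      # variance rescaled: always 1/36
    common = mu * (1.0 - mu) / var - 1.0
    alpha = mu * common
    beta = (1.0 - mu) * common
    return alpha, beta, mean
```

For a symmetric case (minimum 0, most likely 0.5, maximum 1) this yields alpha = beta = 4, the familiar symmetric PERT beta.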

  3. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway and dose-response relationships by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA). Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used for simulating the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA and the public health risk was estimated as probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at Kalvøya-small and Kalvøya-big beaches, supporting the advice of avoiding swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355
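QMRA dose-response calculations of this kind often use the exponential model, in which each ingested organism is assumed to initiate infection independently with probability r. A minimal sketch; the parameter values are illustrative assumptions, not figures from the study (r is pathogen-specific):

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response model common in QMRA: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical single-exposure risk for an assumed ingested dose and r.
risk_single_exposure = p_infection_exponential(dose=10.0, r=0.1)
```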

  4. Model Assembly for Estimating Cell Surviving Fraction for Both Targeted and Nontargeted Effects Based on Microdosimetric Probability Densities

    PubMed Central

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of anti-apoptotic protein Bcl-2 known to frequently occur in human cancer was also considered by introducing the concept of the adaptive response in the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fraction of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeam or broadbeam of energetic heavy ions, as well as the WI-38 normal human fibroblasts irradiated with X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth being incorporated into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities. PMID:25426641

  5. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    PubMed

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of anti-apoptotic protein Bcl-2 known to frequently occur in human cancer was also considered by introducing the concept of the adaptive response in the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fraction of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeam or broadbeam of energetic heavy ions, as well as the WI-38 normal human fibroblasts irradiated with X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth being incorporated into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities. PMID:25426641

  6. Maximum likelihood estimation of label imperfection probabilities and its use in the identification of mislabeled patterns. [with application to Landsat MSS data processing

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1980-01-01

    Estimating label imperfections and the use of estimations in the identification of mislabeled patterns are discussed. Expressions are presented for the asymptotic variances of the probability of correct classification and proportion, and for the maximum likelihood estimates of classification errors and a priori probabilities. Models are developed for imperfections in the labels and classification errors, and expressions are derived for the probability of imperfect label identification schemes resulting in wrong decisions. The expressions are used in computing thresholds and the techniques are given practical applications. The imperfect label identification scheme in the multiclass case is found to amount to establishing a region around each decision surface, and decisions of the label correction scheme are found in close agreement with the analyst-interpreter interpretations of the imagery films. As an example, the application of the maximum likelihood estimation to the processing of Landsat MSS data is discussed.

  7. Estimating debris-flow probability using fan stratigraphy, historic records, and drainage-basin morphology, Interstate 70 highway corridor, central Colorado, U.S.A

    USGS Publications Warehouse

    Coe, J.A.; Godt, J.W.; Parise, M.; Moscariello, A.

    2003-01-01

    We have used stratigraphic and historic records of debris flows to estimate mean recurrence intervals of past debris-flow events on 19 fans along the Interstate 70 highway corridor in the Front Range of Colorado. Estimated mean recurrence intervals were used in the Poisson probability model to estimate the probability of future debris-flow events on the fans. Mean recurrence intervals range from 7 to about 2900 years. Annual probabilities range from less than 0.1% to about 13%. A regression analysis of mean recurrence interval data and drainage-basin morphometry yields a regression model that may be suitable to estimate mean recurrence intervals on fans with no stratigraphic or historic records. Additional work is needed to verify this model. © 2003 Millpress.
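The Poisson probability model named in the abstract gives the chance of at least one event in a time window directly from the mean recurrence interval; a minimal sketch, whose endpoint values are consistent with the abstract's reported range (about 13% per year for a 7-year interval, under 0.1% for a 2900-year interval):

```python
import math

def poisson_event_probability(mean_recurrence_years, window_years=1.0):
    """P(at least one event in the window) under a Poisson arrival model
    with rate 1 / mean_recurrence_years events per year."""
    return 1.0 - math.exp(-window_years / mean_recurrence_years)

# The study's two endpoint fans: 7-year and ~2900-year mean recurrence intervals.
p_frequent = poisson_event_probability(7.0)     # roughly 13% per year
p_rare = poisson_event_probability(2900.0)      # well under 0.1% per year
```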

  8. Clinical probability and risk analysis of patients with suspected pulmonary embolism

    PubMed Central

    Yetgin, Gulden Ozeren; Aydin, Sule Akkose; Koksal, Ozlem; Ozdemir, Fatma; Mert, Dilek Kostak; Torun, Gokhan

    2014-01-01

    BACKGROUND: Pulmonary embolism (PE) is one of the diseases most frequently missed in overcrowded emergency departments, as in Turkey. Early and accurate diagnosis could decrease the mortality rate, and a standard algorithm should be defined. This study aims to identify accurate, fast, non-invasive, cost-effective, easy-to-access diagnostic tests and clinical scoring systems, and the patients who should be tested, for the clinical diagnosis of PE in the emergency department. METHODS: One hundred and forty patients admitted to the emergency department with suspected PE, based on history, physical examination and risk factors, were included in this prospective, cross-sectional study. Patients with a diagnosis of pulmonary embolism, acute coronary syndrome, infection or chronic obstructive pulmonary disease (COPD) were excluded from the study. The demographics, risk factors, radiological findings, vital signs, symptoms, physical and laboratory findings, diagnostic tests and clinical scoring systems (Wells and Geneva) were noted. The diagnostic criteria for pulmonary embolism were: a filling defect in the pulmonary artery lumen on spiral computed tomographic angiography and a perfusion defect on perfusion scintigraphy. RESULTS: In total, 90 (64%) of the patients had PE. Age, hypotension and deep vein thrombosis were risk factors, and oxygen saturation, shock index, BNP, troponin and fibrinogen levels as biochemical parameters differed significantly between the PE (+) and PE (−) groups (P<0.05). The Wells scoring system was more successful than the other scoring systems. CONCLUSION: Biochemical parameters, clinical findings and scoring systems, when used together, can contribute to the diagnosis of PE. PMID:25548599

  9. Aftershocks hazard in Italy Part I: Estimation of time-magnitude distribution model parameters and computation of probabilities of occurrence

    NASA Astrophysics Data System (ADS)

    Lolli, Barbara; Gasperini, Paolo

    We analyzed the available instrumental data on Italian earthquakes from 1960 to 1996 to compute the parameters of the time-magnitude distribution model proposed by Reasenberg and Jones (1989) and currently used to make aftershock forecasts in California. From 1981 to 1996 we used the recently released Catalogo Strumentale dei Terremoti Italiani (CSTI) (Instrumental Catalog Working Group, 2001), joining the data of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and of the major Italian local seismic networks, with magnitudes revalued according to Gasperini (2001). From 1960 to 1980 we used instead the Progetto Finalizzato Geodinamica (PFG) catalog (Postpischl, 1985), with magnitudes corrected to be homogeneous with the following period. About 40 sequences were detected using two different algorithms, and the results of the modeling for the corresponding sequences are compared. The average values of the distribution parameters (p = 0.93±0.21, Log10(c) = -1.53±0.54, b = 0.96±0.18 and a = -1.66±0.72) are in fair agreement with similar computations performed in other regions of the world. We also analyzed the spatial variation of model parameters, which can be used to predict sequence behavior in the first days of a future Italian seismic crisis, before a reliable modeling of the ongoing sequence is available. Moreover, some nomograms to expeditiously estimate probabilities and rates of aftershocks in Italy are also computed.
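The Reasenberg and Jones (1989) model named above combines a modified Omori time decay with a Gutenberg-Richter magnitude term. A sketch using the average Italian parameters quoted in the abstract (the numerical integration is our simplification; the original work uses closed-form expressions):

```python
import math

# Average Italian parameters reported in the abstract.
A, B, P_EXP, C = -1.66, 0.96, 0.93, 10 ** -1.53

def aftershock_rate(t_days, mag_main, mag_min):
    """Reasenberg-Jones rate of aftershocks with magnitude >= mag_min
    at time t after a mainshock of magnitude mag_main (illustrative)."""
    return 10 ** (A + B * (mag_main - mag_min)) / (t_days + C) ** P_EXP

def aftershock_probability(t1, t2, mag_main, mag_min, steps=10000):
    """P(at least one aftershock in [t1, t2]) = 1 - exp(-integral of the
    rate), integrated here with a simple midpoint rule."""
    dt = (t2 - t1) / steps
    total = sum(aftershock_rate(t1 + (i + 0.5) * dt, mag_main, mag_min) * dt
                for i in range(steps))
    return 1.0 - math.exp(-total)

# e.g. probability of an M>=5 aftershock in the week after an M6 mainshock
print(aftershock_probability(0.01, 7.0, 6.0, 5.0))
```

The decaying rate reflects the Omori law: the hazard is highest immediately after the mainshock.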

  10. Risk estimation of infectious diseases determines the effectiveness of the control strategy

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Zhang, Jie; Li, Ping; Small, Michael; Wang, Binghong

    2011-05-01

    Whether to take vaccination or not is usually a voluntary decision, determined by many factors, from societal factors (such as religious belief and human rights) to individual preferences (including psychology and altruism). Facing outbreaks of infectious diseases, different people often make different estimates of the risk of infection, so some are willing to vaccinate while others prefer to take their chances. In this paper, we establish two different risk assessment systems using the technique of dynamic programming, and then compare the effects of the two systems on the prevention of diseases on complex networks. In one, the perceived probability of being infected is the same for every individual (uniform case); in the other, the perceived probability of being infected is positively correlated with individual degree (preferential case). We show that these two risk assessment systems can yield completely different results, for example in the effectiveness of controlling diseases and in the time evolution of the number of infections.
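The two perceived-risk rules compared in the abstract can be sketched on a toy degree sequence. The exact scaling used in the preferential case is an assumption of ours, not taken from the paper:

```python
def perceived_risk_uniform(degrees, base_risk):
    """Uniform case: every individual perceives the same infection probability."""
    return [base_risk for _ in degrees]

def perceived_risk_preferential(degrees, base_risk):
    """Preferential case: perceived probability grows with node degree,
    normalized here by the mean degree (one simple choice of scaling;
    the paper's exact functional form may differ)."""
    mean_k = sum(degrees) / len(degrees)
    return [min(1.0, base_risk * k / mean_k) for k in degrees]

degrees = [1, 2, 2, 3, 3, 5, 10, 50]   # toy hub-dominated degree sequence
uniform = perceived_risk_uniform(degrees, 0.2)
preferential = perceived_risk_preferential(degrees, 0.2)
# Under the preferential rule the hub (degree 50) perceives far higher
# risk than a leaf node, so it is far more likely to choose vaccination.
```

Because hubs drive epidemic spread on heterogeneous networks, concentrating vaccination on them is what makes the two assessment systems yield such different control outcomes.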

  11. Use of binary logistic regression technique with MODIS data to estimate wild fire risk

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Di, Liping; Yang, Wenli; Bonnlander, Brian; Li, Xiaoyan

    2007-11-01

    Many forest fires occur across the globe each year, destroying life and property and strongly impacting ecosystems. In recent years, wildland fires and altered fire disturbance regimes have become a significant management and science problem affecting ecosystems and the wildland/urban interface across the United States and globally. In this paper, we discuss the estimation of 504 probability models for forecasting fire risk for 14 fuel types, 12 months, and one day/week/month in advance, using 19 years of historical fire data in addition to meteorological and vegetation variables. MODIS land products are utilized as a major data source, and binary logistic regression was adopted to estimate fire probability. To better model the change of fire risk with the transition of seasons, spatial and temporal stratification strategies were applied. To explore the possibility of real-time prediction, the MATLAB Distributed Computing Toolbox was used to accelerate the prediction. Finally, this study evaluates and validates the predictions against collected ground truth. Validation results indicate that these fire risk models achieve nearly 70% prediction accuracy, and that MODIS data are a viable data source for near real-time fire risk prediction.
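The binary logistic regression at the core of these models can be illustrated with a toy fit. Features and data below are hypothetical stand-ins; the study's actual predictors come from MODIS land products and meteorology:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Binary logistic regression trained by stochastic gradient descent
    (a toy stand-in for the study's 504 MODIS-based models)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi)))
            err = p - yi              # gradient of the log-loss
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

# Hypothetical rows: [fuel dryness index, surface temperature anomaly]
X = [[0.9, 1.2], [0.8, 0.9], [0.2, -0.5], [0.1, -1.0], [0.7, 1.0], [0.3, -0.2]]
y = [1, 1, 0, 0, 1, 0]               # 1 = fire occurred, 0 = no fire
w, b = fit_logistic(X, y)
risk = sigmoid(b + sum(wj * xj for wj, xj in zip(w, [0.85, 1.1])))
print(risk)                          # fire-risk probability for a new pixel
```

In the study's setting, one such model would be fitted per fuel type, month, and lead time, which is how the count of 504 models arises.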

  12. Information Use Differences in Hot and Cold Risk Processing: When Does Information About Probability Count in the Columbia Card Task?

    PubMed Central

    Markiewicz, Łukasz; Kubińska, Elżbieta

    2015-01-01

    Objective: This paper aims to provide insight into information processing differences between hot and cold risk taking decision tasks within a single domain. Decision theory defines risky situations using at least three parameters: outcome one (often a gain) with its probability and outcome two (often a loss) with a complementary probability. Although a rational agent should consider all of the parameters, s/he could potentially narrow their focus to only some of them, particularly when explicit Type 2 processes do not have the resources to override implicit Type 1 processes. Here we investigate differences in risky situation parameters' influence on hot and cold decisions. Although previous studies show lower information use in hot than in cold processes, they do not provide decision weight changes and therefore do not explain whether this difference results from worse concentration on each parameter of a risky situation (probability, gain amount, and loss amount) or from ignoring some parameters. Methods: Two studies were conducted, with participants performing the Columbia Card Task (CCT) in either its Cold or Hot version. In the first study, participants also performed the Cognitive Reflection Test (CRT) to monitor their ability to override Type 1 processing cues (implicit processes) with Type 2 explicit processes. Because hypothesis testing required comparison of the relative importance of risky situation decision weights (gain, loss, probability), we developed a novel way of measuring information use in the CCT by employing a conjoint analysis methodology. Results: Across the two studies, results indicated that in the CCT Cold condition decision makers concentrate on each information type (gain, loss, probability), but in the CCT Hot condition they concentrate mostly on a single parameter: probability of gain/loss. We also show that an individual's CRT score correlates with information use propensity in cold but not hot tasks. 
Thus, the affective dimension of

  13. Estimated incidence and risk factors of sudden unexpected death

    PubMed Central

    Lin, Feng-Chang; Mehta, Neil; Mounsey, Louisa; Nwosu, Anthony; Pursell, Irion; Chung, Eugene H; Mounsey, J Paul; Simpson, Ross J

    2016-01-01

    Objective In this manuscript, we estimate the incidence and identify risk factors for sudden unexpected death in a socioeconomically and racially diverse population in one county in North Carolina. Estimates of the incidence and risk factors contributing to sudden death vary widely. The Sudden Unexpected Death in North Carolina (SUDDEN) project is a population-based investigation of the incidence and potential causes of sudden death. Methods From 3 March 2013 to 2 March 2014, all out-of-hospital deaths in Wake County, North Carolina, were screened to identify presumed sudden unexpected death among free-living residents between the ages of 18 and 64 years. Death certificate, public and medical records were reviewed and adjudicated to confirm sudden unexpected death cases. Results Following adjudication, 190 sudden unexpected deaths including 122 men and 68 women were identified. Estimated incidence was 32.1 per 100 000 person-years overall: 42.7 among men and 22.4 among women. The majority of victims were white, unmarried men over age 55 years, with unwitnessed deaths at home. Hypertension and dyslipidaemia were common in men and women. African-American women dying from sudden unexpected death were over-represented. Women who were under age 55 years with coronary disease accounted for over half of female participants with coronary artery disease. Conclusions The overall estimated incidence of sudden unexpected death may account for approximately 10% of all deaths classified as ‘natural’. Women have a lower estimated incidence of sudden unexpected death than men. However, we found no major differences in age or comorbidities between men and women. African-Americans and young women with coronary disease are at risk for sudden unexpected death. PMID:27042316

  14. Improved risk estimates for carbon tetrachloride. 1998 annual progress report

    SciTech Connect

    Benson, J.M.; Springer, D.L.; Thrall, K.D.

    1998-06-01

    The overall purpose of these studies is to improve the scientific basis for assessing the cancer risk associated with human exposure to carbon tetrachloride. Specifically, the toxicokinetics of inhaled carbon tetrachloride are being determined in rats, mice and hamsters. Species differences in the metabolism of carbon tetrachloride by rats, mice and hamsters are being determined in vivo and in vitro using tissues and microsomes from these rodent species and man. Dose-response relationships will be determined in all studies. The information will be used to improve the current physiologically based pharmacokinetic model for carbon tetrachloride. The authors will also determine whether carbon tetrachloride is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation. In combination, the results of these studies will provide the types of information needed to enable a refined risk estimate for carbon tetrachloride under EPA's new guidelines for cancer risk assessment.

  15. A comparison of genetic risk score with family history for estimating prostate cancer risk

    PubMed Central

    Helfand, Brian T

    2016-01-01

    Prostate cancer (PCa) testing is recommended by most authoritative groups for high-risk men including those with a family history of the disease. However, family history information is often limited by patient knowledge and clinician intake, and thus, many men are incorrectly assigned to different risk groups. Alternate methods to assess PCa risk are required. In this review, we discuss how genetic variants, referred to as PCa-risk single-nucleotide polymorphisms (SNPs), can be used to calculate a genetic risk score (GRS). GRS assigns a relatively unique value to all men based on the number of PCa-risk SNPs that an individual carries. This GRS value can provide a more precise estimate of a man's PCa risk. This is particularly relevant in situations when an individual is unaware of his family history. In addition, GRS has utility and can provide a more precise estimate of risk even among men with a positive family history. It can even distinguish risk among relatives with the same degree of family relationship. Taken together, this review provides support for the clinical utility of GRS as an independent test that supplements family history. As such, GRS can serve as a platform to help guide shared decision-making processes regarding the timing and frequency of PCa testing and biopsies. PMID:27004541

  16. Risk estimation based on chromosomal aberrations induced by radiation

    NASA Technical Reports Server (NTRS)

    Durante, M.; Bonassi, S.; George, K.; Cucinotta, F. A.

    2001-01-01

    The presence of a causal association between the frequency of chromosomal aberrations in peripheral blood lymphocytes and the risk of cancer has been substantiated recently by epidemiological studies. Cytogenetic analyses of crew members of the Mir Space Station have shown that a significant increase in the frequency of chromosomal aberrations can be detected after flight, and that such an increase is likely to be attributed to the radiation exposure. The risk of cancer can be estimated directly from the yields of chromosomal aberrations, taking into account some aspects of individual susceptibility and other factors unrelated to radiation. However, the use of an appropriate technique for the collection and analysis of chromosomes and the choice of the structural aberrations to be measured are crucial in providing sound results. Based on the fraction of aberrant lymphocytes detected before and after flight, the relative risk after a long-term Mir mission is estimated to be about 1.2-1.3. The new technique of mFISH can provide useful insights into the quantification of risk on an individual basis.

  17. Effect of recent changes in atomic bomb survivor dosimetry on cancer mortality risk estimates.

    PubMed

    Preston, Dale L; Pierce, Donald A; Shimizu, Yukiko; Cullings, Harry M; Fujita, Shoichiro; Funamoto, Sachiyo; Kodama, Kazunori

    2004-10-01

    The Radiation Effects Research Foundation has recently implemented a new dosimetry system, DS02, to replace the previous system, DS86. This paper assesses the effect of the change on risk estimates for radiation-related solid cancer and leukemia mortality. The changes in dose estimates were smaller than many had anticipated, with the primary systematic change being an increase of about 10% in gamma-ray estimates for both cities. In particular, an anticipated large increase of the neutron component in Hiroshima for low-dose survivors did not materialize. However, DS02 improves on DS86 in many details, including the specifics of the radiation released by the bombs and the effects of shielding by structures and terrain. The data used here extend the last reported follow-up for solid cancers by 3 years, with a total of 10,085 deaths, and extend the follow-up for leukemia by 10 years, with a total of 296 deaths. For both solid cancer and leukemia, estimated age-time patterns and sex differences are virtually unchanged by the dosimetry revision. The estimates of solid-cancer radiation risk per sievert and the curvilinear dose response for leukemia are both decreased by about 8% by the dosimetry revision, due to the increase in the gamma-ray dose estimates. The apparent shape of the dose response is virtually unchanged by the dosimetry revision, but for solid cancers, the additional 3 years of follow-up has some effect. In particular, there is for the first time a statistically significant upward curvature for solid cancer on the restricted dose range 0-2 Sv. However, the low-dose slope of a linear-quadratic fit to that dose range should probably not be relied on for risk estimation, since that is substantially smaller than the linear slopes on ranges 0-1 Sv, 0-0.5 Sv, and 0-0.25 Sv. Although it was anticipated that the new dosimetry system might reduce some apparent dose overestimates for Nagasaki factory workers, this did not materialize, and factory workers have

  18. Risk cross sections and their application to risk estimation in the galactic cosmic-ray environment

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.; Nealy, J. E.; Wilson, J. W.; Chatterjee, A. (Principal Investigator)

    1995-01-01

    Radiation risk cross sections (i.e. risks per particle fluence) are discussed in the context of estimating the risk of radiation-induced cancer on long-term space flights from the galactic cosmic radiation outside the confines of the earth's magnetic field. Such quantities are useful for handling effects not seen after low-LET radiation. Since appropriate cross-section functions for cancer induction for each particle species are not yet available, the conventional quality factor is used as an approximation to obtain numerical results for risks of excess cancer mortality. Risks are obtained for seven of the most radiosensitive organs as determined by the ICRP [stomach, colon, lung, bone marrow (BFO), bladder, esophagus and breast], beneath 10 g/cm2 aluminum shielding at solar minimum. Spectra are obtained for excess relative risk for each cancer per LET interval by calculating the average fluence-LET spectrum for the organ and converting to risk by multiplying by a factor proportional to R gamma L Q(L) before integrating over L, the unrestricted LET. Here R gamma is the risk coefficient for low-LET radiation (excess relative mortality per Sv) for the particular organ in question. The total risks of excess cancer mortality obtained are 1.3 and 1.1% to female and male crew, respectively, for a 1-year exposure at solar minimum. Uncertainties in these values are estimated to range between factors of 4 and 15 and are dominated by the biological uncertainties in the risk coefficients for low-LET radiation and in the LET (or energy) dependence of the risk cross sections (as approximated by the quality factor). The direct substitution of appropriate risk cross sections will eventually circumvent entirely the need to calculate, measure or use absorbed dose, equivalent dose and quality factor for such a high-energy charged-particle environment.
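The LET-spectrum conversion described above can be written schematically; the notation here is ours, not taken verbatim from the paper:

```latex
% ERR for organ T: the fluence-LET spectrum weighted by LET and the
% quality factor, scaled by the organ's low-LET excess relative risk
% coefficient R_\gamma (schematic form of the abstract's description).
\mathrm{ERR}_T \;\propto\; R_\gamma \int_0^{\infty} \Phi_T(L)\, L\, Q(L)\, \mathrm{d}L
```

Here \(\Phi_T(L)\) is the average fluence per unit LET in organ T, \(L\,\Phi_T(L)\) is proportional to absorbed dose, and \(Q(L)\) is the conventional quality factor standing in for the unknown risk cross section.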

  19. Risk Assessment Study of Fluoride Salts: Probability-Impact Matrix of Renal and Hepatic Toxicity Markers.

    PubMed

    Usuda, Kan; Ueno, Takaaki; Ito, Yuichi; Dote, Tomotaro; Yokoyama, Hirotaka; Kono, Koichi; Tamaki, Junko

    2016-09-01

    The present risk assessment study of fluoride salts was conducted by oral administration of three different doses of sodium and potassium fluorides (NaF, KF) and zinc fluoride tetrahydrate (ZnF2·4H2O) to male Wistar rats. The rats were divided into control and nine experimental groups, to which oral injections of 0.5 mL distilled water and 0.5 mL of fluoride solutions, respectively, were given. The dosage of fluoride compounds was adjusted to contain 2.1 mg (low-dose group, LG), 4.3 mg (mid-dose group, MG), and 5.4 mg fluoride per 200 g rat body weight (high-dose group, HG) corresponding to 5%, 10%, and 12.5% of LD50 values for NaF. The 24-h urine volume, N-acetyl-β-D-glucosaminidase (NAG) and creatinine clearance (Ccr) were measured as markers of possible acute renal impact. The levels of alanine aminotransferase (ALT) and aspartate aminotransferase (AST) were determined in serum samples as markers of acute hepatic impact. The levels of serum and urinary fluoride were determined to evaluate fluoride bioavailability. The results reveal that higher doses of NaF, KF, and ZnF2 induced renal damage as indicated by higher urinary NAG (p < 0.05 with ≥90th percentile of control). High doses of ZnF2 also induced a significant Ccr decrease (p < 0.05 with ≤10th percentile of control). Low doses of NaF and mid-doses of ZnF2 induced polyuria (p < 0.05 with ≥90th percentile of control) while medium doses of NaF and low doses of KF also induced liver damage, as indicated by a high level of AST (p < 0.05 with ≥90th percentile of control). These findings suggest that oral administration of fluoride is a potential, dose-dependent risk factor of renal tubular damage. PMID:26892107

  20. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    Normal mixture distributions models have been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the model in empirical finance, fitting it to the real data. Second, we present its application in risk analysis, evaluating VaR and CVaR with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
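For a fitted two-component normal mixture, VaR is the relevant quantile of the mixture and CVaR is the mean of the tail beyond it, which has a closed form. A sketch of this computation; the component weights and parameters below are hypothetical stand-ins for values fitted to a return series (e.g. by EM), not the paper's estimates:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def mixture_cdf(x, weights, mus, sigmas):
    return sum(w * norm_cdf((x - m) / s)
               for w, m, s in zip(weights, mus, sigmas))

def var_cvar(alpha, weights, mus, sigmas):
    """VaR and CVaR (as positive losses) of a normal mixture of returns."""
    lo, hi = -10.0, 10.0                 # bisection for the alpha-quantile
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, weights, mus, sigmas) < alpha:
            lo = mid
        else:
            hi = mid
    q = 0.5 * (lo + hi)
    # E[X | X <= q]: each normal component contributes mu*Phi(z) - sigma*phi(z).
    tail_mean = sum(w * (m * norm_cdf((q - m) / s) - s * norm_pdf((q - m) / s))
                    for w, m, s in zip(weights, mus, sigmas)) / alpha
    return -q, -tail_mean

# Hypothetical fitted components: a calm regime and a volatile regime.
var95, cvar95 = var_cvar(0.05, [0.8, 0.2], [0.01, -0.02], [0.03, 0.10])
print(var95, cvar95)                     # CVaR always exceeds VaR
```

The small-weight, high-variance component is what lets the mixture reproduce the fat left tail that a single normal misses.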

  1. Risk estimation for fast neutrons with regard to solid cancer.

    PubMed

    Kellerer, A M; Walsh, L

    2001-12-01

    In the absence of epidemiological information on the effects of neutrons, their cancer mortality risk coefficient is currently taken as the product of two low-dose extrapolations: the nominal risk coefficient for photons and the presumed maximum relative biological effectiveness of neutrons. This approach is unnecessary. Since linearity in dose is assumed for neutrons at low to moderate effect levels, the risk coefficient can be derived in terms of the excess risk from epidemiological observations at an intermediate dose of gamma rays and an assumed value, R(1), of the neutron RBE relative to this reference dose of gamma rays. Application of this procedure to the A-bomb data requires accounting for the effect of the neutron dose component, which, according to the current dosimetry system, DS86, amounts on average to 11 mGy in the two cities at a total dose of 1 Gy. With R(1) tentatively set to 20 or 50, it is concluded that the neutrons have caused 18% or 35%, respectively, of the total effect at 1 Gy. The excess relative risk (ERR) for neutrons then lies between 8 per Gy and 16 per Gy. Translating these values into risk coefficients in terms of the effective dose, E, requires accounting for the gamma-ray component produced by the neutron field in the human body, which will require a separate analysis. The risk estimate for neutrons will remain essentially unaffected by the current reassessment of the neutron doses in Hiroshima, because the doses are unlikely to change much at the reference dose of 1 Gy. PMID:11741494
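The 18% and 35% attribution figures in this abstract follow from simple RBE-weighted dose arithmetic, which can be reproduced as follows (function name ours):

```python
def neutron_effect_fraction(rbe, neutron_dose_gy=0.011, total_dose_gy=1.0):
    """Fraction of the total effect at 1 Gy attributable to the neutron
    component, weighting the 11 mGy neutron dose by its assumed RBE
    (reproduces the abstract's figures for R(1) = 20 and 50)."""
    gamma_dose = total_dose_gy - neutron_dose_gy
    weighted_neutron = rbe * neutron_dose_gy
    return weighted_neutron / (gamma_dose + weighted_neutron)

print(round(neutron_effect_fraction(20) * 100))  # close to the abstract's 18%
print(round(neutron_effect_fraction(50) * 100))  # close to the abstract's 35%
```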

  2. Leukemia risk associated with benzene exposure in the pliofilm cohort. II. Risk estimates.

    PubMed

    Paxton, M B; Chinchilli, V M; Brett, S M; Rodricks, J V

    1994-04-01

    The detailed work histories of the individual workers composing the Pliofilm cohort represent a unique resource for estimating the dose-response for leukemia that may follow occupational exposure to benzene. In this paper, we report the results of analyzing the updated Pliofilm cohort using the proportional hazards model, a more sophisticated technique that uses more of the available exposure data than the conditional logistic model used by Rinsky et al. The more rigorously defined exposure estimates derived by Paustenbach et al. are consistent with those of Crump and Allen in giving estimates of the slope of the leukemogenic dose-response that are not as steep as the slope resulting from the exposure estimates of Rinsky et al. We consider estimates of 0.3-0.5 additional leukemia deaths per thousand workers with 45 ppm-years of cumulative benzene exposure to be the best estimates currently available of leukemia risk from occupational exposure to benzene. These risks were estimated in the proportional hazards model when the exposure estimates of Crump and Allen or of Paustenbach et al. were used to derive a cumulative concentration-by-time metric. PMID:8008924

  3. Cancer Risk Estimates from Space Flight Estimated Using Yields of Chromosome Damage in Astronaut's Blood Lymphocytes

    NASA Technical Reports Server (NTRS)

    George, Kerry A.; Rhone, J.; Chappell, L. J.; Cucinotta, F. A.

    2011-01-01

    To date, cytogenetic damage has been assessed in blood lymphocytes from more than 30 astronauts before and after they participated in long-duration space missions of three months or more on board the International Space Station. Chromosome damage was assessed using fluorescence in situ hybridization whole chromosome analysis techniques. For all individuals, the frequency of chromosome damage measured within a month of return from space was higher than their preflight yield, and biodosimetry estimates were within the range expected from physical dosimetry. Follow up analyses have been performed on most of the astronauts at intervals ranging from around 6 months to many years after flight, and the cytogenetic effects of repeat long-duration missions have so far been assessed in four individuals. Chromosomal aberrations in peripheral blood lymphocytes have been validated as biomarkers of cancer risk and cytogenetic damage can therefore be used to characterize excess health risk incurred by individual crewmembers after their respective missions. Traditional risk assessment models are based on epidemiological data obtained on Earth in cohorts exposed predominantly to acute doses of gamma-rays, and the extrapolation to the space environment is highly problematic, involving very large uncertainties. Cytogenetic damage could play a key role in reducing uncertainty in risk estimation because it is incurred directly in the space environment, using specimens from the astronauts themselves. Relative cancer risks were estimated from the biodosimetry data using the quantitative approach derived from the European Study Group on Cytogenetic Biomarkers and Health database. Astronauts were categorized into low, medium, or high tertiles according to their yield of chromosome damage. Age adjusted tertile rankings were used to estimate cancer risk and results were compared with values obtained using traditional modeling approaches. Individual tertile rankings increased after space

  4. Spatio-temporal population estimates for risk management

    NASA Astrophysics Data System (ADS)

    Cockings, Samantha; Martin, David; Smith, Alan; Martin, Rebecca

    2013-04-01

    Accurate estimation of population at risk from hazards and effective emergency management of events require not just appropriate spatio-temporal modelling of hazards but also of population. While much recent effort has been focused on improving the modelling and predictions of hazards (both natural and anthropogenic), there has been little parallel advance in the measurement or modelling of population statistics. Different hazard types occur over diverse temporal cycles, are of varying duration and differ significantly in their spatial extent. Even events of the same hazard type, such as flood events, vary markedly in their spatial and temporal characteristics. Conceptually and pragmatically then, population estimates should also be available for similarly varying spatio-temporal scales. Routine population statistics derived from traditional censuses or surveys are usually static representations in both space and time, recording people at their place of usual residence on census/survey night and presenting data for administratively defined areas. Such representations effectively fix the scale of population estimates in both space and time, which is unhelpful for meaningful risk management. Over recent years, the Pop24/7 programme of research, based at the University of Southampton (UK), has developed a framework for spatio-temporal modelling of population, based on gridded population surfaces. Based on a data model which is fully flexible in terms of space and time, the framework allows population estimates to be produced for any time slice relevant to the data contained in the model. It is based around a set of origin and destination centroids, which have capacities, spatial extents and catchment areas, all of which can vary temporally, such as by time of day, day of week, season. 
A background layer, containing information on features such as transport networks and landuse, provides information on the likelihood of people being in certain places at specific times

  5. Estimating Worker Risk Levels Using Accident/Incident Data

    SciTech Connect

    Kenoyer, Judson L.; Stenner, Robert D.; Andrews, William B.; Scherpelz, Robert I.; Aaberg, Rosanne L.

    2000-09-26

    The purpose of the work described in this report was to identify methods currently used in the Department of Energy (DOE) complex to identify and control hazards/risks in the workplace, evaluate their effectiveness in reducing risk to workers, and develop a preliminary method that could be used to predict the relative risks to workers performing proposed tasks using some of the current methodology. This report describes some of the performance indicators (i.e., safety metrics) currently used to track relative levels of workplace safety in the DOE complex, how these fit into an Integrated Safety Management (ISM) system, some strengths and weaknesses of using a statistically based set of indicators, and methods to evaluate them. Also discussed are methods used to reduce risk to workers and some of the techniques that appear to be working in the process of establishing a condition of continuous improvement. The results of these methods will be used in future work on the determination of modifying factors for a more complex model. The preliminary method to predict the relative risk level to workers during an extended future time period is based on a currently used performance indicator drawing on several factors tracked in CAIRS. The relative risks for workers in a sample (but real) facility on the Hanford site are estimated for a time period of twenty years, based on workforce predictions. This is the first step in developing a more complex model that will incorporate other modifying factors related to the workers, the work environment and the status of the ISM system to adjust the preliminary prediction.

  6. Time-Dependent Risk Estimation and Cost-Benefit Analysis for Mitigation Actions

    NASA Astrophysics Data System (ADS)

    van Stiphout, T.; Wiemer, S.; Marzocchi, W.

    2009-04-01

Earthquakes strongly cluster in space and time. Consequently, the most dangerous time is right after a moderate earthquake has happened, because there is a 'high' (i.e., 2-5 percent) probability that this event will be followed by an aftershock as large as or larger than the initiating event. The seismic hazard during this period exceeds the background probability by several orders of magnitude. Scientists have developed increasingly accurate forecast models of this time-dependent hazard, and such models are currently being validated in prospective testing. However, this probabilistic hazard information is difficult to digest for decision makers, the media and the general public. Here, we introduce a possible bridge between seismology and decision makers (authorities, civil defense) by proposing a more objective way to perform time-dependent risk assessment. Short Term Earthquake Risk assessment (STEER) combines aftershock hazard and loss assessments. We use site-specific information on site effects and building class distribution and combine this with existing loss models to compute site-specific time-dependent risk curves (probability of exceedance for fatalities, injuries, damages, etc.). We show the effect of uncertainties in the different components using Monte Carlo simulations of the input parameters. These time-dependent risk curves can act as decision support. We extend the STEER approach by introducing a Cost-Benefit approach for certain mitigation actions after a medium-sized earthquake. Such Cost-Benefit approaches have recently been developed for volcanic risk assessment to rationalize precautionary evacuations in densely inhabited areas threatened by volcanoes. Here we extend the concept to time-dependent probabilistic seismic risk assessment. For the Cost-Benefit analysis of mitigation actions we calculate the ratio between the cost for the mitigation actions and the cost of the

  7. Impaired risk evaluation in people with Internet gaming disorder: fMRI evidence from a probability discounting task.

    PubMed

    Lin, Xiao; Zhou, Hongli; Dong, Guangheng; Du, Xiaoxia

    2015-01-01

This study examined how subjects with Internet gaming disorder (IGD) modulate reward and risk at a neural level during a probability-discounting task, using functional magnetic resonance imaging (fMRI). Behavioral and imaging data were collected from 19 IGD subjects (22.2 ± 3.08 years) and 21 healthy controls (HC, 22.8 ± 3.5 years). Behavioral results showed that IGD subjects preferred the probabilistic options over fixed ones and exhibited shorter reaction times compared with HC. The fMRI results revealed that IGD subjects showed decreased activation in the inferior frontal gyrus and the precentral gyrus when choosing the probabilistic options, relative to HC. Correlations were also calculated between behavioral performance and brain activity in relevant brain regions. Both the behavioral and fMRI results indicate that people with IGD show impaired risk evaluation, which might explain why IGD subjects continue playing online games despite the widely known risk of negative consequences. PMID:25218095

  8. Stochastic approach for an unbiased estimation of the probability of a successful separation in conventional chromatography and sequential elution liquid chromatography.

    PubMed

    Ennis, Erin J; Foley, Joe P

    2016-07-15

A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20 to 300, and 0.017 to 1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach < probability obtained with the gradient stochastic approach ≤ probability predicted by Davis and Stoll < probability predicted by Martin et al. The differences are explained by the positive bias of the Martin equation and the lower average resolution observed for the isocratic simulations compared to the gradient simulations with the same peak capacity. When the stochastic results are applied to conventional HPLC and sequential elution liquid chromatography (SE-LC), the latter is shown to provide much greater probabilities of success for moderately complex samples (e.g., P(HPLC) = 31.2% versus P(SE-LC) = 69.1% for 12 components and the same analysis time). For a given number of components, the density of probability data provided over the range of peak capacities is sufficient to allow accurate interpolation of probabilities for peak capacities not reported, with <1.5% error for saturation factors <0.20. Additional applications for the stochastic approach include isothermal and programmed-temperature gas chromatography. PMID:27286646
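
    As a sketch of the stochastic approach under the constant-peak-width ("gradient") condition, the success probability can be estimated by dropping component peaks uniformly on a normalized retention window and counting the trials in which every adjacent pair of peak centers is at least one peak width apart. The function name and the uniform-retention assumption below are illustrative, not taken from the paper:

```python
import random

def p_success(m, nc, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that all m randomly placed
    components are separated, under constant-peak-width ("gradient")
    conditions, for a separation with peak capacity nc."""
    rng = random.Random(seed)
    min_gap = 1.0 / nc  # adjacent peak centers must differ by >= one peak width
    hits = 0
    for _ in range(trials):
        centers = sorted(rng.random() for _ in range(m))
        if all(b - a >= min_gap for a, b in zip(centers, centers[1:])):
            hits += 1
    return hits / trials
```

    For this toy model the analytic value for m uniformly placed components is (1 - (m-1)/nc)^m, which the simulation reproduces; it also reproduces the qualitative trend in the abstract, namely that success probability falls sharply as the number of components grows relative to the peak capacity.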

  9. Global Building Inventory for Earthquake Loss Estimation and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) the UN Habitat demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia, and (5) other literature.

  10. Estimation of risks associated with paediatric cochlear implantation.

    PubMed

    Johnston, J Cyne; Smith, Andrée Durieux; Fitzpatrick, Elizabeth; O'Connor, Annette; Angus, Douglas; Benzies, Karen; Schramm, David

    2010-09-01

The objectives of this study were to estimate the rates of complications associated with paediatric cochlear implantation (a) at one Canadian cochlear implant (CI) centre and (b) in the published literature. The study comprised a retrospective hospital-based chart review and a concurrent review of complications reported in the published literature. There were 224 children who had undergone surgery from 1994 to June 2007. Results indicate that the rates of complications at the local Canadian paediatric CI centre are not significantly different from the literature rates for all examined complication types. This hospital-based retrospective chart review and literature review provide readers with an estimate of the risks, to aid in evidence-based decision-making surrounding paediatric cochlear implantation. PMID:19655302

  11. Cryptosporidium and Giardia in tropical recreational marine waters contaminated with domestic sewage: estimation of bathing-associated disease risks.

    PubMed

    Betancourt, Walter Q; Duarte, Diana C; Vásquez, Rosa C; Gurian, Patrick L

    2014-08-15

Sewage is a major contributor to pollution problems involving human pathogens in tropical coastal areas. This study investigated the occurrence of intestinal protozoan parasites (Giardia and Cryptosporidium) in tropical recreational marine waters contaminated with sewage. The potential risks of Cryptosporidium and Giardia infection from recreational water exposure were estimated from the levels of viable (oo)cysts (DIC+, DAPI+, PI-) found in near-shore swimming areas, using an exponential dose-response model. A Monte Carlo uncertainty analysis was performed in order to determine the probability distribution of risks. Microbial indicators of recreational water quality (enterococci, Clostridium perfringens) and genetic markers of sewage pollution (the human-specific Bacteroidales marker [HF183] and Clostridium coccoides) were simultaneously evaluated in order to estimate the extent of water quality deterioration associated with human wastes. The study revealed a potential risk of parasite infection via primary contact with tropical marine waters contaminated with sewage, with higher risk estimates for Giardia than for Cryptosporidium. Mean risks estimated by Monte Carlo simulation were below the U.S. EPA upper bound on recreational risk of 0.036 for cryptosporidiosis and giardiasis for both children and adults. However, the 95th percentile estimates of giardiasis risk for children exceeded the 0.036 level. Environmental surveillance of microbial pathogens is crucial in order to control and eradicate the effects that increasing anthropogenic impacts have on marine ecosystems and human health. PMID:24975093
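
    The exponential dose-response model named in the abstract has a simple closed form, P(infection) = 1 - exp(-r · dose), and the Monte Carlo step propagates uncertainty in exposure through it. The sketch below is illustrative only: the dose-response parameter value and the exponentially distributed ingestion volume are assumptions, not values from the study.

```python
import math
import random

# Assumed, illustrative dose-response parameter (not taken from the paper).
R_GIARDIA = 0.0199

def exp_dose_response(dose, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def mc_swim_risk(cysts_per_L, r, mean_ingested_mL=30.0, trials=10000, seed=1):
    """Monte Carlo distribution of per-swim infection risk, treating the
    ingested water volume as uncertain (exponentially distributed here,
    purely for illustration).  Returns (mean risk, 95th percentile risk)."""
    rng = random.Random(seed)
    risks = sorted(
        exp_dose_response(
            cysts_per_L * rng.expovariate(1.0 / mean_ingested_mL) / 1000.0, r)
        for _ in range(trials)
    )
    return sum(risks) / trials, risks[int(0.95 * trials)]
```

    Reporting both the mean and an upper percentile mirrors the abstract's finding that mean risks sat below the 0.036 reference level while 95th percentile estimates could exceed it.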

  12. A Method to Estimate the Probability That Any Individual Lightning Stroke Contacted the Surface Within Any Radius of Any Point

    NASA Technical Reports Server (NTRS)

    Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.

    2010-01-01

A technique has been developed to calculate the probability that any nearby lightning stroke was within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than reporting only the error ellipse centered on the stroke. The process takes the bivariate Gaussian probability density implied by the lightning location error ellipse around the most likely stroke location and integrates it to obtain the probability that the stroke occurred inside any specified radius. This new facility-centric technique will be much more useful to space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
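
    Such an integral is easy to check numerically: sample candidate stroke locations from the bivariate Gaussian defined by the error ellipse and count the fraction that fall within the radius of the point of interest. The function and its parameterization below are an illustrative sketch, not the operational implementation, which integrates the density directly:

```python
import math
import random

def p_within_radius(stroke, facility, sx, sy, rho, radius, n=200000, seed=0):
    """Monte Carlo integral of the bivariate Gaussian location-error density
    over a circle: the probability that the true stroke location lies within
    `radius` of the facility.  sx, sy are the error-ellipse standard
    deviations and rho the correlation between the two error components."""
    rng = random.Random(seed)
    mx, my = stroke
    fx, fy = facility
    croot = math.sqrt(1.0 - rho * rho)  # from the Cholesky factor of the covariance
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x = mx + sx * z1
        y = my + sy * (rho * z1 + croot * z2)
        if (x - fx) ** 2 + (y - fy) ** 2 <= radius ** 2:
            hits += 1
    return hits / n
```

    In the circular, facility-centered special case (sx = sy = 1, rho = 0) the answer has the closed form 1 - exp(-R²/2), which makes a convenient sanity check.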

  13. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10^-2 per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^6 Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^5 E. coli per 100 ml; however, the wastewater quality should be ≤10^4 E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10^4 E. coli per 100 ml results in a rotavirus infection risk of approximately 10^-2 pppy; however, again based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278
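
    A per-person-per-year reference level like the one above aggregates independent per-exposure risks over a year of exposures. The standard QMRA conversion, with illustrative helper names, is:

```python
def annual_risk(per_event_risk, events_per_year):
    """Annualize independent per-exposure risks:
    P_annual = 1 - (1 - P_event) ** n."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

def allowed_event_risk(annual_target, events_per_year):
    """Invert the relation: the per-exposure risk consistent with a given
    annual reference level (e.g. the 10^-2 pppy level used above)."""
    return 1.0 - (1.0 - annual_target) ** (1.0 / events_per_year)
```

    For rare events the annual risk is approximately n × P_event, but the exact form matters as risks approach the reference level.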

  14. Estimate of the risks of disposing nonhazardous oil field wastes into salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-12-31

Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. Potential human health risks associated with hazardous substances (arsenic, benzene, cadmium, and chromium) in NOW were assessed under four postclosure cavern release scenarios: inadvertent cavern intrusion, failure of the cavern seal, failure of the cavern through cracks or leaky interbeds, and a partial collapse of the cavern roof. To estimate potential human health risks for these scenarios, contaminant concentrations at the receptor were calculated using a one-dimensional solution to an advection/dispersion equation that included first-order degradation. Assuming a single, generic salt cavern and generic oil-field wastes, the best-estimate excess cancer risks ranged from 1.7 × 10^-12 to 1.1 × 10^-8 and hazard indices (referring to noncancer health effects) ranged from 7 × 10^-9 to 7 × 10^-4. Under worst-case conditions in which the probability of cavern failure is 1.0, excess cancer risks ranged from 4.9 × 10^-9 to 1.7 × 10^-5 and hazard indices ranged from 7.0 × 10^-4 to 0.07. Even under worst-case conditions, the risks are within the US Environmental Protection Agency (EPA) target range for acceptable exposure levels. From a human health risk perspective, salt caverns can, therefore, provide an acceptable disposal method for NOW.
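
    The transport calculation above rests on a one-dimensional advection/dispersion equation with first-order degradation. As an illustration of how decay attenuates receptor concentrations (this is the standard steady-state form for a continuous source, not necessarily the report's exact transient solution):

```python
import math

def steady_state_conc(c0, x, v, d, k):
    """Steady-state 1-D advection-dispersion with first-order decay:
    C(x) = C0 * exp((v*x / (2*D)) * (1 - sqrt(1 + 4*k*D / v**2)))
    where v is seepage velocity, D the dispersion coefficient, and
    k the first-order decay rate (consistent units assumed)."""
    return c0 * math.exp(
        (v * x / (2.0 * d)) * (1.0 - math.sqrt(1.0 + 4.0 * k * d / v ** 2)))
```

    With k = 0 the expression reduces to C(x) = C0, and any positive decay rate produces monotonic attenuation with distance, which is why degradable contaminants contribute far less risk at distant receptors.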

  15. Common Cause Case Study: An Estimated Probability of Four Solid Rocket Booster Hold-Down Post Stud Hang-ups

    NASA Technical Reports Server (NTRS)

    Cross, Robert

    2005-01-01

Until Solid Rocket Motor ignition, the Space Shuttle is mated to the Mobile Launch Platform in part via eight (8) Solid Rocket Booster (SRB) hold-down bolts. The bolts are fractured using redundant pyrotechnics, and are designed to drop through a hold-down post on the Mobile Launch Platform before the Space Shuttle begins movement. The Space Shuttle program has experienced numerous failures where a bolt has "hung up"; that is, it did not clear the hold-down post before liftoff and was caught by the SRBs. This places an additional structural load on the vehicle that was not included in the original certification requirements. The Space Shuttle is currently being certified to withstand the loads induced by up to three (3) of eight (8) SRB hold-down post studs experiencing a "hang-up". The results of loads analyses performed for four (4) stud hang-ups indicate that the internal vehicle loads exceed current structural certification limits at several locations. To determine the risk to the vehicle from four (4) stud hang-ups, the likelihood of the scenario occurring must first be evaluated. Prior to the analysis discussed in this paper, the likelihood of occurrence had been estimated assuming that the stud hang-ups were completely independent events; that is, it was assumed that no common causes or factors existed between the individual stud hang-up events. A review of the data associated with the hang-up events showed that a common factor (timing skew) was present. This paper summarizes a revised likelihood evaluation performed for the four (4) stud hang-up case, considering that there are common factors associated with the stud hang-ups. The results show that explicitly taking into account the common factor of timing skew (i.e., not using standard common cause methodologies such as beta factor or Multiple Greek Letter modeling) results in an increase in the estimated likelihood of four (4) stud hang-ups of an order of magnitude over the independent failure case.

  17. Hate Crimes and Stigma-Related Experiences among Sexual Minority Adults in the United States: Prevalence Estimates from a National Probability Sample

    ERIC Educational Resources Information Center

    Herek, Gregory M.

    2009-01-01

    Using survey responses collected via the Internet from a U.S. national probability sample of gay, lesbian, and bisexual adults (N = 662), this article reports prevalence estimates of criminal victimization and related experiences based on the target's sexual orientation. Approximately 20% of respondents reported having experienced a person or…

  18. Moving towards best practice when using inverse probability of treatment weighting (IPTW) using the propensity score to estimate causal treatment effects in observational studies.

    PubMed

    Austin, Peter C; Stuart, Elizabeth A

    2015-12-10

    The propensity score is defined as a subject's probability of treatment selection, conditional on observed baseline covariates. Weighting subjects by the inverse probability of treatment received creates a synthetic sample in which treatment assignment is independent of measured baseline covariates. Inverse probability of treatment weighting (IPTW) using the propensity score allows one to obtain unbiased estimates of average treatment effects. However, these estimates are only valid if there are no residual systematic differences in observed baseline characteristics between treated and control subjects in the sample weighted by the estimated inverse probability of treatment. We report on a systematic literature review, in which we found that the use of IPTW has increased rapidly in recent years, but that in the most recent year, a majority of studies did not formally examine whether weighting balanced measured covariates between treatment groups. We then proceed to describe a suite of quantitative and qualitative methods that allow one to assess whether measured baseline covariates are balanced between treatment groups in the weighted sample. The quantitative methods use the weighted standardized difference to compare means, prevalences, higher-order moments, and interactions. The qualitative methods employ graphical methods to compare the distribution of continuous baseline covariates between treated and control subjects in the weighted sample. Finally, we illustrate the application of these methods in an empirical case study. We propose a formal set of balance diagnostics that contribute towards an evolving concept of 'best practice' when using IPTW to estimate causal treatment effects using observational data. PMID:26238958
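
    The weighted standardized difference described above can be sketched in a few lines. The ATE-style weights and helper names below are illustrative; in the toy population in the test, weighting by the true inverse probability of treatment balances the covariate exactly:

```python
def iptw_weights(treated, ps):
    """Inverse-probability-of-treatment (ATE) weights:
    1/ps for treated subjects, 1/(1 - ps) for controls."""
    return [1.0 / p if t else 1.0 / (1.0 - p) for t, p in zip(treated, ps)]

def weighted_std_diff(x, treated, w):
    """Weighted standardized difference of covariate x between treatment
    groups, in units of the pooled (weighted) standard deviation."""
    def wmean_var(group):
        pairs = [(xi, wi) for xi, ti, wi in zip(x, treated, w) if ti == group]
        sw = sum(wi for _, wi in pairs)
        m = sum(xi * wi for xi, wi in pairs) / sw
        v = sum(wi * (xi - m) ** 2 for xi, wi in pairs) / sw
        return m, v
    m1, v1 = wmean_var(1)
    m0, v0 = wmean_var(0)
    return (m1 - m0) / ((v1 + v0) / 2.0) ** 0.5
```

    A common rule of thumb treats absolute standardized differences below roughly 0.1 as acceptable balance; the paper's point is that this check should be reported, not assumed.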

  19. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.
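
    The composite model combines three attacker subprocesses into one expectation. The parameterization below is purely hypothetical, shown only to illustrate the probability-weighted structure; the published model derives the probabilities and mean times from vulnerability counts and attacker skill level:

```python
def expected_ttc(p_known, t_known, p_new, t_new, t_wait):
    """Illustrative composite time-to-compromise: with probability p_known a
    known exploit is available (mean time t_known); otherwise, with
    probability p_new a new exploit can be developed (t_new); otherwise the
    attacker must wait for a new vulnerability to appear (t_wait).
    Hypothetical parameterization, not the published one."""
    return p_known * t_known + (1 - p_known) * (
        p_new * t_new + (1 - p_new) * t_wait)
```

    The case-study result then falls out of the structure: remediation that removes many vulnerabilities overall can still leave p_known on the dominant attack path nearly unchanged, so the expected time-to-compromise barely moves.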

  20. Self-injury, substance use, and associated risk factors in a multi-campus probability sample of college students.

    PubMed

    Serras, Alisha; Saules, Karen K; Cranford, James A; Eisenberg, Daniel

    2010-03-01

    This research examined two questions: (1) What is the prevalence of self-injurious behavior (SIB) among college students, overall and by gender, academic level, and sexual orientation? (2) To what extent is SIB associated with different forms of substance use and other risk behaviors? A probability sample of 5,689 students completed an Internet survey on self-injury, mental health, and substance use. Past-year prevalence of SIB was 14.3%, with undergraduates significantly more likely than graduate students to engage in SIB. Drug use and frequent binge drinking were associated with higher rates of SIB. Among those who engaged in any SIB, those who used drugs had higher depression scores, higher prevalence of cigarette smoking, and higher rates of binge eating. In a multiple logistic regression model predicting SIB, depression, cigarette smoking, gambling, and drug use were significant predictors. Information about those at risk for SIB is critical for the design of prevention and intervention efforts as colleges continue to grapple with risky behaviors. PMID:20307119

  1. Estimates of mean consequences and confidence bounds on the mean associated with low-probability seismic events in total system performance assessments

    SciTech Connect

    Pensado, Osvaldo; Mancillas, James

    2007-07-01

    An approach is described to estimate mean consequences and confidence bounds on the mean of seismic events with low probability of breaching components of the engineered barrier system. The approach is aimed at complementing total system performance assessment models used to understand consequences of scenarios leading to radionuclide releases in geologic nuclear waste repository systems. The objective is to develop an efficient approach to estimate mean consequences associated with seismic events of low probability, employing data from a performance assessment model with a modest number of Monte Carlo realizations. The derived equations and formulas were tested with results from a specific performance assessment model. The derived equations appear to be one method to estimate mean consequences without having to use a large number of realizations. (authors)
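
    One simple way to attach a confidence bound to a mean estimated from a modest number of realizations is the normal approximation sketched below. This is an illustration of the general idea only, not the estimator derived in the paper:

```python
import math
import statistics

def mean_consequence_ci(consequences, event_prob, z=1.96):
    """Mean consequence of a low-probability event, with a normal-approximation
    confidence interval on the mean, from n Monte Carlo realizations of the
    conditional consequence.  Returns (mean, (lower, upper))."""
    n = len(consequences)
    m = statistics.fmean(consequences)
    se = statistics.stdev(consequences) / math.sqrt(n) if n > 1 else 0.0
    return event_prob * m, (event_prob * (m - z * se),
                            event_prob * (m + z * se))
```

    Conditioning on the event occurring and multiplying by its probability afterwards is what lets a modest number of realizations inform the unconditional mean, rather than wasting most realizations on the no-event outcome.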

  2. Moving towards a new paradigm for global flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, Tara J.; Devineni, Naresh; Lima, Carlos; Lall, Upmanu

    2013-04-01

model is implemented at a finer resolution (≤1 km) in order to more accurately model streamflow under flood conditions and estimate inundation. This approach allows for computationally efficient simulation of the hydrology when there is no potential for flooding, combined with high-resolution flood wave modeling when flooding potential exists. We demonstrate the results of this flood risk estimation system for the Ohio River basin in the United States, a large river basin that is historically prone to flooding, with the intention of using the system for global flood risk assessment.

  3. A comparison of conventional capture versus PIT reader techniques for estimating survival and capture probabilities of big brown bats (Eptesicus fuscus)

    USGS Publications Warehouse

    Ellison, L.E.; O'Shea, T.J.; Neubaum, D.J.; Neubaum, M.A.; Pearce, R.D.; Bowen, R.A.

    2007-01-01

We compared conventional capture (primarily mist nets and harp traps) and passive integrated transponder (PIT) tagging techniques for estimating capture and survival probabilities of big brown bats (Eptesicus fuscus) roosting in buildings in Fort Collins, Colorado. A total of 987 female adult and juvenile bats were captured and marked by subdermal injection of PIT tags during the summers of 2001-2005 at five maternity colonies in buildings. Openings to roosts were equipped with PIT hoop-style readers, and exit and entry of bats were passively monitored on a daily basis throughout the summers of 2002-2005. PIT readers 'recaptured' adult and juvenile females more often than conventional capture events at each roost. Estimates of annual capture probabilities for all five colonies were on average twice as high when estimated from PIT reader data (P̂ = 0.93-1.00) than when derived from conventional techniques (P̂ = 0.26-0.66), and as a consequence annual survival estimates were more precisely estimated when using PIT reader encounters. Short-term, daily capture estimates were also higher using PIT readers than conventional captures. We discuss the advantages and limitations of using PIT tags and passive encounters with hoop readers vs. conventional capture techniques for estimating these vital parameters in big brown bats. © Museum and Institute of Zoology PAS.

  4. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  5. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  6. Estimating the Pollution Risk of Cadmium in Soil Using a Composite Soil Environmental Quality Standard

    PubMed Central

    Huang, Biao; Zhao, Yongcun

    2014-01-01

Estimating standard-exceeding probabilities of toxic metals in soil is crucial for environmental evaluation. Because soil pH and land use types have strong effects on the bioavailability of trace metals in soil, they were taken into account by some environmental protection agencies in making composite soil environmental quality standards (SEQSs) that contain multiple metal thresholds under different pH and land use conditions. This study proposed a method for estimating the standard-exceeding probability map of soil cadmium using a composite SEQS. The spatial variability and uncertainty of soil pH and site-specific land use type were incorporated through simulated realizations by sequential Gaussian simulation. A case study was conducted using a sample data set from a 150 km² area in Wuhan City and the composite SEQS for cadmium, recently set by the State Environmental Protection Administration of China. The method may be useful for evaluating the pollution risks of trace metals in soil with composite SEQSs. PMID:24672364
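
    The core of a composite-standard exceedance calculation can be sketched as follows: at each location, compare joint realizations of the metal concentration and pH, with the threshold chosen per realization according to the simulated pH class. The pH classes and cadmium limits in the test are illustrative placeholders, not the official thresholds:

```python
def exceedance_prob(cd_sims, ph_sims, limits):
    """Fraction of joint (Cd, pH) realizations at a location in which the
    simulated Cd concentration exceeds the composite standard's pH-dependent
    threshold.  `limits` maps pH intervals [lo, hi) to Cd limits."""
    def limit_for(ph):
        for (lo, hi), t in limits.items():
            if lo <= ph < hi:
                return t
        raise ValueError("pH outside the standard's classes")
    pairs = list(zip(cd_sims, ph_sims))
    return sum(cd > limit_for(ph) for cd, ph in pairs) / len(pairs)
```

    Evaluating this per grid cell over sequential Gaussian simulation realizations yields the exceedance-probability map, with the pH uncertainty folded into the threshold selection rather than fixed in advance.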

  7. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    SciTech Connect

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-04-01

Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models
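
    The LKB quantities named above have standard closed forms: a generalized equivalent uniform dose built from the dose-volume histogram, followed by a probit link through TD50 and the slope parameter m. A minimal sketch (textbook formulas, not the paper's simplified implementation):

```python
import math

def geud(doses, vol_fracs, n):
    """Generalized equivalent uniform dose used by the LKB model:
    gEUD = (sum_i v_i * d_i**(1/n))**n, with volume fractions summing to 1.
    Small n (serial organ) pushes gEUD toward the maximum dose."""
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, vol_fracs)) ** n

def lkb_ntcp(eud, td50, m):
    """LKB NTCP: the standard normal CDF of t = (EUD - TD50) / (m * TD50)."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

    The volume-parameter debate in the abstract maps directly onto n in geud: for the tiny n conventionally assigned to cord (serial behavior), gEUD tracks the maximum cord dose, which is why the model with conventional parameters predicted far more complications than were observed.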

  8. Declining bioavailability and inappropriate estimation of risk of persistent compounds

    SciTech Connect

    Kelsey, J.W.; Alexander, M.

    1997-03-01

    Earthworms (Eisenia foetida) assimilated decreasing amounts of atrazine, phenanthrene, and naphthalene that had been incubated for increasing periods of time in sterile soil. The amount of atrazine and phenanthrene removed from soil by mild extractants also decreased with time. The declines in bioavailability of the three compounds to earthworms and of naphthalene to bacteria were not reflected by analysis involving vigorous methods of solvent extraction; similar results for bioavailability of phenanthrene and 4-nitrophenol to bacteria were obtained in a previous study conducted at this laboratory. The authors suggest that regulations based on vigorous extractions for the analyses of persistent organic pollutants in soil do not appropriately estimate exposure or risk to susceptible populations.

  9. Estimation of Tsunami Risk for the Caribbean Coast

    NASA Astrophysics Data System (ADS)

    Zahibo, N.

    2004-05-01

    The tsunami problem for the coast of the Caribbean basin is discussed. The historical tsunami data for the Caribbean Sea are briefly presented. Numerical simulation of potential tsunamis in the Caribbean Sea is performed in the framework of nonlinear shallow-water theory. The tsunami wave height distribution along the Caribbean coast is computed. These results are used to estimate the far-field tsunami potential of various coastal locations in the Caribbean Sea. Five low-tsunami-risk zones are selected on the basis of prognostic computations: the bay "Golfo de Batabano" and the coast of the province "Ciego de Avila" in Cuba, the Nicaraguan coast (between Bluefields and Puerto Cabezas), the border between Mexico and Belize, and the bay "Golfo de Venezuela" in Venezuela. The analysis of historical data confirms that no tsunamis have been recorded in the selected zones. Wave attenuation in the Caribbean Sea is also investigated; wave amplitude decreases by an order of magnitude when the tsunami source is located as far as 1000 km from the coastal location. Both factors, wave attenuation and wave height distribution, should be taken into account in the planned warning system for the Caribbean Sea. In particular, the problem of tsunami risk for the Lesser Antilles, including Guadeloupe, is discussed.

  10. How to Estimate Epidemic Risk from Incomplete Contact Diaries Data?

    PubMed

    Mastrandrea, Rossana; Barrat, Alain

    2016-06-01

    Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors make it possible to collect data about encounters and to build networks of contacts between individuals. The contact networks obtained from these different techniques are, however, quantitatively different. Here, we first show how these discrepancies affect the prediction of epidemic risk when these data are fed to numerical models of epidemic spread: the low participation rate, under-reporting of contacts, and overestimation of contact durations in contact diaries with respect to sensor data lead to important differences in the outcomes of the corresponding simulations, including an enhanced sensitivity to initial conditions. Most importantly, we investigate whether and how information gathered from contact diaries can be used in such simulations to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact networks built from contact sensors and diaries share several structural similarities, suggesting the possibility of constructing, from the contact diary network alone, a surrogate contact network such that simulations using the surrogate give the same estimate of the epidemic risk as simulations using the contact sensor network. We present and compare several methods to build such surrogate data, and show that good agreement between the outcomes of simulations using surrogate and sensor data is indeed possible, as long as the contact diary information is complemented by publicly available data describing the heterogeneity of the durations of human contacts. PMID:27341027
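    The numerical spreading models such data are fed into can be sketched, in much-simplified form, as a discrete-time stochastic SIR process over a contact edge list. The function and parameters below are illustrative, not the authors' implementation:

```python
import random

def sir_attack_size(edges, n_nodes, p_transmit, p_recover, seed_node=0, rng=None):
    """Discrete-time stochastic SIR on a static contact network.

    Each listed contact can transmit either way each step; infectious
    nodes recover with probability p_recover (> 0) per step. Returns the
    number of nodes ever infected (the attack size).
    """
    rng = rng or random.Random(42)
    state = ["S"] * n_nodes
    state[seed_node] = "I"
    while "I" in state:
        newly_infected = set()
        for i, j in edges:
            for src, dst in ((i, j), (j, i)):
                if state[src] == "I" and state[dst] == "S":
                    if rng.random() < p_transmit:
                        newly_infected.add(dst)
        for k in range(n_nodes):
            if state[k] == "I" and rng.random() < p_recover:
                state[k] = "R"
        for k in newly_infected:  # infected after recovery roll this step
            state[k] = "I"
    return sum(s != "S" for s in state)

# On a 4-node chain with certain transmission, the whole chain is reached
print(sir_attack_size([(0, 1), (1, 2), (2, 3)], 4, 1.0, 1.0))  # 4
```

    Running the same process on a diary-derived network and a sensor-derived network, and comparing attack sizes, is the kind of comparison the abstract describes.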

  12. Gambling disorder: estimated prevalence rates and risk factors in Macao.

    PubMed

    Wu, Anise M S; Lai, Mark H C; Tong, Kwok-Kit

    2014-12-01

    An excessive, problematic gambling pattern has been regarded as a mental disorder in the Diagnostic and Statistical Manual of Mental Disorders (DSM) for more than 3 decades (American Psychiatric Association [APA], 1980). In this study, its latest prevalence in Macao (one of very few cities with legalized gambling in China and the Far East) was estimated with 2 major changes in the diagnostic criteria suggested by the 5th edition of the DSM (APA, 2013): (a) removing the "Illegal Act" criterion, and (b) lowering the threshold for diagnosis. A random, representative sample of 1,018 Macao residents was surveyed with a phone poll design in January 2013. After the 2 changes were adopted, the present study showed that the estimated prevalence rate of gambling disorder was 2.1% of the Macao adult population. Moreover, the present findings also provided empirical support for the application of these 2 recommended changes when assessing symptoms of gambling disorder among Chinese community adults. Personal risk factors for gambling disorder, namely being male, having low education, a preference for casino gambling, and high materialism, were identified. PMID:25134026
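    For a point prevalence estimated from a survey of this size, a Wilson score interval conveys the sampling uncertainty. The count of 21 positives below is hypothetical, chosen only because 21/1,018 ≈ 2.1%:

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (k successes out of n trials), at the z-quantile given (95% default)."""
    p = k / n
    denom = 1.0 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

lo, hi = wilson_interval(21, 1018)  # roughly 1.4% to 3.1%
print(f"{lo:.3f} to {hi:.3f}")
```

    Even with a representative sample of ~1,000, a 2.1% point estimate carries an interval of roughly a percentage point on either side.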

  13. Estimating the Risk of Renal Stone Events During Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Reyes, David; Kerstman, Eric; Locke, James

    2014-01-01

    Introduction: Given the bone loss and increased urinary calcium excretion in the microgravity environment, persons participating in long-duration spaceflight may have an increased risk for renal stone formation. Renal stones are often an incidental finding of abdominal imaging studies done for other reasons. Thus, some crewmembers may have undiscovered, asymptomatic stones prior to their mission. Methods: An extensive literature search was conducted concerning the natural history of asymptomatic renal stones. For comparison, simulations were done using the Integrated Medical Model (IMM). The IMM is an evidence-based decision support tool that provides risk analysis and has the capability to optimize medical systems for missions by minimizing the occurrence of adverse mission outcomes, such as evacuation and loss of crew life, within specified mass and volume constraints. Results: The literature on the natural history of asymptomatic renal stones in the general medical population shows that the probability of a symptomatic event is 8% to 34% at 1 to 3 years for stones < 7 mm. Extrapolated to a 6-month mission, for stones < 5 to 7 mm, the risk for any stone event is about 4 to 6%, with a 0.7% to 4% risk for intervention, respectively. IMM simulations compare favorably with risk estimates garnered from the terrestrial literature. The IMM forecasts that symptomatic renal stones may be one of the top drivers for medical evacuation of an International Space Station (ISS) mission. Discussion: Although the likelihood of a stone event is low, the consequences could be severe due to limitations of current ISS medical capabilities. Therefore, these risks need to be quantified to aid planning, limit crew morbidity, and mitigate mission impacts. This will be especially critical for missions beyond Earth orbit, where evacuation may not be an option.
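    The extrapolation from multi-year terrestrial follow-up to a 6-month mission can be reproduced under a constant-hazard assumption (an assumption of this sketch, not necessarily the IMM's method):

```python
def rescale_risk(p_cum, horizon_years, window_years):
    """Rescale a cumulative event probability to a different exposure
    window, assuming a constant hazard rate over time."""
    return 1.0 - (1.0 - p_cum) ** (window_years / horizon_years)

# 8% over 1 year and 34% over 3 years, rescaled to a 6-month mission
print(round(100 * rescale_risk(0.08, 1, 0.5), 1))  # 4.1
print(round(100 * rescale_risk(0.34, 3, 0.5), 1))  # 6.7
```

    These figures roughly reproduce the "about 4 to 6%" range quoted in the abstract.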

  14. Semi-analytical estimation of wellbore leakage risk during CO2 sequestration in Ottawa County, Michigan

    NASA Astrophysics Data System (ADS)

    Guo, B.; Matteo, E. N.; Elliot, T. R.; Nogues, J. P.; Deng, H.; Fitts, J. P.; Pollak, M.; Bielicki, J.; Wilson, E.; Celia, M. A.; Peters, C. A.

    2011-12-01

    Using the semi-analytical ELSA model, wellbore leakage risk is estimated for CO2 injection into either the Mt. Simon or St. Peter formations, which are part of the Michigan Sedimentary Basin that lies beneath Ottawa County, MI. ELSA is a vertically integrated subsurface modeling tool that can be used to simulate both supercritical CO2 plume distribution/migration and pressure-induced brine displacement during CO2 injection. A composite 3D subsurface domain was constructed for the ELSA simulations based on estimated permeabilities for formation layers, as well as GIS databases containing subsurface stratigraphy, active and inactive wells, and potential interactions with other subsurface uses. These include potable aquifers, oil and gas reservoirs, and waste injection sites, which represent potential liabilities if encountered by brine or supercritical CO2 displaced from the injection formation. Overall, the 3D subsurface domain encompasses an area of 1500 km2 to a depth of 2 km and contains over 3,000 wells. The permeabilities for abandoned wells are derived from a ranking system based on available well data, including historical records and well logs. This distribution is then randomly sampled in Monte Carlo simulations that are used to generate a probability map for subsurface interferences or atmospheric release resulting from leakage of CO2 and/or brine from the injection formation. This method serves as the basis for comparative testing between various injection scenarios, as well as for comparing the relative risk of leakage between injection formations or storage sites.
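    The Monte Carlo step described above can be sketched as repeated sampling of per-well permeabilities, tallying trials in which any well exceeds a leakage threshold. The ranges and threshold below are invented for illustration, standing in for the ranking-based distributions:

```python
import random

def leak_probability(n_trials, well_perm_ranges, threshold, rng=None):
    """Monte Carlo estimate of the probability that at least one abandoned
    well leaks: each well's log10 permeability is drawn uniformly from its
    (low, high) range and compared against a leakage threshold."""
    rng = rng or random.Random(0)
    hits = 0
    for _ in range(n_trials):
        if any(rng.uniform(lo, hi) > threshold for lo, hi in well_perm_ranges):
            hits += 1
    return hits / n_trials

wells = [(-16.0, -12.0), (-15.0, -11.0), (-17.0, -13.0)]  # log10 m^2, hypothetical
p = leak_probability(20000, wells, threshold=-12.0)
print(p)  # ~0.25: only the second well's range extends above the threshold
```

    Repeating this per grid cell, with leakage routed to whichever receptor (aquifer, reservoir, atmosphere) the well intersects, yields the probability map the abstract describes.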

  15. Double-ended break probability estimate for the 304 stainless steel main circulation piping of a production reactor

    SciTech Connect

    Mehta, H.S.; Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.

    1991-12-31

    The large break frequency resulting from intergranular stress corrosion cracking in the main circulation piping of the Savannah River Site (SRS) production reactors has been estimated. Four factors are developed to describe the likelihood that a crack exists that is not identified by ultrasonic inspection, and that grows to instability prior to growing through-wall and being detected by the ensuing leakage. The estimated large break frequency is 3.4 × 10⁻⁸ per reactor-year.
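    The four-factor construction reads as a product of an initiating frequency with conditional screening-failure probabilities. The numbers below are placeholders for illustration, not the SRS values:

```python
from functools import reduce
from operator import mul

def break_frequency(initiating_freq, conditional_factors):
    """Large-break frequency as an initiating crack frequency (per
    reactor-year) times the conditional probabilities of each screening
    step failing (e.g. missed by UT, grows to instability, leak undetected)."""
    return initiating_freq * reduce(mul, conditional_factors, 1.0)

# All inputs hypothetical, for illustration only
print(f"{break_frequency(1e-3, [0.05, 0.01, 0.2]):.1e} per reactor-year")  # 1.0e-07 per reactor-year
```

    Stacking several independent screens multiplicatively is what drives such estimates down to the 10⁻⁸ range.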

  17. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations, we estimated the per-contact probability of transmission, by age, of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network, as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean times to death and recovery and the case fatality rate.
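    The small-world contact structure assumed by such agent-based models can be sketched with a Watts-Strogatz-style construction; this is a toy stand-in for the paper's far more detailed age-structured network:

```python
import random

def small_world_edges(n, k, p_rewire, rng=None):
    """Watts-Strogatz-style contact network: each node links to its k
    clockwise ring neighbours, and each link is rewired to a random
    target with probability p_rewire (creating long-range shortcuts)."""
    rng = rng or random.Random(1)
    edges = set()
    for i in range(n):
        for d in range(1, k + 1):
            j = (i + d) % n
            if rng.random() < p_rewire:  # replace with a random shortcut
                j = rng.randrange(n)
                while j == i or (min(i, j), max(i, j)) in edges:
                    j = rng.randrange(n)
            edges.add((min(i, j), max(i, j)))
    return sorted(edges)

contacts = small_world_edges(200, 2, 0.1)  # ~400 contacts, a few shortcuts
```

    Sweeping a per-contact transmission probability over simulations on such a network until the simulated case counts match reported ones is, in spirit, the fitting exercise the abstract describes.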

  18. Workshop on models to estimate military system probability of effect (P_E) due to incident radiofrequency energy: Volume 3, Written inputs from reviewers and other participants

    SciTech Connect

    Not Available

    1988-01-01

    The workshop on Models to Estimate Military System P_E (probability of effect) due to Incident Radio Frequency (RF) Energy was convened by Dr. John M. MacCallum, OUSDA (R&AT/EST), to assess the current state of the art and to evaluate the adequacy of ongoing effects assessment efforts to estimate P_E. Approximately fifty people from government, industry, and academia attended the meeting. Specifically, the workshop addressed the following: (1) current status of operations research models for assessing probability of effect (P_E) for red and blue mission analyses; (2) the main overall approaches for evaluating P_E's; (3) sources of uncertainty and ways P_E's could be credibly derived from the existing data base; and (4) the adequacy of the present framework of a national HPM assessment methodology for evaluation of P_E's credibility for future systems. 9 figs.

  19. ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

    SciTech Connect

    Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre

    2011-06-22

    This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth-percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth-percentile peak demand (per capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth-percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.
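    The transmission figures quoted above imply a simple linear derating with air temperature; treating the quoted 7.5%-per-5°C figure as linear across other temperature steps is an assumption of this sketch:

```python
def derated_capacity(base_mw, delta_t_c, loss_per_5c=0.075):
    """Thermal derating of a fully loaded transmission line: a fractional
    capacity loss (default ~7.5%) per 5 degrees C of added air temperature."""
    return base_mw * (1.0 - loss_per_5c * delta_t_c / 5.0)

print(derated_capacity(100.0, 5.0))  # 92.5
```

    In reality the derating follows conductor thermal-rating calculations and is not exactly linear; the sketch only reproduces the report's headline figure.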

  20. Childhood CT scans and cancer risk: impact of predisposing factors for cancer on the risk estimates.

    PubMed

    Journy, N; Roué, T; Cardis, E; Le Pointe, H Ducou; Brisse, H; Chateil, J-F; Laurier, D; Bernier, M-O

    2016-03-01

    To investigate the role of cancer-predisposing factors (PFs) in the associations between paediatric computed tomography (CT) scan exposures and subsequent risk of central nervous system (CNS) tumours and leukaemia, a cohort of children who underwent a CT scan in 2000-2010 in 23 French radiology departments was linked with the national childhood cancers registry and national vital status registry; information on PFs was retrieved through hospital discharge databases. In children without PFs, hazard ratios of 1.07 (95% CI 0.99-1.10) for CNS tumours (15 cases) and 1.16 (95% CI 0.77-1.27) for leukaemia (12 cases) were estimated for each 10 mGy increment in CT x-ray organ doses. These estimates were similar to those obtained in the whole cohort. In children with PFs, no positive dose-risk association was observed, possibly related to earlier non-cancer mortality in this group. Our results suggest a modifying effect of PFs on CT-related cancer risks, but need to be confirmed by longer follow-up and other studies. PMID:26878249
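    A hazard ratio quoted per 10 mGy extends to other doses under a log-linear dose-response assumption (an assumption of this sketch, not a claim about the study's model):

```python
def hazard_ratio_at(dose_mgy, hr_per_10mgy):
    """Log-linear extrapolation: HR(dose) = HR_10 ** (dose / 10 mGy)."""
    return hr_per_10mgy ** (dose_mgy / 10.0)

# HR of 1.07 per 10 mGy (CNS tumours, children without PFs) at 30 mGy
print(round(hazard_ratio_at(30, 1.07), 3))  # 1.225
```

    Note the confidence intervals above include 1.0, so such extrapolated point estimates carry substantial uncertainty.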

  1. Potential risk of using General Estimates System: bicycle safety.

    PubMed

    Kweon, Young-Jun; Lee, Joyoung

    2010-11-01

    Beneficial effects of bicycle helmet use have been reported mostly based on medical or survey data collected from hospitals. This study examined the validity of the United States General Estimates System (GES) database, familiar to many transportation professionals, for assessing the beneficial effect of helmet use in reducing the severity of injury to bicyclists, and found a potential risk of erroneous conclusions being drawn by a narrowly focused study when the GES database is used. Although the focus of the study was on bicycle helmet use, its findings regarding potential risk might hold for any type of traffic safety study using the GES data. A partial proportional odds model reflecting the intrinsic ordering of injury severity was mainly used. About 16,000 bicycle-involved traffic crash records occurring from 2003 through 2008 in the United States were extracted from the GES database. Using the 2003-2008 GES data, a beneficial effect of helmet use was found in 2007, yet a detrimental effect in 2004 and no effect in 2003, 2005, 2006, and 2008, which are contrary to past findings from medical or hospital survey data. It was speculated that these mixed results might be attributable to a possible lack of representativeness of the GES data for bicycle-involved traffic crashes, which may be supported by findings such as the average helmet use rate at the time of the crashes varying from 12% in 2004 to 38% in 2008. This suggests that the GES data may not be a reliable source for studying narrowly focused issues such as the effect of helmet use. A considerable fluctuation over years in basic statistical values (e.g., averages) of variables of interest (e.g., helmet use) may be an indication of a possible lack of representativeness of the GES data. In such a case, caution should be exercised in interpreting and generalizing analysis results. PMID:20728621

  2. The Impact of Perceived Frailty on Surgeons’ Estimates of Surgical Risk

    PubMed Central

    Ferguson, Mark K.; Farnan, Jeanne; Hemmerich, Josh A.; Slawinski, Kris; Acevedo, Julissa; Small, Stephen

    2015-01-01

    Background Physicians are only moderately accurate in estimating surgical risk based on clinical vignettes. We assessed the impact of perceived frailty by measuring the influence of a short video of a standardized patient on surgical risk estimates. Methods Thoracic surgeons and cardiothoracic trainees estimated the risk of major complications for lobectomy based on clinical vignettes of varied risk categories (low, average, high). After each vignette, subjects viewed a randomly selected video of a standardized patient exhibiting either vigorous or frail behavior, then re-estimated risk. Subjects were asked to rate 5 vignettes paired with 5 different standardized patients. Results Seventy-one physicians participated. Initial risk estimates varied according to the vignette risk category: low, 15.2 ± 11.2% risk; average, 23.7 ± 16.1%; high, 37.3 ± 18.9%; p<0.001 by ANOVA. Concordant information in vignettes and videos moderately altered estimates (high risk vignette, frail video: 10.6 ± 27.5% increase in estimate, p=0.006; low risk vignette, vigorous video: 14.5 ± 45.0% decrease, p=0.009). Discordant findings influenced risk estimates more substantially (high risk vignette, vigorous video: 21.2 ± 23.5% decrease in second risk estimate, p<0.001; low risk vignette, frail video: 151.9 ± 209.8% increase, p<0.001). Conclusions Surgeons differentiated relative risk of lobectomy based on clinical vignettes. The effect of viewing videos was small when vignettes and videos were concordant; the effect was more substantial when vignettes and videos were discordant. The information will be helpful in training future surgeons in frailty recognition and risk estimation. PMID:24932570

  3. Estimating urban flood risk - uncertainty in design criteria

    NASA Astrophysics Data System (ADS)

    Newby, M.; Franks, S. W.; White, C. J.

    2015-06-01

    The design of urban stormwater infrastructure is generally performed assuming that climate is static. For engineering practitioners, stormwater infrastructure is designed using a peak flow method, such as the Rational Method outlined in the Australian Rainfall and Runoff (AR&R) guidelines, together with estimates of design rainfall intensities. Changes to Australian rainfall intensity design criteria have been made through updated releases of the AR&R77, AR&R87 and the recent 2013 AR&R Intensity Frequency Distributions (IFDs). The primary focus of this study is to compare the three IFD sets from 51 locations Australia-wide. Since the release of the AR&R77 IFDs, the duration and number of locations for rainfall data have increased and techniques for data analysis have changed. Updated terminology coinciding with the 2013 IFD release has also resulted in a practical change to the design rainfall. For example, infrastructure designed for a 1:5 year ARI correlates with an 18.13% AEP; however, for practical purposes, hydraulic guidelines have been updated with the more intuitive 20% AEP. The evaluation of design rainfall variation across Australia has indicated that the changes depend upon location, recurrence interval and rainfall duration. The changes to design rainfall IFDs are due to the application of differing data analysis techniques, the length and number of data sets, and the change in terminology from ARI to AEP. Such changes mean that developed infrastructure has been designed to a range of different design criteria, indicating the likely inadequacy of earlier developments relative to current estimates of flood risk. In many cases, the under-design of infrastructure is greater than the expected impact of increased rainfall intensity under climate change scenarios.
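    The ARI/AEP correspondence quoted above (a 1:5 year ARI matching an 18.13% AEP) follows from assuming Poisson-distributed event arrivals:

```python
import math

def ari_to_aep(ari_years):
    """Annual exceedance probability implied by an average recurrence
    interval, for Poisson-distributed exceedances: AEP = 1 - exp(-1/ARI)."""
    return 1.0 - math.exp(-1.0 / ari_years)

print(f"{100 * ari_to_aep(5):.2f}%")  # 18.13%
```

    The gap between 18.13% and the rounded 20% AEP used in updated guidelines is exactly the "practical change to the design rainfall" the abstract notes.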

  4. Efficiency of using correlation function for estimation of probability of substance detection on the base of THz spectral dynamics

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Peskov, Nikolay V.; Kirillov, Dmitry A.

    2012-10-01

    One of the problems arising in time-domain THz spectroscopy for security applications is developing criteria for assessing the probability of detection and identification of explosives and drugs. We analyze the efficiency of using the correlation function and another functional (specifically, the spectral norm) for this aim. These criteria are applied to the dynamics of spectral lines. To increase the reliability of the assessment, we subtract the mean value of the THz signal over the analysis window, which removes the constant component from this part of the signal and thereby increases the contrast of the assessment. For finding the spectral line dynamics, we compare the application of the Fourier-Gabor transform with an unbounded (for example, Gaussian) window sliding along the signal against the application of the Fourier transform on short time intervals (FTST), in which the Fourier transform is applied to parts of the signal. These methods are similar; nevertheless, they differ in the sets of frequencies they use. It is important for practice that the optimal window shape depends on the method chosen for obtaining the spectral dynamics. The detection probability is enhanced if we can find a train of pulses with different frequencies following sequentially. We show that it is possible to recover clean spectral line dynamics even when the spectrum of the substance's response to the THz pulse is distorted.
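    The mean-subtracted correlation criterion described above amounts to a normalized cross-correlation between a measured spectral-dynamics trace and a reference; a minimal sketch:

```python
def detection_score(signal, reference):
    """Normalized correlation between a measured trace and a reference,
    after removing the mean (the constant component) from each."""
    def centered(x):
        mean = sum(x) / len(x)
        return [v - mean for v in x]
    s, r = centered(signal), centered(reference)
    num = sum(a * b for a, b in zip(s, r))
    den = (sum(a * a for a in s) * sum(b * b for b in r)) ** 0.5
    return num / den if den else 0.0

# A trace that is a scaled copy of the reference scores 1.0
print(round(detection_score([1, 3, 2, 5], [2, 6, 4, 10]), 6))  # 1.0
```

    Thresholding this score against reference traces for known substances is one way to turn the criterion into a detection decision.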

  5. Estimation of the detection probability for Yangtze finless porpoises (Neophocaena phocaenoides asiaeorientalis) with a passive acoustic method.

    PubMed

    Akamatsu, T; Wang, D; Wang, K; Li, S; Dong, S; Zhao, X; Barlow, J; Stewart, B S; Richlen, M

    2008-06-01

    Yangtze finless porpoises were surveyed by using simultaneous visual and acoustic methods from 6 November to 13 December 2006. Two research vessels towed stereo acoustic data loggers, which were used to store the intensity and sound source direction of the high frequency sonar signals produced by finless porpoises at detection ranges up to 300 m on each side of the vessel. Simple stereo beam forming allowed the separation of distinct biosonar sound sources, which enabled us to count the number of vocalizing porpoises. Acoustically, 204 porpoises were detected from one vessel and 199 from the other vessel in the same section of the Yangtze River. Visually, 163 and 162 porpoises were detected from the two vessels within 300 m of the vessel track. The calculated detection probability using the acoustic method was approximately twice that for visual detection for each vessel. The difference in detection probabilities between the two methods was caused by the large number of single individuals that were missed by visual observers. However, the sizes of large groups were underestimated by the acoustic methods. Acoustic and visual observations complemented each other in the accurate detection of porpoises. The use of simple, relatively inexpensive acoustic monitoring systems should enhance population surveys of free-ranging, echolocating odontocetes. PMID:18537391

  6. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  7. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    NASA Astrophysics Data System (ADS)

    Augustin, C. M.

    2015-12-01

    Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating the potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate the potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings - analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a

  9. Estimated probabilities, volumes, and inundation-area depths of potential postwildfire debris flows from Carbonate, Slate, Raspberry, and Milton Creeks, near Marble, Gunnison County, Colorado

    USGS Publications Warehouse

    Stevens, Michael R.; Flynn, Jennifer L.; Stephens, Verlin C.; Verdin, Kristine L.

    2011-01-01

    During 2009, the U.S. Geological Survey, in cooperation with Gunnison County, initiated a study to estimate the potential for postwildfire debris flows to occur in the drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble, Colorado. Currently (2010), these drainage basins are unburned but could be burned by a future wildfire. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of postwildfire debris-flow occurrence and debris-flow volumes for drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble. Data for the postwildfire debris-flow models included drainage basin area; area burned and burn severity; percentage of burned area; soil properties; rainfall total and intensity for the 5- and 25-year-recurrence, 1-hour-duration rainfall; and topographic and soil property characteristics of the drainage basins occupied by the four creeks. A quasi-two-dimensional floodplain computer model (FLO-2D) was used to estimate the spatial distribution and the maximum instantaneous depth of the postwildfire debris-flow material during debris flow on the existing debris-flow fans that issue from the outlets of the four major drainage basins. The postwildfire debris-flow probabilities at the outlet of each drainage basin range from 1 to 19 percent for the 5-year-recurrence, 1-hour-duration rainfall, and from 3 to 35 percent for the 25-year-recurrence, 1-hour-duration rainfall. The largest probabilities for postwildfire debris flow are estimated for Raspberry Creek (19 and 35 percent), whereas estimated debris-flow probabilities for the three other creeks range from 1 to 6 percent.
The estimated postwildfire debris-flow volumes at the outlet of each creek range from 7,500 to 101,000 cubic meters for the 5-year-recurrence, 1-hour-duration rainfall, and from 9,400 to 126,000 cubic meters for
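
    Empirical postwildfire debris-flow probability models of this kind are typically logistic regressions on basin and storm predictors. A minimal sketch of that functional form, with made-up predictors and coefficients (not the calibrated USGS model terms):

```python
import math

def debris_flow_probability(x, coefficients, intercept):
    """Logistic model: P = e^z / (1 + e^z), z = intercept + sum(b_i * x_i).

    The predictors in `x` (e.g., fraction burned at moderate/high severity,
    soil clay content in percent, storm rainfall intensity in mm/h) and the
    coefficients are purely illustrative placeholders.
    """
    z = intercept + sum(b * xi for b, xi in zip(coefficients, x))
    return math.exp(z) / (1.0 + math.exp(z))

# Hypothetical basin: 60% burned at moderate/high severity, 25-year storm
p = debris_flow_probability(x=[0.60, 12.0, 31.5],
                            coefficients=[6.5, 0.01, 0.02],
                            intercept=-5.0)
```

    With a positive coefficient on burned fraction, a more severely burned basin yields a higher debris-flow probability, which is the qualitative behavior the USGS models encode.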

  10. Stochastic estimates of exposure and cancer risk from carbon tetrachloride released to the air from the rocky flats plant.

    PubMed

    Rood, A S; McGavran, P D; Aanenson, J W; Till, J E

    2001-08-01

    Carbon tetrachloride is a degreasing agent that was used at the Rocky Flats Plant (RFP) in Colorado to clean product components and equipment. The chemical is considered a volatile organic compound and a probable human carcinogen. During the time the plant operated (1953-1989), most of the carbon tetrachloride was released to the atmosphere through building exhaust ducts. A smaller amount was released to the air via evaporation from open-air burn pits and ground-surface discharge points. Airborne releases from the plant were conservatively estimated to be equivalent to the amount of carbon tetrachloride consumed annually by the plant, which was estimated to be between 3.6 and 180 Mg per year. This assumption was supported by calculations that showed that most of the carbon tetrachloride discharged to the ground surface would subsequently be released to the atmosphere. Atmospheric transport of carbon tetrachloride from the plant to the surrounding community was estimated using a Gaussian Puff dispersion model (RATCHET). Time-integrated concentrations were estimated for nine hypothetical but realistic exposure scenarios that considered variation in lifestyle, location, age, and gender. Uncertainty distributions were developed for cancer slope factors and atmospheric dispersion factors. These uncertainties were propagated through to the final risk estimate using Monte Carlo techniques. The geometric mean risk estimates varied from 5.2 x 10(-6) for a hypothetical rancher or laborer working near the RFP to 3.4 x 10(-9) for an infant scenario. The distribution of incremental lifetime cancer incidence risk for the hypothetical rancher was between 1.3 x 10(-6) (5% value) and 2.1 x 10(-5) (95% value). These estimates are similar to or exceed estimated cancer risks posed by releases of radionuclides from the site. PMID:11726020
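
    The uncertainty propagation described here multiplies a time-integrated exposure by a cancer slope factor, with both treated as uncertain. A sketch of that Monte Carlo calculation, using illustrative lognormal parameters rather than the study's actual inputs:

```python
import math
import random

random.seed(1)

def lognormal_from_gm_gsd(gm, gsd):
    """Sample a lognormal given its geometric mean and geometric std. dev."""
    return random.lognormvariate(math.log(gm), math.log(gsd))

def simulate_risk(n=10_000):
    """Incremental lifetime risk = exposure x slope factor, propagated by
    Monte Carlo. GM/GSD values are placeholders, not the RFP study inputs."""
    risks = []
    for _ in range(n):
        exposure = lognormal_from_gm_gsd(gm=2.0e-3, gsd=2.5)   # mg/kg-day
        slope = lognormal_from_gm_gsd(gm=2.6e-3, gsd=1.8)      # per mg/kg-day
        risks.append(exposure * slope)
    risks.sort()
    gm_risk = math.exp(sum(math.log(r) for r in risks) / n)
    return gm_risk, risks[int(0.05 * n)], risks[int(0.95 * n)]

gm_risk, p5, p95 = simulate_risk()  # geometric mean, 5% and 95% values
```

    Reporting the geometric mean together with the 5% and 95% values mirrors how the abstract summarizes the rancher scenario's risk distribution.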

  11. State Estimates of Adolescent Cigarette Use and Perceptions of Risk of Smoking: 2012 and 2013

    MedlinePlus

    ... 2015 STATE ESTIMATES OF ADOLESCENT CIGARETTE USE AND PERCEPTIONS OF RISK OF SMOKING: 2012 AND 2013 AUTHORS ... with an inverse association between use and risk perceptions (i.e., the prevalence of use is lower ...

  12. Risk Estimates From an Online Risk Calculator Are More Believable and Recalled Better When Expressed as Integers

    PubMed Central

    Zikmund-Fisher, Brian J; Waters, Erika A; Gavaruzzi, Teresa; Fagerlin, Angela

    2011-01-01

    Background Online risk calculators offer different levels of precision in their risk estimates. People interpret numbers in varying ways depending on how they are presented, and we do not know how the number of decimal places displayed might influence perceptions of risk estimates. Objective The objective of our study was to determine whether precision (ie, number of decimals) in risk estimates offered by an online risk calculator influences users’ ratings of (1) how believable the estimate is, (2) risk magnitude (ie, how large or small the risk feels to them), and (3) how well they can recall the risk estimate after a brief delay. Methods We developed two mock risk calculator websites that offered hypothetical percentage estimates of participants’ lifetime risk of kidney cancer. Participants were randomly assigned to a condition where the risk estimate value rose with increasing precision (2, 2.1, 2.13, 2.133) or the risk estimate value fell with increasing precision (2, 1.9, 1.87, 1.867). Within each group, participants were randomly assigned one of the four numbers as their first risk estimate, and later received one of the remaining three as a comparison. Results Participants who completed the experiment (N = 3422) were a demographically diverse online sample, approximately representative of the US adult population on age, gender, and race. Participants whose risk estimates had no decimal places gave the highest ratings of believability (F(3,3384) = 2.94, P = .03) and the lowest ratings of risk magnitude (F(3,3384) = 4.70, P = .003). Compared to estimates with decimal places, integer estimates were judged as highly believable by 7%–10% more participants (χ2(3) = 17.8, P < .001). When comparing two risk estimates with different levels of precision, large majorities of participants reported that the numbers seemed equivalent across all measures. Both exact and approximate recall were highest for estimates with zero decimals. Odds ratios (OR) for correct

  13. ESTIMATED SIL LEVELS AND RISK COMPARISONS FOR RELIEF VALVES AS A FUNCTION OF TIME-IN-SERVICE

    SciTech Connect

    Harris, S.

    2012-03-26

    Risk-based inspection methods enable estimation of the probability of spring-operated relief valves failing on demand at the United States Department of Energy's Savannah River Site (SRS) in Aiken, South Carolina. The paper illustrates an approach based on application of the Frechet and Weibull distributions to SRS and Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD) proof test results. The methodology enables the estimation of ANSI/ISA-84.00.01 Safety Integrity Levels (SILs) as well as the potential change in SIL level due to modification of the maintenance schedule. Current SRS practices are reviewed and recommendations are made for extending inspection intervals. The paper compares risk-based inspection with specific SILs as maintenance intervals are adjusted. Groups of valves are identified in which maintenance times can be extended as well as different groups in which an increased safety margin may be needed.
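
    The core calculation pairs a fitted time-to-failure distribution with the ANSI/ISA-84 (IEC 61508) low-demand SIL bands: the average probability of failure on demand (PFDavg) over a proof-test interval determines the SIL. A sketch with hypothetical Weibull parameters (the SIL band boundaries are the standard ones, but eta and beta below are illustrative, not SRS/PERD estimates):

```python
import math

def weibull_cdf(t, eta, beta):
    """Probability the valve has failed (stuck) by time t in service."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def pfd_avg(test_interval, eta, beta, steps=10_000):
    """Average probability of failure on demand over one proof-test
    interval, by midpoint numerical integration of the failure CDF."""
    dt = test_interval / steps
    area = sum(weibull_cdf((i + 0.5) * dt, eta, beta) for i in range(steps)) * dt
    return area / test_interval

def sil_level(pfd):
    """Map PFDavg to a low-demand SIL band per IEC 61508 / ANSI/ISA-84."""
    for sil, lo, hi in [(4, 1e-5, 1e-4), (3, 1e-4, 1e-3),
                        (2, 1e-3, 1e-2), (1, 1e-2, 1e-1)]:
        if lo <= pfd < hi:
            return sil
    return 0  # outside the defined bands

# Hypothetical Weibull parameters (eta in years), 5-year proof-test interval
pfd = pfd_avg(test_interval=5.0, eta=120.0, beta=1.8)
sil = sil_level(pfd)
```

    Rerunning `pfd_avg` with a longer `test_interval` shows directly how extending the maintenance schedule degrades PFDavg and can drop the achieved SIL, which is the trade-off the paper examines.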

  14. Cancer risk estimation caused by radiation exposure during endovascular procedure

    NASA Astrophysics Data System (ADS)

    Kang, Y. H.; Cho, J. H.; Yun, W. S.; Park, K. H.; Kim, H. G.; Kwon, S. M.

    2014-05-01

    The objective of this study was to identify the radiation exposure dose of patients, as well as staff, caused by fluoroscopy during C-arm-assisted vascular surgical operations, and to estimate the carcinogenic risk due to such exposure. The study was conducted in 71 patients (53 men and 18 women) who had undergone vascular surgical intervention at the division of vascular surgery in the University Hospital from November of 2011 to April of 2012. A mobile C-arm device was used, and the radiation exposure dose of each patient was calculated as the dose-area product (DAP). The effective dose to staff participating in the surgery was measured by attaching optically stimulated luminescence dosimeters to their radiation protectors during the vascular surgical operations. The mean DAP value of patients was 308.7 Gy cm2, and the maximum value was 3085 Gy cm2. Converted to effective dose, the mean was 6.2 mSv and the maximum was 61.7 millisievert (mSv). The effective dose to staff was 3.85 mSv; the radiation technician received 1.04 mSv and the nurse 1.31 mSv. The operator's estimated all-cancer incidence corresponds to 2355 per 100,000 persons, meaning that about 1 in 42 is likely to develop cancer. In conclusion, vascular surgeons, as supervisors of fluoroscopy, should keep radiation protection for patients, staff, and all participants in the intervention in mind, and should themselves understand the effects of radiation in order to prevent invisible danger during the intervention and to minimize harm.
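
    Effective dose is commonly approximated from DAP with a conversion coefficient, E = k x DAP. A sketch that backs out the coefficient implied by the abstract's reported means (the resulting k describes these numbers only; it is not a published tabulated coefficient):

```python
# Using the study's reported means: DAP = 308.7 Gy*cm^2, E = 6.2 mSv.
mean_dap_gycm2 = 308.7
mean_effective_dose_msv = 6.2

# Implied conversion coefficient in mSv per Gy*cm^2
k = mean_effective_dose_msv / mean_dap_gycm2

def effective_dose_msv(dap_gycm2, coeff=k):
    """Linear DAP-to-effective-dose approximation, E = k * DAP."""
    return coeff * dap_gycm2

# Applying it to the abstract's maximum DAP recovers roughly the
# reported maximum effective dose (~62 mSv vs. the stated 61.7 mSv,
# the small gap being rounding in the reported means).
max_dose = effective_dose_msv(3085.0)
```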

  15. The 2006 William Feinberg lecture: shifting the paradigm from stroke to global vascular risk estimation.

    PubMed

    Sacco, Ralph L

    2007-06-01

    By the year 2010, it is estimated that 18.1 million people worldwide will die annually because of cardiovascular diseases and stroke. "Global vascular risk" more broadly includes the multiple overlapping disease silos of stroke, myocardial infarction, peripheral arterial disease, and vascular death. Estimation of global vascular risk requires consideration of a variety of variables including demographics, environmental behaviors, and risk factors. Data from multiple studies suggest continuous linear relationships between the physiological vascular risk modulators of blood pressure, lipids, and blood glucose rather than treating these conditions as categorical risk factors. Constellations of risk factors may be more relevant than individual categorical components. Exciting work with novel risk factors may also have predictive value in estimates of global vascular risk. Advances in imaging have led to the measurement of subclinical conditions such as carotid intima-media thickness and subclinical brain conditions such as white matter hyperintensities and silent infarcts. These subclinical measurements may be intermediate stages in the transition from asymptomatic to symptomatic vascular events, appear to be associated with the fundamental vascular risk factors, and represent opportunities to more precisely quantitate disease progression. The expansion of studies in molecular epidemiology and detection of genetic markers underlying vascular risks also promises to extend our precision of global vascular risk estimation. Global vascular risk estimation will require quantitative methods that bundle these multi-dimensional data into more precise estimates of future risk. The power of genetic information coupled with data on demographics, risk-inducing behaviors, vascular risk modulators, biomarkers, and measures of subclinical conditions should provide the most realistic approximation of an individual's future global vascular risk. The ultimate public health benefit

  16. Estimation of Wild Fire Risk Area based on Climate and Maximum Entropy in Korean Peninsular

    NASA Astrophysics Data System (ADS)

    Kim, T.; Lim, C. H.; Song, C.; Lee, W. K.

    2015-12-01

    The number of forest fires, and the human injuries and physical damage they cause, has increased with frequent drought. In this study, the forest fire danger zones of Korea are estimated to predict and prepare for future forest fire hazard regions. The MaxEnt (Maximum Entropy) model, which estimates the probability distribution of occurrence, is used to delineate the forest fire hazard regions. The MaxEnt model was developed primarily for the analysis of species distributions, but its applicability to various natural disasters is gaining recognition. Detailed forest fire occurrence data collected by MODIS over the past 5 years (2010-2014) are used as occurrence data for the model, and meteorology, topography, and vegetation data are used as environmental variables. In particular, various meteorological variables are used to assess the impact of climate, such as annual average temperature, annual precipitation, precipitation of the dry season, annual effective humidity, effective humidity of the dry season, and the aridity index. The result was valid based on the AUC (Area Under the Curve) value (= 0.805), which is used to assess prediction accuracy in the MaxEnt model, and the predicted forest fire locations corresponded closely with the actual forest fire distribution map. Meteorological variables such as effective humidity showed the greatest contribution, and topographic variables such as TWI (Topographic Wetness Index) and slope also contributed to forest fire occurrence. As a result, the east coast and the southern part of the Korean peninsula were predicted to have high forest fire risk, whereas high-altitude mountain areas and the west coast appeared to be safe from forest fire. These results are similar to those of former studies, indicating high risk of forest fire in accessible areas, and reflect the climatic characteristics of the east and south in the dry season. To sum up, we estimated the forest fire hazard zone with existing forest fire locations and environment variables and had
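
    The AUC used here to validate the MaxEnt prediction is equivalent to the rank-based probability that a randomly chosen fire location scores higher than a randomly chosen background location. A minimal sketch of that computation with toy suitability scores (the values are illustrative, not the study's data):

```python
def auc(positive_scores, negative_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a random positive outranks a random negative,
    counting ties as one half."""
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positive_scores) * len(negative_scores))

# Toy habitat-suitability scores at fire (positive) and background points
fire = [0.91, 0.83, 0.78, 0.66, 0.60]
background = [0.72, 0.55, 0.49, 0.41, 0.30]
score = auc(fire, background)  # 1.0 = perfect ranking, 0.5 = random
```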

  17. A Review of Mycotoxins in Food and Feed Products in Portugal and Estimation of Probable Daily Intakes.

    PubMed

    Abrunhosa, Luís; Morales, Héctor; Soares, Célia; Calado, Thalita; Vila-Chã, Ana Sofia; Pereira, Martinha; Venâncio, Armando

    2016-01-01

    Mycotoxins are toxic secondary metabolites produced by filamentous fungi that occur naturally in agricultural commodities worldwide. Aflatoxins, ochratoxin A, patulin, fumonisins, zearalenone, trichothecenes, and ergot alkaloids are presently the most important for food and feed safety. These compounds are produced by several species that belong to the Aspergillus, Penicillium, Fusarium, and Claviceps genera and can be carcinogenic, mutagenic, teratogenic, cytotoxic, neurotoxic, nephrotoxic, estrogenic, and immunosuppressant. Human and animal exposure to mycotoxins is generally assessed by taking into account data on the occurrence of mycotoxins in food and feed as well as data on the consumption patterns of the concerned population. This evaluation is crucial to support measures to reduce consumer exposure to mycotoxins. This work reviews the occurrence and levels of mycotoxins in Portuguese food and feed to provide a global overview of this issue in Portugal. With the information collected, the exposure of the Portuguese population to those mycotoxins is assessed, and the estimated dietary intakes are presented. PMID:24987806
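
    Dietary-exposure assessments of this kind combine occurrence levels with consumption data via the standard intake formula, PDI = contamination x consumption / body weight. A sketch with illustrative numbers (not the Portuguese survey data):

```python
def probable_daily_intake(contamination_ug_per_kg, daily_intake_g,
                          body_weight_kg):
    """Probable daily intake of a mycotoxin, in ng per kg body weight per
    day: occurrence level x food consumption / body weight."""
    intake_ng = contamination_ug_per_kg * 1000.0 * (daily_intake_g / 1000.0)
    return intake_ng / body_weight_kg

# Hypothetical: 0.5 ug/kg ochratoxin A in bread, 100 g bread/day, 70 kg adult
pdi = probable_daily_intake(0.5, 100.0, 70.0)
```

    The resulting PDI is what gets compared against a tolerable daily intake to judge whether exposure-reduction measures are warranted.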

  18. Latent-failure risk estimates for computer control

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.

  19. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm from Marijuana Use: 2013 and 2014

    MedlinePlus

    ... 2014 estimates to 2012–2013 estimates). However, youth perceptions of great risk of harm from monthly marijuana ... change. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm From Marijuana Use: 2013 ...

  20. Nonparametric Estimation of the Probability of Detection of Flaws in an Industrial Component, from Destructive and Nondestructive Testing Data, Using Approximate Bayesian Computation.

    PubMed

    Keller, Merlin; Popelin, Anne-Laure; Bousquet, Nicolas; Remy, Emmanuel

    2015-09-01

    We consider the problem of estimating the probability of detection (POD) of flaws in an industrial steel component. Modeled as an increasing function of the flaw height, the POD characterizes the detection process; it is also involved in the estimation of the flaw size distribution, a key input parameter of physical models describing the behavior of the steel component when submitted to extreme thermodynamic loads. Such models are used to assess the resistance of highly reliable systems whose failures are seldom observed in practice. We develop a Bayesian method to estimate the flaw size distribution and the POD function, using flaw height measures from periodic in-service inspections conducted with an ultrasonic detection device, together with measures from destructive lab experiments. Our approach, based on approximate Bayesian computation (ABC) techniques, is applied to a real data set and compared to maximum likelihood estimation (MLE) and a more classical approach based on Markov Chain Monte Carlo (MCMC) techniques. In particular, we show that the parametric model describing the POD as the cumulative distribution function (cdf) of a log-normal distribution, though often used in this context, can be invalidated by the data at hand. We propose an alternative nonparametric model, which assumes no predefined shape, and extend the ABC framework to this setting. Experimental results demonstrate the ability of this method to provide a flexible estimation of the POD function and describe its uncertainty accurately. PMID:26414699
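
    The ABC idea used here can be shown in miniature with rejection sampling: draw a POD parameter from its prior, simulate inspection outcomes, and keep draws whose summary statistic is close to the observed one. Everything below (the saturating-exponential POD form, the uniform prior, the toy flaw data) is an illustrative assumption, not the paper's model:

```python
import math
import random

random.seed(7)

def pod(h, lam):
    """Assumed POD model: probability of detecting a flaw of height h (mm);
    a simple increasing saturating-exponential curve."""
    return 1.0 - math.exp(-h / lam)

def abc_posterior(observed_hits, flaw_heights, n_draws=20_000, tol=2):
    """ABC rejection sampling: draw lam from a uniform prior, simulate
    inspection outcomes, and accept draws whose simulated detection count
    is within `tol` of the observed count."""
    accepted = []
    for _ in range(n_draws):
        lam = random.uniform(0.5, 10.0)          # prior on the POD scale
        hits = sum(random.random() < pod(h, lam) for h in flaw_heights)
        if abs(hits - observed_hits) <= tol:
            accepted.append(lam)
    return accepted

# Toy data: 12 flaws of known height (mm), 8 detected by the device
heights = [0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0]
posterior = abc_posterior(observed_hits=8, flaw_heights=heights)
posterior_mean = sum(posterior) / len(posterior)
```

    The paper's nonparametric extension replaces the single-parameter curve with a shape-free POD, but the accept/reject mechanics are the same.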

  1. Indoor radon and lung cancer. Estimating the risks.

    PubMed Central

    Samet, J. M.

    1992-01-01

    Radon is ubiquitous in indoor environments. Epidemiologic studies of underground miners with exposure to radon and experimental evidence have established that radon causes lung cancer. The finding that this naturally occurring carcinogen is present in the air of homes and other buildings has raised concern about the lung cancer risk to the general population from radon. I review current approaches for assessing the risk of indoor radon, emphasizing the extrapolation of the risks for miners to the general population. Although uncertainties are inherent in this risk assessment, the present evidence warrants identifying homes that have unacceptably high concentrations. PMID:1734594

  2. Occurrence probability of slopes on the lunar surface: Estimate by the shaded area percentage in the LROC NAC images

    NASA Astrophysics Data System (ADS)

    Abdrakhimov, A. M.; Basilevsky, A. T.; Ivanov, M. A.; Kokhanov, A. A.; Karachevtseva, I. P.; Head, J. W.

    2015-09-01

    The paper describes the method of estimating the distribution of slopes by the portion of shaded areas measured in the images acquired at different Sun elevations. The measurements were performed for the benefit of the Luna-Glob Russian mission. The western ellipse for the spacecraft landing in the crater Boguslawsky in the southern polar region of the Moon was investigated. The percentage of the shaded area was measured in the images acquired with the LROC NAC camera with a resolution of ~0.5 m. Due to the close vicinity of the pole, it is difficult to build digital terrain models (DTMs) for this region from the LROC NAC images. Because of this, the method described has been suggested. For the landing ellipse investigated, 52 LROC NAC images obtained at Sun elevations from 4° to 19° were used. In these images the shaded portions of the area were measured, and the values of these portions were converted to the occurrence of slopes (in this case, at the 3.5-m baseline) with calibration against the surface characteristics of the Lunokhod-1 study area. For this area, a digital terrain model of ~0.5-m resolution and 13 LROC NAC images obtained at different elevations of the Sun are available. From the results of measurements and the corresponding calibration, it was found that, in the studied landing ellipse, the occurrence of slopes gentler than 10° at the baseline of 3.5 m is 90%, while it is 9.6, 5.7, and 3.9% for the slopes steeper than 10°, 15°, and 20°, respectively. Obviously, this method can be recommended for application if there is no DTM of required granularity for the regions of interest, but there are high-resolution images taken at different elevations of the Sun.
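
    The measurement step, determining the shaded portion of an image, amounts to counting pixels below a brightness threshold. A toy sketch of that step (the patch values and threshold are illustrative; the paper's per-image thresholding and calibration are not reproduced here):

```python
def shaded_fraction(image, threshold):
    """Fraction of pixels darker than `threshold` in a grayscale image,
    represented as nested lists of brightness values."""
    pixels = [v for row in image for v in row]
    return sum(v < threshold for v in pixels) / len(pixels)

# Toy 4x4 brightness patch; exactly half the pixels fall in shadow
patch = [
    [200, 180,  40,  35],
    [190, 170,  30,  25],
    [185,  60,  20,  15],
    [180, 175, 160,  10],
]
frac = shaded_fraction(patch, threshold=100)
```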

  3. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    PubMed

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. PMID:26688561
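
    For the lognormal candidate, the maximum likelihood fit has a closed form (mu and sigma are the mean and standard deviation of log incubation times), and AIC = 2k - 2 ln L ranks it against the other distributions. A sketch with toy incubation data, not the Tokyo case records:

```python
import math

def fit_lognormal(data):
    """Closed-form MLE for a lognormal distribution, returning
    (mu, sigma, log-likelihood)."""
    logs = [math.log(x) for x in data]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
    loglik = sum(
        -math.log(x) - math.log(sigma) - 0.5 * math.log(2 * math.pi)
        - (math.log(x) - mu) ** 2 / (2 * sigma ** 2)
        for x in data)
    return mu, sigma, loglik

def aic(loglik, k):
    """Akaike information criterion for a k-parameter model."""
    return 2 * k - 2 * loglik

# Toy incubation periods in days (illustrative only)
days = [12, 15, 18, 20, 22, 25, 27, 30, 33, 38, 45, 60, 75, 90]
mu, sigma, ll = fit_lognormal(days)
mean_days = math.exp(mu + sigma ** 2 / 2)  # lognormal mean
model_aic = aic(ll, k=2)
```

    Fitting the gamma and Weibull candidates requires numerical optimization rather than a closed form, but each yields a log-likelihood that plugs into the same AIC comparison, which is how the study selected the lognormal.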

  4. CCSI Risk Estimation: An Application of Expert Elicitation

    SciTech Connect

    Engel, David W.; Dalton, Angela C.

    2012-10-01

    The Carbon Capture Simulation Initiative (CCSI) is a multi-laboratory simulation-driven effort to develop carbon capture technologies with the goal of accelerating commercialization and adoption in the near future. One of the key CCSI technical challenges is representing and quantifying the inherent uncertainty and risks associated with developing, testing, and deploying the technology in simulated and real operational settings. To address this challenge, the CCSI Element 7 team developed a holistic risk analysis and decision-making framework. The purpose of this report is to document the CCSI Element 7 structured systematic expert elicitation to identify additional risk factors. We review the significance of and established approaches to expert elicitation, describe the CCSI risk elicitation plan and implementation strategies, and conclude by discussing the next steps and highlighting the contribution of risk elicitation toward the achievement of the overarching CCSI objectives.

  5. NEED FOR INDIVIDUAL CANCER RISK ESTIMATES IN X-RAY AND NUCLEAR MEDICINE IMAGING.

    PubMed

    Mattsson, Sören

    2016-06-01

    To facilitate the justification of an X-ray or nuclear medicine investigation and for informing patients, it is desirable that the individual patient's radiation dose and potential cancer risk can be prospectively assessed and documented. The current dose-reporting is based on effective dose, which ignores body size and does not reflect the strong dependence of risk on the age at exposure. Risk estimations should better be done through individual organ dose assessments, which need careful exposure characterisation as well as anatomical description of the individual patient. In nuclear medicine, reference biokinetic models should also be replaced with models describing individual physiological states and biokinetics. There is a need to adjust population-based cancer risk estimates to the possible risk of leukaemia and solid tumours for the individual depending on age and gender. The article summarises reasons for individual cancer risk estimates and gives examples of methods and results of such estimates. PMID:26994092

  6. What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression

    PubMed Central

    Kleinman, Lawrence C; Norton, Edward C

    2009-01-01

    Objective To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common. Study Design Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR. Data Collection Data sets produced using Monte Carlo simulations. Principal Findings Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases. Conclusions Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control studies, particularly when outcomes are common or effect size is large. PMID:18793213
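
    The central step of regression risk analysis is marginal standardization: predict every subject's risk from the fitted logistic model with exposure set to 1 and then to 0, average both sets of predictions, and contrast the averages. A sketch of that step with hypothetical coefficients (not estimates from any fitted model; standard-error computation is omitted):

```python
import math

def predicted_prob(intercept, b_exposure, b_age, exposed, age):
    """Logistic-model probability for one subject."""
    z = intercept + b_exposure * exposed + b_age * age
    return 1.0 / (1.0 + math.exp(-z))

def adjusted_risk_measures(ages, intercept, b_exposure, b_age):
    """Average predicted risk with exposure forced to 1 and to 0 for
    every subject, then return the adjusted risk ratio and difference."""
    n = len(ages)
    risk1 = sum(predicted_prob(intercept, b_exposure, b_age, 1, a)
                for a in ages) / n
    risk0 = sum(predicted_prob(intercept, b_exposure, b_age, 0, a)
                for a in ages) / n
    return risk1 / risk0, risk1 - risk0

# Hypothetical cohort ages and coefficients (age as the sole confounder)
ages = [25, 30, 35, 40, 45, 50, 55, 60, 65, 70]
arr, ard = adjusted_risk_measures(ages, intercept=-4.0,
                                  b_exposure=0.9, b_age=0.04)
```

    Note that the resulting adjusted risk ratio is smaller than exp(0.9), the adjusted odds ratio, illustrating the divergence between OR and RR that the paper's method avoids.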

  7. Multi-scale occupancy approach to estimate Toxoplasma gondii prevalence and detection probability in tissues: an application and guide for field sampling.

    PubMed

    Elmore, Stacey A; Huyvaert, Kathryn P; Bailey, Larissa L; Iqbal, Asma; Su, Chunlei; Dixon, Brent R; Alisauskas, Ray T; Gajadhar, Alvin A; Jenkins, Emily J

    2016-08-01

    Increasingly, birds are recognised as important hosts for the ubiquitous parasite Toxoplasma gondii, although little experimental evidence exists to determine which tissues should be tested to maximise the detection probability of T. gondii. Also, Arctic-nesting geese are suspected to be important sources of T. gondii in terrestrial Arctic ecosystems, but the parasite has not previously been reported in the tissues of these geese. Using a domestic goose model, we applied a multi-scale occupancy framework to demonstrate that the probability of detection of T. gondii was highest in the brain (0.689, 95% confidence interval=0.486, 0.839) and the heart (0.809, 95% confidence interval=0.693, 0.888). Inoculated geese had an estimated T. gondii infection probability of 0.849, (95% confidence interval=0.643, 0.946), highlighting uncertainty in the system, even under experimental conditions. Guided by these results, we tested the brains and hearts of wild Ross's Geese (Chen rossii, n=50) and Lesser Snow Geese (Chen caerulescens, n=50) from Karrak Lake, Nunavut, Canada. We detected 51 suspected positive tissue samples from 33 wild geese using real-time PCR with melt-curve analysis. The wild goose prevalence estimates generated by our multi-scale occupancy analysis were higher than the naïve estimates of prevalence, indicating that multiple PCR repetitions on the same organs and testing more than one organ could improve T. gondii detection. Genetic characterisation revealed Type III T. gondii alleles in six wild geese and Sarcocystis spp. in 25 samples. Our study demonstrates that Arctic nesting geese are capable of harbouring T. gondii in their tissues and could transport the parasite from their southern overwintering grounds into the Arctic region. We demonstrate how a multi-scale occupancy framework can be used in a domestic animal model to guide resource-limited sample collection and tissue analysis in wildlife. 
Secondly, we confirm the value of traditional occupancy in
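
    The abstract's estimates can be combined to show why testing both organs helps: under the simplifying assumption that brain and heart detections are independent given infection (a reduction of the full multi-scale model), the probability of at least one detection per infected bird follows directly.

```python
# Point estimates from the abstract (inoculated domestic geese)
psi = 0.849        # probability a goose is infected
p_brain = 0.689    # detection probability in brain tissue
p_heart = 0.809    # detection probability in heart tissue

# At least one positive among the two tissues, given infection
p_detect_given_infected = 1.0 - (1.0 - p_brain) * (1.0 - p_heart)

# Unconditional probability that a sampled bird tests positive
p_positive_bird = psi * p_detect_given_infected
```

    Testing both organs pushes the per-infected-bird detection probability above either single-organ value, which is why naive single-organ prevalence estimates understate true prevalence.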

  8. Estimating reach-specific fish movement probabilities in rivers with a Bayesian state-space model: application to sea lamprey passage and capture at dams

    USGS Publications Warehouse

    Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.

    2014-01-01

    Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21 828 – 29 300 adult sea lampreys in the river, 0%–2%, or 0–514 untagged lampreys, could have passed upstream of the dam, and 46%–61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%–96% of the population was vulnerable to existing traps. However, only 52%–69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.

  9. A technique for estimating the probability of radiation-stimulated failures of integrated microcircuits in low-intensity radiation fields: Application to the Spektr-R spacecraft

    NASA Astrophysics Data System (ADS)

    Popov, V. D.; Khamidullina, N. M.

    2006-10-01

    In developing the radio-electronic devices (RED) of spacecraft operating in the ionizing-radiation fields of space, one of the most important problems is the correct estimation of their radiation tolerance. The “weakest link” in the element base of onboard microelectronic devices under radiation effects is the integrated microcircuit (IMC), especially at large-scale (LSI) and very-large-scale (VLSI) degrees of integration. The main characteristic of an IMC, which is taken into account when deciding whether to use a particular type of IMC in onboard RED, is its probability of non-failure operation (NFO) at the end of the spacecraft’s lifetime. It should be noted that, until now, the NFO has been calculated only from reliability characteristics, disregarding radiation effects. This paper presents a so-called “reliability” approach to determining the radiation tolerance of IMC, which allows one to estimate the probability of non-failure operation of various types of IMC with due account of radiation-stimulated dose failures. The described technique is applied to the RED onboard the Spektr-R spacecraft to be launched in 2007.

  10. Workshop on models to estimate military system probability of effect (P/sub E/) due to incident radiofrequency energy: Volume I, Findings and recommendations

    SciTech Connect

    Cabayan, H.S.

    1988-01-01

    The workshop on Models to Estimate Military System P_E (probability of effect) due to Incident Radio Frequency (RF) Energy was convened by Dr. John M. MacCallum, OUSDA (R and AT/EST), to assess the current state of the art and to evaluate the adequacy of ongoing effects-assessment efforts to estimate P_E. Approximately fifty people from government, industry, and academia attended the meeting. Specifically, the workshop addressed the following: the current status of operations research models for assessing probability of effect (P_E) for red and blue mission analyses; the main overall approaches for evaluating P_E; sources of uncertainty and ways P_E could be credibly derived from the existing database; and the adequacy of the present framework of a national HPM assessment methodology for credibly evaluating P_E for future systems. Military operations research (MOR) analyses need to support current and future high-power RF device development and operational employment evaluations. USDA (R and AT/EST) sponsored this workshop in an effort to assess MOR's current capability and maturity. Participants included service, OSD, national laboratory, contractor, and academic experts and practitioners in this emerging technology area. A summary of major findings and recommendations follows.

  11. Using probability-based spatial estimation of the river pollution index to assess urban water recreational quality in the Tamsui River watershed.

    PubMed

    Jang, Cheng-Shin

    2016-01-01

    The Tamsui River watershed situated in Northern Taiwan provides a variety of water recreational opportunities such as riverbank park activities, fishing, cruising, rowing, sailing, and swimming. However, river water quality strongly affects water recreational quality. Moreover, the health of recreationists who are partially or fully exposed to polluted river water may be jeopardized. A river pollution index (RPI) composed of dissolved oxygen, biochemical oxygen demand, suspended solids, and ammonia nitrogen is typically used to gauge the river water quality and regulate the water body use in Taiwan. The purpose of this study was to probabilistically determine the RPI categories in the Tamsui River watershed and to assess the urban water recreational quality on the basis of the estimated RPI categories. First, according to various RPI categories, one-dimensional indicator kriging (IK) was adopted to estimate the occurrence probabilities of the RPI categories. The maximum occurrence probability among the categories was then employed to determine the most suitable RPI category. Finally, the most serious categories and seasonal variations of RPI were adopted to evaluate the quality of current water recreational opportunities in the Tamsui River watershed. The results revealed that the midstream and downstream sections of the Tamsui River and its tributaries with poor river water quality afford low water recreational quality, and water recreationists should avoid full or limited exposure to these bodies of water. However, the upstream sections of the Tamsui River watershed with high river water quality are suitable for all water recreational activities. PMID:26676412
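
    The category-selection step, taking the maximum occurrence probability over RPI categories at each location, can be sketched as below. The probabilities are illustrative placeholders, not kriged estimates from the study:

```python
# hypothetical occurrence probabilities of RPI categories at three sites, as
# one-dimensional indicator kriging would estimate them (each row sums to 1)
sites = {
    "upstream":   {"non/slightly": 0.70, "lightly": 0.20, "moderately": 0.08, "severely": 0.02},
    "midstream":  {"non/slightly": 0.10, "lightly": 0.25, "moderately": 0.40, "severely": 0.25},
    "downstream": {"non/slightly": 0.05, "lightly": 0.15, "moderately": 0.35, "severely": 0.45},
}

def most_probable_category(probs):
    """Return the RPI pollution category with the maximum occurrence probability."""
    return max(probs, key=probs.get)

for site, probs in sites.items():
    print(site, "->", most_probable_category(probs))
```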

  12. Risk estimates for radiation-induced cancer--the epidemiological evidence.

    PubMed

    Kellerer, A M

    2000-03-01

    The risk of low-dose radiation exposures has, for a variety of reasons, been highly politicised. This has led to a frequently exaggerated perception of the potential health effects and to lasting public controversies. A balanced view requires a critical reassessment of the epidemiological basis of current assumptions. Reliable quantitative information is available on the increase of cancer rates due to moderate and high doses. This provides a firm basis for the derivation of probabilities of causation, e.g. after high radiation exposures. For small doses or dose rates, the situation is entirely different: potential increases of cancer rates remain hidden below the statistical fluctuations of normal rates, and the molecular mechanisms of carcinogenesis are not sufficiently well known to allow numerical predictions. Risk coefficients for radiation protection must, therefore, be based on the uncertain extrapolation of observations obtained at moderate or high doses. While extrapolation is arbitrary, it is nevertheless used, mostly with the conservative assumption of a linear dose dependence with no threshold (the LNT model). All risk estimates are based on this hypothesis; they are thus virtual guidelines rather than firm numbers. The observations on the A-bomb survivors are still the major source of information on the health effects of comparatively small radiation doses. A fairly direct inspection of the data shows that the solid-cancer mortality data of the A-bomb survivors are equally consistent with linearity in dose and with reduced effectiveness at low doses. In the leukemia data a reduction is strongly indicated. With one notable exception (leukemia after prenatal exposure), these observations are in line with a multitude of observations in groups of persons exposed for medical reasons. The low-dose effects of densely ionizing radiations, such as alpha-particles from radon decay products or high-energy neutrons, are a separate important issue.
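
    The probabilities of causation mentioned above are conventionally computed from the excess relative risk (ERR); a minimal sketch assuming a linear ERR model, with an illustrative coefficient that is not from this article:

```python
def probability_of_causation(dose_sv, err_per_sv):
    """PC = ERR / (1 + ERR), here with a linear no-threshold ERR = beta * dose."""
    err = err_per_sv * dose_sv
    return err / (1.0 + err)

# illustrative ERR coefficient of 0.5 per Sv (assumed for demonstration only)
for dose in (0.1, 0.5, 1.0):
    print(f"{dose} Sv -> PC = {probability_of_causation(dose, 0.5):.3f}")
```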

  13. A computer procedure to analyze seismic data to estimate outcome probabilities in oil exploration, with an initial application in the tabasco region of southeastern Mexico

    NASA Astrophysics Data System (ADS)

    Berlanga, Juan M.; Harbaugh, John W.

    the basis of frequency distributions of trend-surface residuals obtained by fitting and subtracting polynomial trend surfaces from the machine-contoured reflection time maps. We found that there is a strong preferential relationship between the occurrence of petroleum (i.e. its presence versus absence) and particular ranges of trend-surface residual values. An estimate of the probability of oil occurring at any particular geographic point can be calculated on the basis of the estimated trend-surface residual value. This estimate, however, must be tempered by the probable error in the estimate of the residual value provided by the error function. The result, we believe, is a simple but effective procedure for estimating exploration outcome probabilities where seismic data provide the principal form of information in advance of drilling. Implicit in this approach is the comparison between a maturely explored area, for which both seismic and production data are available, and which serves as a statistical "training area", with the "target" area which is undergoing exploration and for which probability forecasts are to be calculated.
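
    The core mapping, from trend-surface residual ranges to empirical outcome probabilities learned in a training area, can be sketched as a binned lookup. All residuals and outcomes below are invented for illustration:

```python
import numpy as np

# hypothetical training-area data: trend-surface residuals of reflection time
# at drilled locations, and whether oil was found there
residuals = np.array([-42.0, -35.0, -18.0, -12.0, -5.0, 3.0, 9.0, 16.0, 27.0, 38.0])
oil       = np.array([    1,     1,     1,     0,    0,   0,   0,    0,    1,    1])

bin_edges = np.array([-50.0, -25.0, 0.0, 25.0, 50.0])
train_bins = np.digitize(residuals, bin_edges)

def oil_probability(residual):
    """Empirical P(oil | residual bin) estimated from the training area."""
    b = np.digitize([residual], bin_edges)[0]
    mask = train_bins == b
    return float(oil[mask].mean()) if mask.any() else float("nan")

print(oil_probability(-30.0), oil_probability(10.0))
```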

  14. Comparing an estimate of seabirds at risk to a mortality estimate from the November 2004 Terra Nova FPSO oil spill.

    PubMed

    Wilhelm, Sabina I; Robertson, Gregory J; Ryan, Pierre C; Schneider, David C

    2007-05-01

    On 21 November 2004, about 1000 barrels of crude oil were accidentally released from the Terra Nova FPSO (floating production, storage and offloading vessel) onto the Grand Banks, approximately 340 km east-southeast of St. John's, Newfoundland. We estimated the number of vulnerable seabirds (murres (Uria spp.) and dovekies (Alle alle)) at risk from this incident by multiplying observed densities of seabirds by the total area covered by the slick, estimated at 793 km². Mean densities of 3.46 murres/km² and 1.07 dovekies/km² on the sea surface were recorded during vessel-based surveys on 28 and 29 November 2004, with mean densities of 6.90 murres/km² and 13.43 dovekies/km² when combining birds on the sea and in flight. We calculated that a mean of 9858 murres and dovekies were at risk of being oiled, with estimates ranging from 3593 to 16,122 depending on what portion of birds in flight were assumed to be at risk. A mortality model based on spill volume, derived independently of the risk model, estimated that 4688 (95% CI: 1905-12,480) birds were killed during this incident. A low mortality estimate based strictly on spill volume would be expected for this incident, which occurred in an area of relatively high seabird densities. Given that the risk and mortality estimates are statistically indistinguishable, we estimate that on the order of 10,000 birds were killed by the Terra Nova spill. PMID:17328926
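
    The at-risk range can be reproduced directly from the reported densities and slick area; a back-of-envelope check using only numbers quoted in the abstract:

```python
area_km2 = 793.0  # estimated area covered by the slick

# mean densities (birds/km^2) from the vessel-based surveys
on_water    = {"murres": 3.46, "dovekies": 1.07}   # sea surface only
with_fliers = {"murres": 6.90, "dovekies": 13.43}  # sea surface plus in flight

low  = area_km2 * sum(on_water.values())     # assume no birds in flight at risk
high = area_km2 * sum(with_fliers.values())  # assume all birds in flight at risk

# ~3592 and ~16122; the paper reports 3593 to 16,122 (small differences are
# rounding in the published mean densities)
print(round(low), round(high))
```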

  15. Space Radiation Cancer, Circulatory Disease and CNS Risks for Near Earth Asteroid and Mars Missions: Uncertainty Estimates for Never-Smokers

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori J.; Wang, Minli; Kim, Myung-Hee

    2011-01-01

    The uncertainties in estimating the health risks from galactic cosmic rays (GCR) and solar particle events (SPE) are a major limitation on the length of space missions and the evaluation of potential risk mitigation approaches. NASA limits astronaut exposures to a 3% risk of exposure-induced cancer death (REID), and protects against uncertainties in risk projections using an assessment of 95% confidence intervals after propagating the error from all model factors (environment and organ exposure, risk coefficients, dose-rate modifiers, and quality factors). Because there are potentially significant late mortality risks from diseases of the circulatory system and central nervous system (CNS), which are less well defined than cancer risks, the cancer REID limit is not necessarily conservative. In this report, we discuss estimates of lifetime risks from space radiation and describe new estimates of model uncertainties. The key updates to the NASA risk projection model are: 1) revised values for low-LET risk coefficients for tissue-specific cancer incidence, with incidence rates transported to an average U.S. population to estimate the probability of risk of exposure-induced cancer (REIC) and REID; 2) an analysis of smoking-attributable cancer risks for never-smokers, which shows significantly reduced lung cancer risk, as well as overall cancer risks from radiation, compared with risks estimated for the average U.S. population; 3) derivation of track-structure-based quality functions that depend on particle fluence, charge number Z, and kinetic energy E; 4) the assignment of a smaller maximum in the quality function for leukemia than for solid cancers; 5) a demonstration that the ICRP tissue weights over-estimate cancer risks from SPEs by a factor of 2 or more, so that summing cancer risks for each tissue is recommended as a more accurate approach to estimating SPE cancer risks; and 6) additional considerations on circulatory and CNS disease risks.

  16. Biomechanical Risk Estimates for Mild Traumatic Brain Injury

    PubMed Central

    Funk, J. R.; Duma, S. M.; Manoogian, S. J.; Rowson, S.

    2007-01-01

    The objective of this study was to characterize the risk of mild traumatic brain injury (MTBI) in living humans based on a large set of head impact data taken from American football players at the collegiate level. Real-time head accelerations were recorded from helmet-mounted accelerometers designed to stay in contact with the player’s head. Over 27,000 head impacts were recorded, including four impacts resulting in MTBI. Parametric risk curves were developed by normalizing MTBI incidence data by head impact exposure data. An important finding of this research is that living humans, at least in the setting of collegiate football, sustain much more significant head impacts without apparent injury than previously thought. The following preliminary nominal injury assessment reference values associated with a 10% risk of MTBI are proposed: a peak linear head acceleration of 165 g, a HIC of 400, and a peak angular head acceleration of 9000 rad/s². PMID:18184501
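
    The normalization step, dividing MTBI incidence by head-impact exposure within each acceleration bin, is straightforward to sketch. The bin counts below are invented for illustration and are not the study's data:

```python
# hypothetical head-impact counts binned by peak linear acceleration (g);
# in the actual study only 4 MTBI cases occurred among >27,000 impacts
labels    = ["<60", "60-100", "100-140", "140-180", ">=180"]
exposures = [20000,    5000,     1800,      180,       20]
injuries  = [    0,       0,        1,        2,        1]

# empirical risk per bin: MTBI incidence normalized by exposure
risk = [i / n for i, n in zip(injuries, exposures)]
for label, r in zip(labels, risk):
    print(f"{label:>8} g: risk = {r:.2%}")
```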

  17. State of the art coronary heart disease risk estimations based on the Framingham heart study.

    PubMed

    Reissigová, J; Tomecková, M

    2005-12-01

    The aim was to review the most interesting articles dealing with estimation of an individual's absolute coronary heart disease risk based on the Framingham heart study. Besides the Framingham coronary heart disease risk functions, results of validation studies of these functions are discussed. In general, the Framingham risk functions overestimated an individual's absolute risk in external (non-Framingham) populations with a lower occurrence of coronary heart disease than the Framingham population, and underestimated it in populations with a higher occurrence of coronary heart disease. Even though the calibration accuracy of the Framingham risk functions was not satisfactory, the functions were able to rank individuals according to risk from low-risk to high-risk groups, with a discrimination ability of 60% and more. PMID:16419382

  18. Estimation of flood discharges at selected annual exceedance probabilities for unregulated, rural streams in Vermont, with a section on Vermont regional skew regression

    USGS Publications Warehouse

    Olson, Scott A.; with a section by Veilleux, Andrea G.

    2014-01-01

    This report provides estimates of flood discharges at selected annual exceedance probabilities (AEPs) for streamgages in and adjacent to Vermont and equations for estimating flood discharges at AEPs of 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent (recurrence intervals of 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-years, respectively) for ungaged, unregulated, rural streams in Vermont. The equations were developed using generalized least-squares regression. Flood-frequency and drainage-basin characteristics from 145 streamgages were used in developing the equations. The drainage-basin characteristics used as explanatory variables in the regression equations include drainage area, percentage of wetland area, and the basin-wide mean of the average annual precipitation. The average standard errors of prediction for estimating the flood discharges at the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEP with these equations are 34.9, 36.0, 38.7, 42.4, 44.9, 47.3, 50.7, and 55.1 percent, respectively. Flood discharges at selected AEPs for streamgages were computed by using the Expected Moments Algorithm. To improve estimates of the flood discharges for given exceedance probabilities at streamgages in Vermont, a new generalized skew coefficient was developed. The new generalized skew for the region is a constant, 0.44. The mean square error of the generalized skew coefficient is 0.078. This report describes a technique for using results from the regression equations to adjust an AEP discharge computed from a streamgage record. This report also describes a technique for using a drainage-area adjustment to estimate flood discharge at a selected AEP for an ungaged site upstream or downstream from a streamgage. The final regression equations and the flood-discharge frequency data used in this study will be available in StreamStats. StreamStats is a World Wide Web application providing automated regression-equation solutions for user-selected sites on streams.
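
    The drainage-area adjustment described above is commonly applied as a power of the drainage-area ratio; a hedged sketch in which the exponent and discharges are illustrative, not Vermont regression values:

```python
def area_adjusted_discharge(q_gage, a_gage, a_ungaged, exponent=0.7):
    """Transfer an AEP discharge from a streamgage to an ungaged site on the
    same stream via a drainage-area ratio raised to a regression exponent
    (the 0.7 here is illustrative, not from this report)."""
    return q_gage * (a_ungaged / a_gage) ** exponent

# hypothetical: a 1-percent AEP discharge of 5000 ft3/s at a gage draining
# 120 mi2, transferred to an ungaged site draining 80 mi2
print(round(area_adjusted_discharge(5000.0, 120.0, 80.0)))
```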

  19. Vanishing glaciers, degrading permafrost, new lakes and increasing probability of extreme floods from impact waves - a need for long-term risk reduction concerning high-mountain regions

    NASA Astrophysics Data System (ADS)

    Haeberli, Wilfried; Schaub, Yvonne; Huggel, Christian; Boeckli, Lorenz

    2013-04-01

    As a consequence of continued global warming, rapid and fundamental changes are taking place in high-mountain regions. Within decades only, many still-existing glacier landscapes will probably transform into new and strongly different landscapes of bare bedrock, loose debris, numerous lakes and sparse vegetation. These new landscapes are then likely to persist for centuries if not millennia to come. During variable but mostly extended parts of this future time period, they will be characterised by pronounced disequilibria within their geo- and ecosystems. Such disequilibria include a long-term stability reduction of steep/icy mountain slopes as a slow and delayed reaction to stress redistribution following de-buttressing by vanishing glaciers, and to changes in strength and hydraulic permeability caused by permafrost warming and degradation. With the formation of many new lakes in close neighbourhood to, or even directly at the foot of, so-affected slopes, the probability of far-reaching flood waves from large rock falls into lakes is likely to increase for extended time periods. Quantitative information for anticipating possible developments exists in the European Alps. The present (2011) glacier cover is some 1800 km², the still-existing total ice volume 80 ± 20 km³, and the average loss rate about 2 km³ of ice per year. The permafrost area has recently been estimated at some 3000 km² with a total subsurface ice volume of 25 ± 2 km³; loss rates are hardly known but are certainly much smaller than for glaciers, probably by at least a factor of 10. Based on a detailed study for the Swiss Alps, total future lake volume may be assumed to be a few percent of the presently remaining glacier volume, i.e., a few km³ for the entire Alps. Forward projection of such numbers indicates that glacier volumes will vanish much more rapidly than volumes of subsurface ice in permafrost, while lake volumes are likely to steadily increase.

  20. Risk Factors of Delay Proportional Probability in Diphtheria-tetanus-pertussis Vaccination of Iranian Children; Life Table Approach Analysis.

    PubMed

    Mokhtari, Mohsen; Rezaeimanesh, Masoomeh; Mohammadbeigi, Abolfazl; Zahraei, Seyed Mohsen; Mohammadsalehi, Narges; Ansari, Hossein

    2015-01-01

    Despite the success of the Expanded Program on Immunization in increasing vaccination coverage among the world's children, timeliness and scheduling of vaccination remain public health challenges. This study aimed to identify factors related to delayed diphtheria-tetanus-pertussis (DTP) vaccination using a life table approach. A historical cohort study was conducted in poor areas of five large Iranian cities. In total, 3610 children aged 24-47 months who had a documented vaccination card were enrolled. The time of vaccination for the third dose of DTP vaccine (DTP3) was calculated. Life table survival analysis was used to calculate the proportional probability of vaccination at each time, and the Wilcoxon test was used to compare the proportional probability of delayed vaccination across study factors. The overall median delay for DTP3 was 38.52 days. The Wilcoxon test showed that city, nationality, parental education level, birth order and rural residence were related to a high probability of delayed DTP3 vaccination (P < 0.001), whereas child gender and parents' occupations were not significant factors (P > 0.05). Distance from the capital and high concentrations of immigrants of low socioeconomic class at city margins lead to prolonged delays in DTP vaccination. Special attention to these areas is needed to raise parental knowledge and to facilitate access to health care services. PMID:26752871
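
    The life-table computation of proportional probabilities per interval can be sketched as follows; the interval counts are invented for illustration and are not the study's data:

```python
# hypothetical life table for time to the third DTP dose (days beyond schedule):
# (interval, children entering the interval, children vaccinated within it)
table = [("0-14", 3610, 1200), ("15-30", 2410, 800),
         ("31-60", 1610, 900), (">60", 710, 710)]

surviving = 1.0   # proportion still unvaccinated at the start of each interval
cumulative = []   # cumulative proportion vaccinated by the end of each interval
for label, at_risk, vaccinated in table:
    q = vaccinated / at_risk          # conditional probability within interval
    surviving *= 1.0 - q
    cumulative.append(1.0 - surviving)
    print(f"{label:>6} days: conditional {q:.3f}, cumulative {cumulative[-1]:.3f}")
```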

  1. Use of risk projection models to estimate mortality and incidence from radiation-induced breast cancer in screening programs

    NASA Astrophysics Data System (ADS)

    Ramos, M.; Ferrer, S.; Villaescusa, J. I.; Verdú, G.; Salas, M. D.; Cuevas, M. D.

    2005-02-01

    The authors report on a method to calculate radiological risks, applicable to breast screening programs and other controlled medical exposures to ionizing radiation. In particular, it has been applied to make a risk assessment in the Valencian Breast Cancer Early Detection Program (VBCEDP) in Spain. This method is based on a parametric approach, through Markov processes, of hazard functions for radio-induced breast cancer incidence and mortality, with mean glandular breast dose, attained age and age-at-exposure as covariates. Excess relative risk functions of breast cancer mortality have been obtained from two different case-control studies exposed to ionizing radiation, with different follow-up time: the Canadian Fluoroscopy Cohort Study (1950-1987) and the Life Span Study (1950-1985 and 1950-1990), whereas relative risk functions for incidence have been obtained from the Life Span Study (1958-1993), the Massachusetts tuberculosis cohorts (1926-1985 and 1970-1985), the New York post-partum mastitis patients (1930-1981) and the Swedish benign breast disease cohort (1958-1987). Relative risks from these cohorts have been transported to the target population undergoing screening in the Valencian Community, a region in Spain with about four and a half million inhabitants. The SCREENRISK software has been developed to estimate radiological detriments in breast screening. Some hypotheses corresponding to different screening conditions have been considered in order to estimate the total risk associated with a woman who takes part in all screening rounds. In the case of the VBCEDP, the total radio-induced risk probability for fatal breast cancer is in a range between [5 × 10⁻⁶, 6 × 10⁻⁴] versus the natural rate of dying from breast cancer in the Valencian Community which is 9.2 × 10⁻³. The results show that these indicators could be included in quality control tests and could be adequate for making comparisons between several screening programs.

  2. Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.

    NASA Astrophysics Data System (ADS)

    Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

    2014-05-01

    -SISCOR Working Group. On the basis of this consensual logic tree, median probability of occurrences of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage time and Weibull probability distributions) were also explored. The probability of a M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones), cover the median values estimated following the fault-based approach. On the contrary, the fault-based approach in this region is still affected by a high degree of uncertainty, because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.
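
    For the time-dependent renewal models mentioned (e.g. a Weibull distribution of inter-event times), the conditional probability of an M>=6 event in the next Δt years, given the time elapsed since the last event, follows directly from the distribution function. A generic sketch with illustrative parameters, not values from the working group:

```python
import math

def weibull_cdf(t, shape, scale):
    return 1.0 - math.exp(-((t / scale) ** shape))

def conditional_probability(t_elapsed, dt, shape, scale):
    """P(event in (t, t+dt] | no event by t) under a Weibull renewal model."""
    f_t  = weibull_cdf(t_elapsed, shape, scale)
    f_dt = weibull_cdf(t_elapsed + dt, shape, scale)
    return (f_dt - f_t) / (1.0 - f_t)

# illustrative: scale ~200 yr, shape 2, 150 yr elapsed, window of 30 years
print(round(conditional_probability(150.0, 30.0, 2.0, 200.0), 3))  # ~0.219
```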

  3. How reliable are the risk estimates for X-ray examinations in forensic age estimations? A safety update.

    PubMed

    Ramsthaler, F; Proschek, P; Betz, W; Verhoff, M A

    2009-05-01

    Possible biological side effects of exposure to X-rays are stochastic effects such as carcinogenesis and genetic alterations. In recent years, a number of new studies have been published about the special cancer risk that children may incur from diagnostic X-rays. Children and adolescents, who constitute many of the probands in forensic age-estimation proceedings, are considerably more sensitive to the carcinogenic risks of ionizing radiation than adults. Established doses for X-ray examinations in forensic age estimations vary from less than 0.1 microSv (left hand X-ray) up to more than 800 microSv (computed tomography). Computed tomography in children, as a relatively high-dose procedure, is of particular interest because the doses involved are near the lower limit of the doses observed and analyzed in A-bomb survivor studies, from which direct epidemiological data exist concerning lifetime cancer risk. Since there is no medical indication for forensic age examinations, it should be stressed that only safe methods are generally acceptable. This paper reviews current knowledge on cancer risks associated with diagnostic radiation and aims to help forensic experts, dentists, and pediatricians evaluate the risk from radiation when using X-rays in age-estimation procedures. PMID:19153756
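
    To put the quoted doses in perspective, effective dose is often multiplied by a nominal lifetime-risk coefficient. The coefficient below is an assumed ICRP-style population-average value, not a figure from this article, and real risks are strongly age- and organ-dependent:

```python
# assumed nominal lifetime cancer-risk coefficient, roughly 5% per sievert
# (an ICRP-style population average, used here only for rough scaling)
RISK_PER_SV = 0.05

def nominal_lifetime_risk(dose_microsv):
    return dose_microsv * 1e-6 * RISK_PER_SV

# doses quoted in the abstract for forensic age-estimation examinations
for exam, dose in [("hand X-ray", 0.1), ("computed tomography", 800.0)]:
    print(f"{exam}: ~{nominal_lifetime_risk(dose):.0e} lifetime risk")
```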

  4. Improved seismic risk estimation for Bucharest, based on multiple hazard scenarios, analytical methods and new techniques

    NASA Astrophysics Data System (ADS)

    Toma-Danila, Dragos; Florinela Manea, Elena; Ortanza Cioflan, Carmen

    2014-05-01

    Bucharest, the capital of Romania (1,678,000 inhabitants in 2011), is one of the big cities in Europe most exposed to seismic damage. The major earthquakes affecting the city originate in the Vrancea region. The Vrancea intermediate-depth source generates, statistically, 2-3 shocks with moment magnitude >7.0 per century. Although the focal distance is greater than 170 km, the historical records (from the 1838, 1894, 1908, 1940 and 1977 events) reveal severe effects in the Bucharest area, e.g. intensity IX (MSK) in the 1940 event. During the 1977 earthquake, 1420 people were killed and 33 large buildings collapsed. The present-day building stock is vulnerable both to construction factors (material, age) and to soil conditions (high amplification generated within weakly consolidated Quaternary deposits, whose thickness varies from 250 to 500 m across the city). Of 2563 buildings evaluated by experts, 373 old buildings are likely to experience severe damage or collapse in the next major earthquake. The total number of residential buildings in 2011 was 113,900. In order to guide mitigation measures, different studies have tried to estimate the seismic risk of Bucharest, in terms of buildings, population or probable economic damage. Unfortunately, most were based on incomplete datasets, regarding either the hazard or the building stock in detail. However, during the DACEA Project, the National Institute for Earth Physics, together with the Technical University of Civil Engineering Bucharest and the NORSAR Institute, compiled a database of buildings in southern Romania (according to the 1999 census), with 48 associated capacity and fragility curves. Until now, the real-time estimation system developed there had not been implemented for Bucharest. This paper presents more than an adaptation of this system to Bucharest: first, we analyze the previous seismic risk studies from a SWOT perspective.

  5. Medical Updates Number 5 to the International Space Station Probability Risk Assessment (PRA) Model Using the Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Doug; Bauman, David; Johnson-Throop, Kathy

    2011-01-01

    The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.

  6. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment, methods can be divided into two families: deterministic and probabilistic. In the French hydrologic community, probabilistic methods have historically been preferred to deterministic ones. A French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The objective of this project is to compare the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimates than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, remaining relatively moderate up to the 100-year return level.
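
    For the standard flood frequency analysis mentioned (a Gumbel fit), the T-year quantile has a closed form, which also illustrates why method differences grow with return period. Parameters are illustrative, not from the EXTRAFLO watersheds:

```python
import math

def gumbel_quantile(return_period, loc, scale):
    """T-year quantile of a fitted Gumbel distribution:
    Q_T = loc - scale * ln(-ln(1 - 1/T))."""
    p_non_exceedance = 1.0 - 1.0 / return_period
    return loc - scale * math.log(-math.log(p_non_exceedance))

# illustrative location/scale parameters (m3/s)
for T in (10, 100, 1000):
    print(f"{T:>5}-year flood: {gumbel_quantile(T, loc=500.0, scale=150.0):.0f} m3/s")
```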

  7. Estimated probability density functions for the times between flashes in the storms of 12 September 1975, 26 August 1975, and 13 July 1976

    NASA Technical Reports Server (NTRS)

    Tretter, S. A.

    1977-01-01

    A report is given to supplement the progress report of June 17, 1977. In that progress report, gamma, lognormal, and Rayleigh probability density functions were fitted to the times between lightning flashes in the storms of 9/12/75, 8/26/75, and 7/13/76 by the maximum likelihood method, with goodness of fit checked by the Kolmogorov-Smirnov test. Plots of the estimated densities along with normalized histograms are included to provide a visual check on the goodness of fit. The lognormal densities are the most peaked and have the highest tails; this results in the best fit to the normalized histogram in most cases. The Rayleigh densities have peaks too broad and rounded to give good fits, and they also have the lowest tails. The gamma densities fall in between and give the best fit in a few cases.
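
    The fit-and-test procedure described (maximum-likelihood fits checked with the Kolmogorov-Smirnov statistic) can be reproduced with scipy on synthetic inter-flash times; the data here are simulated, not the 1975-1976 storm records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# synthetic times between flashes (seconds), drawn lognormal for illustration
times = rng.lognormal(mean=3.0, sigma=0.8, size=500)

# fit each candidate density by maximum likelihood, then check goodness of fit
ks_stats = {}
for name, dist in [("gamma", stats.gamma), ("lognorm", stats.lognorm),
                   ("rayleigh", stats.rayleigh)]:
    params = dist.fit(times, floc=0)   # fix location at 0 for waiting times
    ks_stats[name] = stats.kstest(times, dist.cdf, args=params).statistic
    print(f"{name:8s} KS statistic = {ks_stats[name]:.3f}")
```

As in the report, the lognormal fit should yield the smallest K-S statistic on data of this shape.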

  8. Hate crimes and stigma-related experiences among sexual minority adults in the United States: prevalence estimates from a national probability sample.

    PubMed

    Herek, Gregory M

    2009-01-01

    Using survey responses collected via the Internet from a U.S. national probability sample of gay, lesbian, and bisexual adults (N = 662), this article reports prevalence estimates of criminal victimization and related experiences based on the target's sexual orientation. Approximately 20% of respondents reported having experienced a person or property crime based on their sexual orientation; about half had experienced verbal harassment, and more than 1 in 10 reported having experienced employment or housing discrimination. Gay men were significantly more likely than lesbians or bisexuals to experience violence and property crimes. Employment and housing discrimination were significantly more likely among gay men and lesbians than among bisexual men and women. Implications for future research and policy