Sample records for probabilistic exposure factor

  1. PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT

    EPA Science Inventory

    Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...

  2. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; LeVeque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function(Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function(Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
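
    As a minimal illustration of the risk-triangle decomposition quoted above (a sketch, not taken from the cited works; every number and the vulnerability curve are hypothetical), the snippet below combines scenario hazard probabilities with an assumed vulnerability function and exposed asset value to produce an annualized expected loss:

      # Minimal sketch of Crichton's risk triangle: Risk = f(Hazard, Vulnerability, Exposure).
      # All numbers are hypothetical; hazard is a set of scenarios with annual probabilities.
      hazard = [                # (annual probability, tsunami flow depth in m)
          (0.01, 2.0),          # "1:100" event
          (0.001, 5.0),         # "1:1,000" event
          (0.0001, 8.0),        # "1:10,000" event
      ]

      def vulnerability(depth_m):
          """Assumed damage fraction as a function of flow depth (illustrative only)."""
          return min(1.0, depth_m / 10.0)

      exposure_value = 50e6     # value of assets in the inundation zone (USD, assumed)

      # Annualized expected loss: sum over scenarios of p * V(h) * E.
      risk = sum(p * vulnerability(h) * exposure_value for p, h in hazard)
      print(f"Annualized expected loss: ${risk:,.0f}")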

  3. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments

    PubMed Central

    Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.

    2018-01-01

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
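
    A sketch of the kind of Monte Carlo comparison this abstract describes, with placeholder lognormal inputs standing in for the AHED/PHED and Exposure Factors Handbook data (all parameter values below are assumptions, not the study's):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Illustrative lognormal inputs (placeholders, not actual AHED/PHED statistics):
      unit_exposure = rng.lognormal(np.log(0.02), 1.2, n)      # mg per lb a.i. handled
      amount_handled = rng.lognormal(np.log(40.0), 0.6, n)     # lb a.i. handled per day
      body_weight = rng.normal(80.0, 12.0, n).clip(40, None)   # kg, handbook-type data

      dose = unit_exposure * amount_handled / body_weight      # mg kg^-1 day^-1

      noel = 1.0          # assumed no-observable-effect level, mg kg^-1 day^-1
      target_moe = 100    # margin of exposure triggering a level of concern

      moe = noel / dose
      pct = 100 * np.mean(moe < target_moe)
      print(f"{pct:.1f}% of simulated workers exceed the level of concern")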

  4. Comparative Probabilistic Assessment of Occupational Pesticide Exposures Based on Regulatory Assessments.

    PubMed

    Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A

    2017-11-06

    Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.

  5. INTEGRATED PROBABILISTIC AND DETERMINISTIC MODELING TECHNIQUES IN ESTIMATING EXPOSURE TO WATER-BORNE CONTAMINANTS: PART 1 EXPOSURE MODELING

    EPA Science Inventory

    Exposure to contaminants originating in the domestic water supply is influenced by a number of factors, including human activities, water use behavior, and physical and chemical processes. The key role of human activities is very apparent in exposure related to volatile water-...

  6. The benefits of probabilistic exposure assessment: three case studies involving contaminated air, water, and soil.

    PubMed

    Finley, B; Paustenbach, D

    1994-02-01

    Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
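
    The compounding-conservatism effect described here is easy to reproduce. The sketch below (illustrative, not the paper's analysis) multiplies ten independent lognormal exposure variables and locates the all-95th-percentile point estimate within the Monte Carlo output distribution:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n_vars, n = 10, 200_000

      # Ten independent lognormal exposure variables (illustrative parameters).
      gsd = 2.0
      samples = rng.lognormal(0.0, np.log(gsd), size=(n, n_vars))

      mc_product = samples.prod(axis=1)                              # probabilistic estimate
      point_product = stats.lognorm.ppf(0.95, np.log(gsd)) ** n_vars # product of 95th percentiles

      pctl = 100 * np.mean(mc_product <= point_product)
      print(f"The all-95th-percentile point estimate falls at the {pctl:.2f}th "
            f"percentile of the Monte Carlo distribution")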

  7. Feasibility study on the use of probabilistic migration modeling in support of exposure assessment from food contact materials.

    PubMed

    Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy

    2010-07-01

    The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest, but the lack of concentration or migration data is often cited as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different time of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
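
    A minimal sketch of this approach, assuming a short-time Crank solution for migration out of a polymer film with an Arrhenius-type temperature dependence (every parameter value below is an assumption for illustration, not the paper's Irgafos 168 data):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 50_000

      # Assumed inputs (placeholders, not the paper's data):
      c_p0, rho, L = 500.0, 940.0, 1e-4   # polymer conc. (mg kg^-1), density (kg m^-3), film thickness (m)
      vol_ratio = 100.0                   # food-to-polymer volume ratio, assumed
      t = rng.uniform(5, 180, n) * 86_400 # time to consumption, s
      T = rng.normal(296.0, 4.0, n)       # storage temperature, K
      D = 1e-16 * np.exp(-8000.0 * (1.0 / T - 1.0 / 296.0))   # diffusion coeff., m^2 s^-1 (Arrhenius form)
      K = rng.lognormal(np.log(100.0), 0.8, n)                # polymer/food partition coefficient

      # Short-time Crank solution for a film, capped at the partition-limited equilibrium:
      m_short = 2.0 * c_p0 * rho * np.sqrt(D * t / np.pi)     # mg m^-2
      m_eq = c_p0 * rho * L * vol_ratio / (K + vol_ratio)     # mg m^-2
      migration = np.minimum(m_short, m_eq)

      # Rank correlations flag which inputs drive the variance in migration estimates.
      for name, x in [("time", t), ("temperature", T), ("partition K", K)]:
          print(f"Spearman rho, {name}: {stats.spearmanr(x, migration).correlation:+.2f}")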

  8. Mixing zone and drinking water intake dilution factor and wastewater generation distributions to enable probabilistic assessment of down-the-drain consumer product chemicals in the U.S.

    PubMed

    Kapo, Katherine E; McDonough, Kathleen; Federle, Thomas; Dyer, Scott; Vamshi, Raghu

    2015-06-15

    Environmental exposure and associated ecological risk related to down-the-drain chemicals discharged by municipal wastewater treatment plants (WWTPs) are strongly influenced by in-stream dilution of receiving waters, which varies by geography, flow conditions and upstream wastewater inputs. The iSTREEM® model (American Cleaning Institute, Washington D.C.) was utilized to determine probabilistic distributions for no-decay and decay-based dilution factors in mean annual and low (7Q10) flow conditions. The dilution factors derived in this study are "combined" dilution factors which account for both hydrologic dilution and cumulative upstream effluent contributions that will differ depending on the rate of in-stream decay due to biodegradation, volatilization, sorption, etc. for the chemical being evaluated. The median dilution factors estimated in this study (based on various in-stream decay rates from zero decay to a 1-h half-life) for WWTP mixing zones dominated by domestic wastewater flow ranged from 132 to 609 at mean flow and 5 to 25 at low flow, while median dilution factors at drinking water intakes (mean flow) ranged from 146 to 2×10⁷ depending on the in-stream decay rate. WWTPs within the iSTREEM® model were used to generate a distribution of per capita wastewater generated in the U.S. The dilution factor and per capita wastewater generation distributions developed by this work can be used to conduct probabilistic exposure assessments for down-the-drain chemicals in influent wastewater, wastewater treatment plant mixing zones and at drinking water intakes in the conterminous U.S. In addition, evaluation of types and abundance of U.S. wastewater treatment processes provided insight into treatment trends and the flow volume treated by each type of process. Moreover, removal efficiencies of chemicals can differ by treatment type. Hence, the availability of distributions for per capita wastewater production, treatment type, and dilution factors at a national level provides a series of practical and powerful tools for building probabilistic exposure models. Copyright © 2015 Elsevier B.V. All rights reserved.
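
    A toy version of a "combined" dilution factor, assuming simple flow mixing plus first-order decay of upstream effluent contributions (the flow distributions and the upstream effluent fraction are invented for illustration, not iSTREEM outputs):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Illustrative flow inputs (placeholders, not iSTREEM data):
      q_stream = rng.lognormal(np.log(20.0), 1.0, n)     # upstream flow, m^3 s^-1
      q_effluent = rng.lognormal(np.log(0.5), 0.7, n)    # WWTP effluent flow, m^3 s^-1

      # Upstream water may already carry the chemical from upstream discharges;
      # first-order in-stream decay over the travel time reduces that contribution.
      k = np.log(2) / 3600.0                             # decay rate for a 1-h half-life, s^-1
      travel_time = rng.uniform(0, 24, n) * 3600.0       # s, assumed
      upstream_frac = 0.2 * np.exp(-k * travel_time)     # surviving upstream fraction (0.2 assumed)

      # Combined dilution factor: effluent concentration over mixed-river concentration.
      c_mix = (q_effluent + upstream_frac * q_stream) / (q_stream + q_effluent)
      df = 1.0 / c_mix
      print(f"median combined dilution factor: {np.median(df):.0f}")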

  9. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
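
    A compact sketch of the iterative sampling method referred to here, applied to exposure misclassification in a hypothetical case-control table: sensitivity and specificity are drawn from assumed priors and used to back-correct the observed counts (a simplified version that omits the conventional random-error step):

      import numpy as np

      rng = np.random.default_rng(0)

      # Observed case-control counts (hypothetical): exposed / unexposed.
      a, b = 120, 380   # cases
      c, d = 90, 410    # controls

      ors = []
      for _ in range(50_000):
          # Priors on classification parameters (assumed); differential by case status.
          se_ca, sp_ca = rng.uniform(0.75, 0.95), rng.uniform(0.90, 0.99)
          se_co, sp_co = rng.uniform(0.70, 0.95), rng.uniform(0.90, 0.99)

          # Back-correct expected true counts from the misclassified ones:
          # observed a = Se*A + (1 - Sp)*(N1 - A), solved for A.
          A = (a - (1 - sp_ca) * (a + b)) / (se_ca + sp_ca - 1)
          C = (c - (1 - sp_co) * (c + d)) / (se_co + sp_co - 1)
          B, D = (a + b) - A, (c + d) - C
          if min(A, B, C, D) <= 0:        # discard impossible combinations
              continue
          ors.append((A * D) / (B * C))

      ors = np.array(ors)
      print(f"conventional OR: {(a * d) / (b * c):.2f}")
      print(f"bias-adjusted OR, median [2.5th, 97.5th]: {np.median(ors):.2f} "
            f"[{np.percentile(ors, 2.5):.2f}, {np.percentile(ors, 97.5):.2f}]")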

  10. A PROBABILISTIC POPULATION EXPOSURE MODEL FOR PM10 AND PM 2.5

    EPA Science Inventory

    A first-generation probabilistic population exposure model for Particulate Matter (PM), specifically for predicting PM10 and PM2.5 exposures of an urban population, has been developed. This model is intended to be used to predict exposure (magnitude, frequency, and duration) ...

  11. ANALYSIS OF CONCORDANCE OF PROBABILISTIC AGGREGATE EXPOSURE PREDICTIONS WITH OBSERVED BIOMONITORING RESULTS: AN EXAMPLE USING CTEPP DATA

    EPA Science Inventory

    Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...

  12. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  13. Impact of refining the assessment of dietary exposure to cadmium in the European adult population.

    PubMed

    Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan

    2013-01-01

    Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
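
    The middle-bound handling of left-censored occurrence data and the deterministic-versus-probabilistic contrast can be sketched as follows (all distributions and the LOD are invented; only the tolerable weekly intake of 2.5 µg kg⁻¹ bw comes from the abstract):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000
      twi = 2.5                 # tolerable weekly intake, µg kg^-1 bw week^-1

      # Hypothetical occurrence data (µg Cd kg^-1 food), left-censored at LOD = 10.
      raw = rng.lognormal(np.log(30.0), 0.9, size=2_000)
      lod = 10.0
      mb = np.where(raw < lod, lod / 2, raw)   # middle bound: substitute LOD/2 for non-detects

      # Weekly consumption of contributing foods, kg kg^-1 bw week^-1 (assumed).
      consumption = rng.lognormal(np.log(0.05), 0.45, n)

      deterministic = mb.mean() * consumption         # single (mean) occurrence value
      probabilistic = rng.choice(mb, n) * consumption # full occurrence distribution

      for name, x in [("deterministic", deterministic), ("probabilistic", probabilistic)]:
          print(f"{name:13s} mean {x.mean():.2f}, P95 {np.percentile(x, 95):.2f}, "
                f"% above TWI {100 * np.mean(x > twi):.1f}")

    As in the abstract, the means of the two approaches are nearly identical while the upper percentiles, and hence the fraction above the TWI, shift when the full occurrence distribution is used.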

  14. A PROBABILISTIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CCA-TREATED PLAYSETS AND DECKS USING THE STOCHASTIC HUMAN EXPOSURE AND DOSE SIMULATION (SHEDS) MODEL FOR THE WOOD PRESERVATIVE EXPOSURE SCENARIO

    EPA Science Inventory

    The U.S. Environmental Protection Agency has conducted a probabilistic exposure and dose assessment on the arsenic (As) and chromium (Cr) components of Chromated Copper Arsenate (CCA) using the Stochastic Human Exposure and Dose Simulation model for wood preservatives (SHEDS-Wood...

  15. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    EPA Science Inventory

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  16. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R. G.; Biller, W. F.; Jusko, M. J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  17. Probabilistic Reverse dOsimetry Estimating Exposure Distribution (PROcEED)

    EPA Pesticide Factsheets

    PROcEED is a web-based application used to conduct probabilistic reverse dosimetry calculations. The tool is used for estimating a distribution of exposure concentrations likely to have produced biomarker concentrations measured in a population.
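
    The general reverse-dosimetry idea (not PROcEED's actual implementation) is a division of distributions: measured biomarker levels divided by a pharmacokinetic-model-derived biomarker-per-unit-exposure ratio. A minimal sketch with invented numbers:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Measured biomarker concentrations in a population (hypothetical), µg L^-1 urine.
      biomarker = rng.lognormal(np.log(2.0), 0.8, n)

      # Forward-dosimetry ratio: biomarker produced per unit air concentration,
      # a hypothetical distribution standing in for PK-model output.
      ratio = rng.lognormal(np.log(0.5), 0.4, n)   # (µg L^-1) per (µg m^-3)

      # Reverse dosimetry: divide to recover the exposure-concentration distribution.
      exposure_conc = biomarker / ratio            # µg m^-3
      print(f"median {np.median(exposure_conc):.2f}, "
            f"P95 {np.percentile(exposure_conc, 95):.2f} µg m^-3")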

  18. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PROBABILISTIC APPROACH OF EXPOSURE CALCULATION OF DERMAL EXPOSURE (IIT-A-13.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate the dermal exposure using a probabilistic approach. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Labo...

  19. Assessing the inhalation cancer risk of particulate matter bound polycyclic aromatic hydrocarbons (PAHs) for the elderly in a retirement community of a mega city in North China.

    PubMed

    Han, Bin; Liu, Yating; You, Yan; Xu, Jia; Zhou, Jian; Zhang, Jiefeng; Niu, Can; Zhang, Nan; He, Fei; Ding, Xiao; Bai, Zhipeng

    2016-10-01

    Assessment of the health risks resulting from exposure to ambient polycyclic aromatic hydrocarbons (PAHs) is limited by the lack of environmental exposure data among different subpopulations. To assess the cancer risk from exposure to particulate carcinogenic PAH pollution for the elderly, this study conducted a personal exposure measurement campaign for particulate PAHs in a community of Tianjin, a city in northern China. Personal exposure samples were collected from the elderly in non-heating (August-September, 2009) and heating (November-December, 2009) periods, and 12 individual PAHs were analyzed for risk estimation. A questionnaire and a time-activity log were also recorded for each person. The probabilistic risk assessment model was integrated with Toxic Equivalent Factors (TEFs). Because the estimated applied dose for a given air pollutant depends on the inhalation rate, inhalation rates from the EPA Exposure Factors Handbook were applied to calculate the carcinogenic risk in this study. Monte Carlo simulation was used as a probabilistic risk assessment model, and risk simulation results indicated that the inhalation ILCR values for male and female subjects followed lognormal distributions with means of 4.81 × 10⁻⁶ and 4.57 × 10⁻⁶, respectively. Furthermore, the 95% probability lung cancer risks were greater than the USEPA acceptable level of 10⁻⁶ for both men and women through the inhalation route, revealing that exposure to PAHs posed an unacceptable potential cancer risk for the elderly in this study. As a result, measures should be taken to reduce PAH pollution and exposure levels to decrease the cancer risk for the general population, especially the elderly.
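
    A sketch of a TEF-weighted inhalation risk calculation of this general form (the concentrations, slope factor, and all distribution parameters below are assumptions for illustration, not the study's measurements):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Hypothetical personal-exposure concentrations (ng m^-3) and standard TEFs.
      tef  = {"BaP": 1.0, "DahA": 1.0, "BbF": 0.1, "BaA": 0.1, "Chr": 0.01}
      gm   = {"BaP": 2.0, "DahA": 0.5, "BbF": 3.0, "BaA": 2.5, "Chr": 4.0}
      conc = {k: rng.lognormal(np.log(gm[k]), 0.6, n) for k in tef}

      bap_eq = sum(tef[k] * conc[k] for k in tef)        # BaP-equivalent conc., ng m^-3

      ir = rng.normal(13.0, 2.5, n).clip(5, None)        # inhalation rate, m^3 day^-1
      bw = rng.normal(62.0, 10.0, n).clip(35, None)      # body weight, kg
      ed, at = 30 * 365, 70 * 365                        # exposure duration / averaging time, days
      csf = 3.14                                         # assumed slope factor, (mg kg^-1 day^-1)^-1

      ladd = bap_eq * 1e-6 * ir / bw * (ed / at)         # lifetime average daily dose, mg kg^-1 day^-1
      ilcr = csf * ladd                                  # incremental lifetime cancer risk

      print(f"mean ILCR {ilcr.mean():.2e}; "
            f"P(ILCR > 1e-6) = {100 * np.mean(ilcr > 1e-6):.0f}%")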

  20. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  1. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  2. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PROBABILISTIC APPROACH FOR CALCULATING INGESTION EXPOSURE FROM DAY 4 COMPOSITE MEASUREMENTS, THE DIRECT METHOD OF EXPOSURE ESTIMATION (IIT-A-15.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate the ingestion exposure using composite food chemical residue values from the day of direct measurements. The calculation is based on the probabilistic approach. This SOP uses data that have been proper...

  3. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    EPA Science Inventory

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  4. EXPERIENCES WITH USING PROBABILISTIC EXPOSURE ANALYSIS METHODS IN THE U.S. EPA

    EPA Science Inventory

    Over the past decade various Offices and Programs within the U.S. EPA have either initiated or increased the development and application of probabilistic exposure analysis models. These models have been applied to a broad range of research or regulatory problems in EPA, such as e...

  5. Probabilistic exposure assessment to face and oral care cosmetic products by the French population.

    PubMed

    Bernard, A; Dornic, N; Roudot, AC; Ficheux, AS

    2018-01-01

    Cosmetic exposure data for face and mouth are limited in Europe. The aim of the study was to assess the exposure to face cosmetics using recent French consumption data (Ficheux et al., 2016b, 2015). Exposure was assessed using a probabilistic method for thirty-one face products from four product lines (cleanser, care, make-up and make-up remover) and for two oral care products. Probabilistic exposure was assessed for different subpopulations according to sex and age in adults and children. Pregnant women were also studied. The levels of exposure to moisturizing cream, lip balm, mascara, eyeliner, cream foundation, toothpaste and mouthwash were higher than the values currently used by the Scientific Committee on Consumer Safety (SCCS). Exposure values found for eye shadow, lipstick, lotion and milk (make-up remover) were lower than SCCS values. These new French exposure values will be useful for safety assessors and safety agencies in order to protect the general population and at-risk populations. Copyright © 2017. Published by Elsevier Ltd.

  6. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...

  7. Frontal and Parietal Contributions to Probabilistic Association Learning

    PubMed Central

    Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke

    2011-01-01

    Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842

  8. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PROBABILISTIC APPROACH FOR ESTIMATING INHALATION EXPOSURES TO CHLORPYRIFOS AND DIAZINON (IIT-A-14.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate the inhalation exposures to chlorpyrifos and diazinon using the probabilistic approach. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University...

  10. Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students

    ERIC Educational Resources Information Center

    Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.

    2011-01-01

    This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…

  11. Aggregate exposure approaches for parabens in personal care products: a case assessment for children between 0 and 3 years old

    PubMed Central

    Gosens, Ilse; Delmaar, Christiaan J E; ter Burg, Wouter; de Heer, Cees; Schuur, A Gerlienke

    2014-01-01

    In the risk assessment of chemical substances, aggregation of exposure to a substance from different sources via different pathways is not common practice. Focusing the exposure assessment on a substance from a single source can lead to a significant underestimation of the risk. To gain more insight on how to perform an aggregate exposure assessment, we applied a deterministic (tier 1) and a person-oriented probabilistic approach (tier 2) for exposure to the four most common parabens through personal care products in children between 0 and 3 years old. Following a deterministic approach, a worst-case exposure estimate is calculated for methyl-, ethyl-, propyl- and butylparaben. As an illustration for risk assessment, Margins of Exposure (MoE) are calculated. These are 991 and 4966 for methyl- and ethylparaben, and 8 and 10 for propyl- and butylparaben, respectively. In tier 2, more detailed information on product use has been obtained from a small survey on product use of consumers. A probabilistic exposure assessment is performed to estimate the variability and uncertainty of exposure in a population. Results show that the internal exposure for each paraben is below the level determined in tier 1. However, for propyl- and butylparaben, the percentage of the population with an exposure probability above the assumed "safe" MoE of 100 is 13% and 7%, respectively. In conclusion, a tier 1 approach can be performed using simple equations and default point estimates, and serves as a starting point for exposure and risk assessment. If refinement is warranted, the more data-demanding person-oriented probabilistic approach should be used. This probabilistic approach results in a more realistic exposure estimate, including the uncertainty, and allows determining the main drivers of exposure. Furthermore, it allows estimating the percentage of the population for which the exposure is likely to be above a specific value. PMID:23801276

  12. Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.

    PubMed

    Wilhelm, C J; Mitchell, S H

    2008-10-01

    Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
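
    The discounting tasks mentioned here are typically summarized by fitting a hyperbolic discount function to indifference points. A minimal sketch with invented data, using Mazur's one-parameter form (for probability discounting, odds-against substitutes for delay):

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical indifference points: subjective value of a delayed reward
      # (as a fraction of its nominal amount) at several delays (s).
      delays = np.array([0, 2, 4, 8, 16, 32], dtype=float)
      indiff = np.array([1.00, 0.80, 0.62, 0.45, 0.28, 0.16])

      def hyperbolic(d, k):
          """Mazur's hyperbolic discounting: V = A / (1 + k*D), amount normalized to 1."""
          return 1.0 / (1.0 + k * d)

      (k_hat,), _ = curve_fit(hyperbolic, delays, indiff, p0=[0.1])
      print(f"fitted discount rate k = {k_hat:.3f}")
      # For probability discounting, replace delay D with odds-against theta = (1 - p) / p;
      # steeper fitted rates correspond to greater sensitivity to delay or uncertainty.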

  13. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CHROMATED COPPER ARSENATE (CCA)-TREATED PLAYSETS AND DECKS: PART 2 SENSITIVITY AND UNCERTAINTY ANALYSIS

    EPA Science Inventory

    A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...

  14. Assessment of food intake input distributions for use in probabilistic exposure assessments of food additives.

    PubMed

    Gilsenan, M B; Lambe, J; Gibney, M J

    2003-11-01

    A key component of a food chemical exposure assessment using probabilistic analysis is the selection of the most appropriate input distribution to represent exposure variables. The study explored the type of parametric distribution that could be used to model variability in food consumption data likely to be included in a probabilistic exposure assessment of food additives. The goodness-of-fit of a range of continuous distributions to observed data of 22 food categories expressed as average daily intakes among consumers from the North-South Ireland Food Consumption Survey was assessed using the BestFit distribution fitting program. The lognormal distribution was most commonly accepted as a plausible parametric distribution to represent food consumption data when food intakes were expressed as absolute intakes (16/22 foods) and as intakes per kg body weight (18/22 foods). Results from goodness-of-fit tests were accompanied by lognormal probability plots for a number of food categories. The influence on food additive intake of using a lognormal distribution to model food consumption input data was assessed by comparing modelled intake estimates with observed intakes. Results from the present study advise some level of caution about the use of a lognormal distribution as a mode of input for food consumption data in probabilistic food additive exposure assessments and the results highlight the need for further research in this area.
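
    A sketch of the distribution-fitting step the abstract describes, using scipy in place of the BestFit program (the intake data below are simulated stand-ins, not the Irish survey data):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Stand-in for observed average daily intakes among consumers of one food (g day^-1).
      intakes = rng.lognormal(np.log(120.0), 0.5, size=400)

      # Fit a lognormal with location fixed at zero, then screen goodness of fit.
      shape, loc, scale = stats.lognorm.fit(intakes, floc=0)
      ks = stats.kstest(intakes, "lognorm", args=(shape, loc, scale))
      print(f"GM = {scale:.0f} g/day, GSD = {np.exp(shape):.2f}, KS p-value = {ks.pvalue:.2f}")

      # The fitted distribution can then feed a probabilistic additive-intake simulation:
      simulated = stats.lognorm.rvs(shape, loc, scale, size=100_000, random_state=rng)
      print(f"simulated P95 intake: {np.percentile(simulated, 95):.0f} g/day")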

  15. A simulation-based probabilistic design method for arctic sea transport systems

    NASA Astrophysics Data System (ADS)

    Bergström, Martin; Erikstad, Stein Ove; Ehlers, Sören

    2016-12-01

    When designing an arctic cargo ship, it is necessary to consider multiple stochastic factors. This paper evaluates the merits of a simulation-based probabilistic design method specifically developed to deal with this challenge. The outcome of the paper indicates that the incorporation of simulations and probabilistic design parameters into the design process enables more informed design decisions. For instance, it enables the assessment of the stochastic transport capacity of an arctic ship, as well as of its long-term ice exposure that can be used to determine an appropriate level of ice-strengthening. The outcome of the paper also indicates that significant gains in transport system cost-efficiency can be obtained by extending the boundaries of the design task beyond the individual vessel. In the case of industrial shipping, this allows for instance the consideration of port-based cargo storage facilities allowing for temporary shortages in transport capacity and thus a reduction in the required fleet size / ship capacity.

  16. Modifications of exposure to ambient particulate matter: Tackling bias in using ambient concentration as surrogate with particle infiltration factor and ambient exposure factor.

    PubMed

    Shi, Shanshan; Chen, Chen; Zhao, Bin

    2017-01-01

    Numerous epidemiological studies explored health risks attributed to outdoor particle pollution. However, a number of these studies routinely utilized ambient concentration as a surrogate for personal exposure to ambient particles. This simplification ignored the difference between indoor and outdoor concentrations of outdoor-originated particles and may bias the estimate of particle-health associations. To avoid this bias, the particle infiltration factor (Finf), which describes the penetration of outdoor particles into the indoor environment, and the ambient exposure factor (α), which represents the fraction of outdoor particles people are truly exposed to, are utilized as modification factors for outdoor particle concentration. In this study, the probabilistic distributions of annually-averaged and seasonally-averaged Finf and α were assessed for residences and residents in Beijing. Finf of a single residence and α of an individual were estimated based on the mechanisms governing particle outdoor-to-indoor migration and human time-activity patterns. With this as the core deterministic model, probabilistic distributions of Finf and α were estimated via Monte Carlo simulation. Annually-averaged Finf of PM2.5 and PM10 for residences in Beijing tended to be log-normally distributed as lnN(-0.74,0.14) and lnN(-0.94,0.15), with geometric mean values of 0.47 and 0.39, respectively. Annually-averaged α of PM2.5 and PM10 for Beijing residents also tended to be log-normally distributed as lnN(-0.59,0.12) and lnN(-0.73,0.13), with geometric mean values of 0.55 and 0.48, respectively. As for seasonally-averaged results, Finf and α of PM2.5 and PM10 were largest in summer and smallest in winter. The clear difference between these modification factors and unity suggests that modifications of ambient particle concentration need to be considered in epidemiological studies to avoid misclassification of personal exposure to ambient particles. Moreover, considering the inter-individual difference of Finf and α may offer a new perspective on particle-health associations in further epidemiological studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
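
    The standard steady-state forms of these two factors make the Monte Carlo step easy to sketch (the input distributions below are invented illustrations, not the Beijing survey's):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Illustrative inputs (not the Beijing data):
      ach   = rng.lognormal(np.log(0.6), 0.5, n)      # air exchange rate, h^-1
      p     = rng.uniform(0.7, 1.0, n)                # particle penetration factor
      k_dep = rng.lognormal(np.log(0.2), 0.4, n)      # indoor deposition rate, h^-1

      f_inf = p * ach / (ach + k_dep)                 # steady-state infiltration factor

      t_in = rng.normal(0.85, 0.05, n).clip(0, 1)     # fraction of time spent indoors
      alpha = (1 - t_in) + t_in * f_inf               # ambient exposure factor

      for name, x in [("Finf", f_inf), ("alpha", alpha)]:
          gm, gsd = np.exp(np.mean(np.log(x))), np.exp(np.std(np.log(x)))
          print(f"{name}: GM = {gm:.2f}, GSD = {gsd:.2f}")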

  17. Exposure assessment of 3-monochloropropane-1,2-diol esters from edible oils and fats in China.

    PubMed

    Li, Chang; Nie, Shao-Ping; Zhou, Yong-Qiang; Xie, Ming-Yong

    2015-01-01

    3-monochloropropane-1,2-diol (3-MCPD) esters from edible oils are considered to be a possible risk factor for adverse effects in humans. In the present study, an exposure assessment of 3-MCPD esters for the Chinese population was performed. A total of 143 edible oil and fat samples collected from Chinese markets were analyzed for concentrations of 3-MCPD esters. The concentration data, together with data on fat consumption, were analyzed by point evaluation and probabilistic assessment for the exposure assessment. The point evaluation showed that the mean daily intake (DI) of 3-MCPD esters was lower than the provisional maximum tolerable daily intake (PMTDI) of 3-MCPD (2 µg/kg BW/d). The mean DI values in different age groups obtained from the probabilistic assessment were similar to the results of the point evaluation. However, at high percentiles (95th, 97.5th, 99th), the DI values in all age groups were undesirably higher than the PMTDI. Overall, children and adolescents were exposed to more 3-MCPD esters than adults. Uncertainty was also analyzed for the exposure assessment. Decreasing the level of 3-MCPD esters in edible oils and consuming less oil are the top priorities for minimizing the risk of 3-MCPD esters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
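
    A toy version of the strain-based combination step (the FE strain outputs and ultimate-strain distribution parameters are invented, and ribs are treated as independent, a simplification relative to the paper):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Peak strain predicted at each rib (hypothetical FE output, strain in %).
      peak_strain = np.array([0.6, 1.1, 1.8, 2.4, 1.5, 0.9, 0.4, 0.3])

      # Age-adjusted ultimate-strain distribution (assumed lognormal parameters).
      ult = stats.lognorm(s=0.45, scale=2.0)          # median ultimate strain 2.0%

      # Local fracture probability: chance a rib's capacity falls below its demand.
      p_fx = ult.cdf(peak_strain)

      # Whole-ribcage severity, treating ribs as independent (simplified):
      sims = rng.random((100_000, p_fx.size)) < p_fx
      n_fx = sims.sum(axis=1)
      for k in (1, 3):
          print(f"P(>= {k} fractures) = {np.mean(n_fx >= k):.3f}")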

  19. POPULATION EXPOSURES TO PARTICULATE MATTER: A COMPARISON OF EXPOSURE MODEL PREDICTIONS AND MEASUREMENT DATA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate models that use a probabilistic approach to predict population exposures to environmental ...

  20. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    PubMed

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers are getting more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software and to evaluate the impact of the inclusion/exclusion of indicators on market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweeteners' concentration of sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.

  1. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and resultant loss of income produce widespread default on payments. With increased computational power and more complete inventories of exposure, Monte Carlo methods may provide more accurate estimation of severe losses and the opportunity to increase resilience of vulnerable systems and communities.

  2. A POPULATION EXPOSURE MODEL FOR PARTICULATE MATTER: SHEDS-PM

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) has developed a population exposure and dose model for particulate matter (PM) that will be publicly available in Fall 2002. The Stochastic Human Exposure and Dose Simulation (SHEDS-PM) model uses a probabilistic approach ...

  3. Exposure Assessment Tools by Tiers and Types - Deterministic and Probabilistic Assessments

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  4. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as the dose-response curve, the integrated consideration of variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable risk or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336

  5. EXPOSURE TO PESTICIDES BY MEDIUM AND ROUTE: THE 90TH PERCENTILE AND RELATED UNCERTAINTIES

    EPA Science Inventory

    This study investigates distributions of exposure to chlorpyrifos and diazinon using the database generated in the state of Arizona by the National Human Exposure Assessment Survey (NHEXAS-AZ). Exposure to pesticide and associated uncertainties are estimated using probabilistic...

  6. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, observed body burden concentrations fall within the distributions simulated at both sites.

  7. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method of Splitting Tsunami), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  8. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    PubMed

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method of Splitting Tsunami), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  9. Stochastic Human Exposure and Dose Simulation Model for Pesticides

    EPA Science Inventory

    SHEDS-Pesticides (Stochastic Human Exposure and Dose Simulation Model for Pesticides) is a physically-based stochastic model developed to quantify exposure and dose of humans to multimedia, multipathway pollutants. Probabilistic inputs are combined in physical/mechanistic algorit...

  10. NEXT GENERATION MULTIMEDIA/MULTIPATHWAY EXPOSURE MODELING

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation model for pesticides (SHEDS-Pesticides) supports the efforts of EPA to better understand human exposures and doses to multimedia, multipathway pollutants. It is a physically-based, probabilistic computer model that predicts, for u...

  11. Probabilistic dietary exposure assessment taking into account variability in both amount and frequency of consumption.

    PubMed

    Slob, Wout

    2006-07-01

    Probabilistic dietary exposure assessments that are fully based on Monte Carlo sampling from the raw intake data may not be appropriate. This paper shows that the data should first be analysed by using a statistical model that is able to take the various dimensions of food consumption patterns into account. A (parametric) model is discussed that takes into account the interindividual variation in (daily) consumption frequencies, as well as in amounts consumed. Further, the model can be used to include covariates, such as age, sex, or other individual attributes. Some illustrative examples show how this model may be used to estimate the probability of exceeding an (acute or chronic) exposure limit. These results are compared with the results based on directly counting the fraction of observed intakes exceeding the limit value. This comparison shows that the latter method is not adequate, in particular for the acute exposure situation. A two-step approach for probabilistic (acute) exposure assessment is proposed: first analyse the consumption data by a (parametric) statistical model as discussed in this paper, and then use Monte Carlo techniques for combining the variation in concentrations with the variation in consumption (by sampling from the statistical model). This approach results in an estimate of the fraction of the population as a function of the fraction of days at which the exposure limit is exceeded by the individual.
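
    A minimal sketch of the proposed two-step approach, with assumed parametric forms (beta consumption frequencies, lognormal amounts and concentrations) standing in for distributions that would in practice be fitted to survey data:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=2)
    n_individuals, n_days = 10_000, 365

    # Step 1 -- assumed parametric forms, standing in for a model fitted to
    # survey data: each individual has a daily consumption probability and a
    # personal median amount; day-to-day amounts vary around that median.
    freq = rng.beta(2, 5, size=n_individuals)                        # P(consume on a day)
    median_amount = rng.lognormal(np.log(100), 0.5, n_individuals)   # g/day when consuming

    # Step 2 -- Monte Carlo over days, combining consumption variation with
    # (hypothetical) lognormal concentration variation.
    limit, bw = 10.0, 70.0          # exposure limit (ug/kg bw/day), body weight (kg)
    exceed_days = np.zeros(n_individuals)
    for _ in range(n_days):
        eats = rng.random(n_individuals) < freq
        amount = rng.lognormal(np.log(median_amount), 0.3) * eats    # g consumed today
        conc = rng.lognormal(np.log(5.0), 0.8, n_individuals)        # ug/g residue
        exceed_days += (amount * conc / bw) > limit

    # Fraction of the population vs fraction of days the limit is exceeded,
    # the summary the abstract proposes for acute exposure.
    frac_days = exceed_days / n_days
    for q in (0.50, 0.95, 0.99):
        print(f"{q:.0%} of individuals exceed the limit on "
              f"<= {np.quantile(frac_days, q):.2%} of days")
    ```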

  12. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
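
    A hedged sketch of the general computation behind a probabilistically derived HDMI: a point of departure is divided by lognormally distributed adjustment factors, and the resulting uncertainty distribution is summarized by a confidence interval. All values below are illustrative placeholders, not the paper's calibrated distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=3)
    n = 100_000

    # Hypothetical point of departure from an animal study (mg/kg-day),
    # with its own estimation uncertainty.
    pod = rng.lognormal(np.log(10.0), 0.3, n)

    # Lognormal uncertainty distributions for the adjustment factors
    # (illustrative medians and spreads, not the paper's values):
    af_inter = rng.lognormal(np.log(4.0), 0.4, n)   # animal-to-human
    af_intra = rng.lognormal(np.log(9.0), 0.5, n)   # human variability at incidence I

    hd_mi = pod / (af_inter * af_intra)             # target human dose, HDMI

    lo, med, hi = np.quantile(hd_mi, [0.05, 0.50, 0.95])
    print(f"HDMI 90% CI: {lo:.3f}-{hi:.3f} mg/kg-day (median {med:.3f}); "
          f"fold range {hi / lo:.1f}x")
    ```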

  13. PREDICTING POPULATION EXPOSURES TO PM: THE IMPORTANCE OF MICROENVIRONMENTAL CONCENTRATIONS AND HUMAN ACTIVITIES

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation (SHEDS) models being developed by the US EPA/NERL use a probabilistic approach to predict population exposures to pollutants. The SHEDS model for particulate matter (SHEDS-PM) estimates the population distribution of PM exposure...

  14. THE CONTRIBUTION OF AMBIENT PM2.5 TO TOTAL PERSONAL EXPOSURES: RESULTS FROM A POPULATION EXPOSURE MODEL FOR PHILADELPHIA, PA

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate population exposure modules that use a probabilistic approach to predict population exposu...

  15. SHEDS-PM: A POPULATION EXPOSURE MODEL FOR PREDICTING DISTRIBUTIONS OF PM EXPOSURE AND DOSE FROM BOTH OUTDOOR AND INDOOR SOURCES

    EPA Science Inventory

    The US EPA National Exposure Research Laboratory (NERL) has developed a population exposure and dose model for particulate matter (PM), called the Stochastic Human Exposure and Dose Simulation (SHEDS) model. SHEDS-PM uses a probabilistic approach that incorporates both variabi...

  16. MODELING ENVIRONMENTAL EXPOSURES TO PARTICULATE MATTER AND PESTICIDES

    EPA Science Inventory

    This presentation describes initial results from ongoing research at EPA on modeling human exposures to particulate matter and residential pesticides. A first generation probabilistic population exposure model for Particulate Matter (PM), specifically for predicting PM10 and P...

  17. INTEGRATED PROBABILISTIC AND DETERMINISTIC MODELING TECHNIQUES IN ESTIMATING EXPOSURE TO WATER-BORNE CONTAMINANTS: PART 2 PHARMACOKINETIC MODELING

    EPA Science Inventory

    The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing the daily activities of eating, drinking, showering, and bathing. Two hundred fifty time histories, by subject with activities, were generated for the three exposure ro...

  18. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Chiu, Weihsueh A; Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  19. MODELING HUMAN EXPOSURES AND DOSE USING A 2-DIMENSIONAL MONTE-CARLO MODEL (SHEDS)

    EPA Science Inventory

    Since 1998, US EPA's National Exposure Research Laboratory (NERL) has been developing the Stochastic Human Exposure and Dose Simulation (SHEDS) model for various classes of pollutants. SHEDS is a physically-based probabilistic model intended for improving estimates of human ex...
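
    The "2-dimensional" structure refers to nested sampling loops that keep uncertainty and variability separate. A minimal sketch with hypothetical distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=4)
    n_uncertainty, n_variability = 200, 5_000

    pop_p95 = np.empty(n_uncertainty)
    for u in range(n_uncertainty):
        # Outer (uncertainty) loop: one realization of the uncertain inputs
        # (hypothetical: geometric mean and gsd of an exposure factor).
        gm = rng.lognormal(np.log(2.0), 0.2)
        gsd = rng.uniform(1.5, 2.5)
        # Inner (variability) loop: individuals simulated given those inputs.
        exposures = rng.lognormal(np.log(gm), np.log(gsd), n_variability)
        pop_p95[u] = np.quantile(exposures, 0.95)

    print("population 95th-percentile exposure:",
          f"median {np.median(pop_p95):.2f},",
          f"90% CI {np.quantile(pop_p95, 0.05):.2f}-{np.quantile(pop_p95, 0.95):.2f}")
    ```

    The inner loop gives the population distribution for one plausible world; the outer loop gives the uncertainty about any summary of it, such as the 95th percentile.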

  20. High Throughput Exposure Prioritization of Chemicals Using a Screening-Level Probabilistic SHEDS-Lite Exposure Model

    EPA Science Inventory

    These novel modeling approaches for screening, evaluating and classifying chemicals based on the potential for biologically-relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. The new modeling approach is derived from the Stocha...

  1. MODELING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN

    EPA Science Inventory

    To help address the aggregate exposure assessment needs of the Food Quality Protection Act, a physically-based probabilistic model (SHEDS-Pesticides, version 3) has been applied to estimate aggregate chlorpyrifos exposure and dose to children. Two age groups (0-4, 5-9 years) a...

  2. USEPA SHEDS MODEL: METHODOLOGY FOR EXPOSURE ASSESSMENT FOR WOOD PRESERVATIVES

    EPA Science Inventory

    A physically-based, Monte Carlo probabilistic model (SHEDS-Wood: Stochastic Human Exposure and Dose Simulation model for wood preservatives) has been applied to assess the exposure and dose of children to arsenic (As) and chromium (Cr) from contact with chromated copper arsenat...

  3. PM POPULATION EXPOSURE AND DOSE MODELS

    EPA Science Inventory

    The overall objective of this study is the development of a refined probabilistic exposure and dose model for particulate matter (PM) suitable for predicting PM10 and PM2.5 population exposures. This modeling research will be conducted both in-house by EPA scientists and through...

  4. MODELED ESTIMATES OF CHLORPYRIFOS EXPOSURE AND DOSE FOR THE MINNESOTA AND ARIZONA NHEXAS POPULATIONS

    EPA Science Inventory

    This paper presents a probabilistic, multimedia, multipathway exposure model and assessment for chlorpyrifos developed as part of the National Human Exposure Assessment Survey (NHEXAS). The model was constructed using available information prior to completion of the NHEXAS stu...

  5. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  6. SHEDS-HT: An Integrated Probabilistic Exposure Model for Prioritizing Exposures to Chemicals with Near-Field and Dietary Sources

    EPA Science Inventory

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...

  7. FURTHER REFINEMENTS AND TESTING OF APEX3.0: EPA'S POPULATION EXPOSURE MODEL FOR CRITERIA AND AIR TOXIC INHALATION

    EPA Science Inventory

    The Air Pollutants Exposure Model (APEX 3.0) is a PC-based model that was derived from the probabilistic NAAQS Exposure Model for carbon monoxide (pNEM/CO). APEX will be one of the tools used to estimate human population exposure for criteria and air toxic pollutants as part ...

  8. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Dietary Module Version 1: Technical Manual

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  9. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method that accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
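
    One common reading of the JPC method computes delta as the expected fraction of species affected when the exposure distribution is integrated against the species sensitivity distribution. A sketch under that reading, with illustrative lognormal parameters (not the Penrhyn Estuary values):

    ```python
    import numpy as np
    from scipy.stats import lognorm

    # Hypothetical lognormal exposure and species-sensitivity distributions.
    exp_mu, exp_sigma = np.log(5.0), 1.0     # exposure concentration, ug/L
    ssd_mu, ssd_sigma = np.log(50.0), 0.8    # species effect concentrations, ug/L

    rng = np.random.default_rng(seed=5)
    conc = rng.lognormal(exp_mu, exp_sigma, 100_000)

    # Fraction of species affected at each sampled exposure concentration,
    # read off the sensitivity distribution's CDF; its mean is one common
    # joint-probability summary of risk (delta).
    frac_affected = lognorm.cdf(conc, s=ssd_sigma, scale=np.exp(ssd_mu))
    delta = frac_affected.mean()
    print(f"delta = {delta:.1%}")
    ```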

  10. Probabilistic Assessment of Cancer Risk for Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2009-01-01

    During future lunar missions, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-Moon transit. NASA's new lunar program anticipates that up to 15% of crew time may be on EVA, with minimal radiation shielding. Given the operational challenge of responding to events of unknown size and duration, a probabilistic risk assessment approach is essential for mission planning and design. Using the historical database of proton measurements during the past 5 solar cycles, a typical hazard function for SPE occurrence was defined using a non-homogeneous Poisson model as a function of time within a non-specific future solar cycle of 4000 days' duration. Distributions ranging from the 5th to the 95th percentile of particle fluences for a specified mission period were simulated. Organ doses corresponding to particle fluences at the median and at the 95th percentile for a specified mission period were assessed using NASA's baryon transport model, BRYNTRN. The cancer fatality risks for astronauts as functions of age, gender, and solar cycle activity were then analyzed. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated. Future work will involve using this probabilistic risk assessment approach for SPE forecasting, combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
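
    A sketch of the occurrence side of such an assessment: with a non-homogeneous Poisson model, the SPE count in any mission window is Poisson-distributed with mean equal to the integral of the hazard function over the window. The hazard shape below is hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=6)
    cycle_days = 4000

    # Hypothetical hazard function lambda(t): SPE rate peaking mid-cycle
    # (around solar maximum); shape and scale are illustrative only.
    t = np.arange(cycle_days)
    lam = 0.01 * np.exp(-0.5 * ((t - 2000) / 700) ** 2)   # events per day

    def simulate_mission(start, duration, n_sims=100_000):
        """SPE counts in a mission window: Poisson with mean = integral of lambda."""
        expected = lam[start:start + duration].sum()
        return rng.poisson(expected, n_sims)

    counts = simulate_mission(start=1800, duration=180)
    print("P(at least one SPE during a 180-day mission):", (counts > 0).mean())
    print("Mean SPE count:", counts.mean())
    ```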

  11. Probabilistic risk assessment of exposure to leucomalachite green residues from fish products.

    PubMed

    Chu, Yung-Lin; Chimeddulam, Dalaijamts; Sheen, Lee-Yan; Wu, Kuen-Yuh

    2013-12-01

    To assess the potential risk of human exposure to carcinogenic leucomalachite green (LMG) due to fish consumption, a probabilistic risk assessment was conducted for adolescent, adult and senior adult consumers in Taiwan. The residues of LMG, with a mean concentration of 13.378±20.56 μg kg(-1) (BFDA, 2009) in fish, were converted into dose, considering the fish intake reported for the three consumer groups by NAHSIT (1993-1996) and the body weight of an average individual in each group. The lifetime average and high 95th percentile dietary intakes of LMG from fish consumption for Taiwanese consumers were estimated at up to 0.0135 and 0.0451 μg kg-bw(-1) day(-1), respectively. A human equivalent dose (HED) of 2.875 mg kg-bw(-1) day(-1), obtained from a lower-bound benchmark dose (BMDL10) in mice by interspecies extrapolation, was linearly extrapolated to an oral cancer slope factor (CSF) of 0.035 (mg kg-bw(-1) day(-1))(-1) for humans. Although the assumptions and methods differ, the results for lifetime cancer risk, varying from 3×10(-7) to 1.6×10(-6), were comparable to the margins of exposure (MOEs), which varied from 410,000 to 4,800,000. In conclusion, Taiwanese fish consumers at the 95th percentile LADD of LMG have a greater risk of liver cancer, and risk management action is needed in Taiwan. Copyright © 2013 Elsevier Ltd. All rights reserved.
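
    The dose conversion and linear extrapolation described above follow standard formulas; a worked sketch using the abstract's reported residue concentration and slope factor, with placeholder intake and body weight (the NAHSIT figures are not reproduced here):

    ```python
    # Standard dietary-dose and linear cancer-risk arithmetic. The residue
    # concentration and slope factor are the abstract's reported values; the
    # intake rate and body weight are illustrative placeholders, not the
    # NAHSIT consumption figures.
    conc = 13.378        # mean LMG residue in fish, ug/kg
    intake = 0.070       # fish intake, kg/day (placeholder)
    bw = 60.0            # body weight, kg (placeholder)

    ladd = conc * intake / bw       # lifetime average daily dose, ug/kg-bw/day
    csf = 0.035                     # cancer slope factor, (mg/kg-bw/day)^-1
    risk = csf * ladd / 1000.0      # ug -> mg before applying the CSF
    print(f"LADD = {ladd:.4f} ug/kg-bw/day; lifetime cancer risk = {risk:.1e}")
    ```

    With these placeholder values the result (about 5×10(-7)) falls within the 3×10(-7) to 1.6×10(-6) range reported above.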

  12. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Residential Module Version 4: User Guide, June 2012

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  13. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Residential Module Version 4: Technical Manual, May 2012

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  14. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Dietary Module Version 1: User Guide, June 2012

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  15. Developing an Event-Tree Probabilistic Tsunami Inundation Model for NE Atlantic Coasts: Application to a Case Study

    NASA Astrophysics Data System (ADS)

    Omira, R.; Matias, L.; Baptista, M. A.

    2016-12-01

    This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach consists of an event-tree method that combines probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines, located on the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that the maximum wave amplitude exceeds 1 m somewhere along the Sines coast reaches about 60% for an exposure time of 100 years and is up to 97% for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10% and 57%, and from 20% up to 95%, for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
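
    The exposure-time probabilities quoted above are consistent with a Poisson occurrence model, in which the probability of at least one exceedance in T years at annual rate lambda is 1 - exp(-lambda*T). A sketch with an illustrative rate:

    ```python
    import numpy as np

    # Under a Poisson occurrence model, P(at least one exceedance in T years)
    # = 1 - exp(-lam * T) for annual exceedance rate lam. The rate below is
    # illustrative; it roughly reproduces the probabilities quoted above.
    lam = 0.009
    for T in (100, 500):
        print(f"T = {T} yr: P = {1 - np.exp(-lam * T):.0%}")
    ```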

  16. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach consisting of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure that causes the mass expulsion at low probabil

  17. Application of Probabilistic Modeling to Quantify the Reduction Levels of Hepatocellular Carcinoma Risk Attributable to Chronic Aflatoxins Exposure.

    PubMed

    Wambui, Joseph M; Karuri, Edward G; Ojiambo, Julia A; Njage, Patrick M K

    2017-01-01

    Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus infection further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive probability distributions of HCC cases and percentage reduction levels of the risk from secondary data. Contaminated maize and groundnuts contributed 1,847 ± 514 and 158 ± 52 HCC cases per annum, respectively. The total contribution of both foods to the risk was additive, resulting in 2,000 ± 518 cases per annum. Consumption and contamination levels contributed significantly to the risk, with lower age groups most affected. Nonetheless, pre- and postharvest interventions might reduce the risk by 23.0-83.4% and 4.8-95.1%, respectively. Therefore, chronic exposure to aflatoxins increases the HCC risk in rural Kenya, but a significant reduction of the risk can be achieved by applying specific pre- and postharvest interventions.

  18. Accounting for pH heterogeneity and variability in modelling human health risks from cadmium in contaminated land.

    PubMed

    Gay, J Rebecca; Korre, Anna

    2009-07-01

    The authors have previously published a methodology which combines quantitative probabilistic human health risk assessment and spatial statistical methods (geostatistics) to produce an assessment, incorporating uncertainty, of risks to human health from exposure to contaminated land. The model assumes a constant soil-to-plant concentration factor (CF(veg)) when calculating intake of contaminants. This model is modified here to enhance its use in situations where CF(veg) varies according to soil pH, as is the case for cadmium. The original methodology uses sequential indicator simulation (SIS) to map soil concentration estimates for one contaminant across a site. A real, age-stratified population is mapped across the contaminated area, and intake of soil contaminants by individuals is calculated probabilistically using an adaptation of the Contaminated Land Exposure Assessment (CLEA) model. The proposed improvement involves geostatistical estimation not only of the contaminant concentration but also of soil pH, which in turn leads to a variable CF(veg) estimate that influences the human intake results. The results presented demonstrate that taking pH into account can greatly influence the outcome of the risk assessment. A similar adaptation could be used for other combinations of soil variables that influence CF(veg).

  19. Water body and riparian buffer strip characteristics in a vineyard area to support aquatic pesticide exposure assessment.

    PubMed

    Ohliger, Renja; Schulz, Ralf

    2010-10-15

    The implementation of a geodata-based probabilistic pesticide exposure assessment for surface waters in Germany offers the opportunity to base the exposure estimation on more differentiated assumptions, including detailed landscape characteristics. Since these characteristics can only be estimated using field surveys, water body width and depth, hydrology, riparian buffer strip width, ground vegetation cover, existence of concentrated flow paths, and riparian vegetation were characterised at 104 water body segments in the vineyard region Palatinate (south-west Germany). Water body segments classified as permanent (n=43) had median values of water body width and depth of 0.9 m and 0.06 m, respectively, and the determined median width:depth ratio was 15. Thus, the deterministic water body model (width = 1 m; depth = 0.3 m) assumed in regulatory exposure assessment seems unsuitable for small water bodies in the study area. Only 25% of investigated buffer strips had a dense vegetation cover (>70%) and allow the laminar sheet flow required to include them as an effective pesticide runoff reduction landscape characteristic. At 77 buffer strips, bordering field paths and erosion rills leading into the water body were present, concentrating pesticide runoff and consequently decreasing buffer strip efficiency. The vegetation type shrubbery (height >1.5 m) was present at 57 (29%) of the investigated riparian buffer strips. According to their median optical vegetation density of 75%, shrubberies may provide a spray drift reduction of 72±29%. Implementing detailed knowledge in an overall assessment revealed that exposure via drift might be 2.4-fold, and via runoff up to 1.6-fold, higher than assumed by the deterministic approach. Furthermore, considering vegetated buffer strips only by their width leads to an underestimation of exposure by a factor of as much as four. Our data highlight that the deterministic model assumptions represent neither worst-case nor median values and therefore cannot simply be adopted in a probabilistic approach. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. PBDE exposure from food in Ireland: optimising data exploitation in probabilistic exposure modelling.

    PubMed

    Trudel, David; Tlustos, Christina; Von Goetz, Natalie; Scheringer, Martin; Hungerbühler, Konrad

    2011-01-01

    Polybrominated diphenyl ethers (PBDEs) are a class of brominated flame retardants added to plastics, polyurethane foam, electronics, textiles, and other products. These products release PBDEs into the indoor and outdoor environment, thus causing human exposure through food and dust. This study models PBDE dose distributions from ingestion of food for Irish adults on a congener basis, using two probabilistic methods and one semi-deterministic method. One of the probabilistic methods was newly developed and is based on summary statistics of food consumption combined with a model generating realistic daily energy supply from food. Median (intermediate) doses of total PBDEs are in the range of 0.4-0.6 ng/kg(bw)/day for Irish adults. The 97.5th percentiles of total PBDE doses lie in a range of 1.7-2.2 ng/kg(bw)/day, which is comparable to doses derived for Belgian and Dutch adults. BDE-47 and BDE-99 were identified as the congeners contributing most to estimated intakes, accounting for more than half of the total doses. The most influential food groups contributing to this intake are lean fish and salmon, which together account for about 22-25% of the total doses.

  1. Probabilistic assessment of wildfire hazard and municipal watershed exposure

    Treesearch

    Joe Scott; Don Helmbrecht; Matthew P. Thompson; David E. Calkin; Kate Marcille

    2012-01-01

    The occurrence of wildfires within municipal watersheds can result in significant impacts to water quality and ultimately human health and safety. In this paper, we illustrate the application of geospatial analysis and burn probability modeling to assess the exposure of municipal watersheds to wildfire. Our assessment of wildfire exposure consists of two primary...

  2. NORTH AMERICAN FREE TRADE AGREEMENT (NAFTA) BORDER PROJECT - A COMPARISON OF THE ARIZONA BORDER POPULATION WITH THE STATE POPULATION

    EPA Science Inventory

    There is a perception among the population of the border communities that they have increased exposure due to their proximity to pollution sources in Mexico. This study provides exposure data for the border population that will be compared with data from a probabilistic exposure...

  3. Other Perspectives for Developing Exposure Estimates: “SHEDS-Lite: Rapid Scenario-Based Exposure Predictions for Chemicals with Near-Field and Dietary Pathways”

    EPA Science Inventory

    Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. This presentation will describe the development of EPA’s screening-level, probabilistic SHEDS-Li...

  4. Cumulative dietary exposure to a selected group of pesticides of the triazole group in different European countries according to the EFSA guidance on probabilistic modelling.

    PubMed

    Boon, Polly E; van Donkersgoed, Gerda; Christodoulou, Despo; Crépet, Amélie; D'Addezio, Laura; Desvignes, Virginie; Ericsson, Bengt-Göran; Galimberti, Francesco; Ioannou-Kakouri, Eleni; Jensen, Bodil Hamborg; Rehurkova, Irena; Rety, Josselin; Ruprich, Jiri; Sand, Salomon; Stephenson, Claire; Strömberg, Anita; Turrini, Aida; van der Voet, Hilko; Ziegler, Popi; Hamey, Paul; van Klaveren, Jacob D

    2015-05-01

    The practicality of performing a cumulative dietary exposure assessment according to the requirements of the EFSA guidance on probabilistic modelling was examined. For this, the acute and chronic cumulative exposure to triazole pesticides was estimated using national food consumption and monitoring data from eight European countries. Both the acute and chronic cumulative dietary exposures were calculated according to two model runs (optimistic and pessimistic) as recommended in the EFSA guidance. The exposures obtained with these model runs differed substantially for all countries, with the highest exposures obtained with the pessimistic model run. In this model run, animal commodities, including cattle milk and different meat types, entered the exposure calculations at the level of the maximum residue limit (MRL) and contributed most to the exposure. We conclude that application of the optimistic model run on a routine basis for cumulative assessments is feasible. The pessimistic model run is laborious, and the exposure results could be too far from reality. More experience with this approach is needed to stimulate discussion of the feasibility of all the requirements, especially the inclusion of MRLs for animal commodities, which seems to result in unrealistic conclusions regarding their contribution to the dietary exposure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Assessment of Coastal Communities' Vulnerability to Hurricane Surge under Climate Change via Probabilistic Map - A Case Study of the Southwest Coast of Florida

    NASA Astrophysics Data System (ADS)

    Feng, X.; Shen, S.

    2014-12-01

    The US coastline has, over the past few years, been overwhelmed by major storms, including Hurricanes Katrina (2005), Ike (2008), Irene (2011), and Sandy (2012). Supported by a growing and extensive body of evidence, a majority of research agrees that hurricane activity has been enhanced by climate change. However, the precise prediction of hurricane-induced inundation remains a challenge. This study proposes a probabilistic inundation map based on a Statistically Modeled Storm Database (SMSD) to assess the probabilistic coastal inundation risk of Southwest Florida for a near-future (20-year) scenario considering climate change. This map was processed through a Joint Probability Method with Optimal Sampling (JPM-OS), developed by Condon and Sheng in 2012, accompanied by the high-resolution storm surge modeling system CH3D-SSMS. The probabilistic inundation map shows a 25.5-31.2% increase in spatially averaged inundation height compared with an inundation map for the present-day scenario. To estimate climate change impacts on coastal communities, socioeconomic analyses were conducted using both the SMSD-based probabilistic inundation map and the present-day inundation map. Combined with 2010 census data and 2012 parcel data from the Florida Geographic Data Library, the differences in economic loss between the near-future and present-day scenarios were used to generate an economic exposure map at the census block group level to reflect coastal communities' exposure to climate change. The results show that climate-change-induced inundation increase has significant economic impacts. Moreover, the impacts are not equally distributed among different social groups, considering their social vulnerability to hazards. Social vulnerability indices at the census block group level were obtained from the Hazards and Vulnerability Research Institute. The demographic and economic variables in the index represent a community's adaptability to hazards. Local Moran's I was calculated to identify clusters of highly exposed and vulnerable communities. The economic-exposure cluster map was overlaid with the social-vulnerability cluster map to identify communities with low adaptive capability but high exposure. The result provides decision makers an intuitive tool to identify the most susceptible communities for adaptation.

  6. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Dietary Module Version 1: Quick Start Guide, May 2012

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  7. PROcEED: Probabilistic reverse dosimetry approaches for estimating exposure distributions

    EPA Science Inventory

    As increasing amounts of biomonitoring survey data become available, a new discipline focused on converting such data into estimates of chemical exposures has developed. Reverse dosimetry uses a pharmacokinetic model along with measured biomarker concentrations to determine the p...

  8. The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals: Residential Module Version 4: Quick Start Guide, April 2012

    EPA Pesticide Factsheets

    SHEDS-Multimedia is EPA's premier physically-based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.

  9. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    Oil pipeline networks are among the most important facilities for energy transportation, but accidents in these networks may result in serious disasters. Analysis models for such accidents have been established mainly using three methods: event trees, accident simulation, and Bayesian networks. Among these methods, Bayesian networks are suitable for probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.

  10. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford Kuofei

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
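
    A much-simplified sketch of the Monte Carlo structure described: parallel penetration routes whose uncertain permeabilities are sampled and whose fluxes are summed. A steady-state form is used here for brevity, whereas the actual model is transient and three-phase; all parameters are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=9)
    n = 50_000
    conc = 1.0  # chemical concentration at the skin surface, mg/cm^3 (hypothetical)

    # Hypothetical lognormal permeability coefficients (cm/h) and fractional
    # areas for the three parallel routes named above; steady-state Fickian
    # flux J = Kp * C per route, area-weighted and summed.
    routes = {
        "stratum corneum": (rng.lognormal(np.log(1e-3), 0.7, n), 0.990),
        "sweat ducts":     (rng.lognormal(np.log(5e-3), 0.7, n), 0.005),
        "hair follicles":  (rng.lognormal(np.log(2e-3), 0.7, n), 0.005),
    }
    total_flux = sum(kp * conc * area for kp, area in routes.values())  # mg/cm^2/h

    print(f"median flux {np.median(total_flux):.2e} mg/cm^2/h, "
          f"95th percentile {np.quantile(total_flux, 0.95):.2e}")
    # Regressing the sampled Kp values against total_flux (e.g., stepwise or
    # rank regression) would identify the dominant route parameters, mirroring
    # the sensitivity analysis described in the abstract.
    ```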

  11. SHEDS-HT: An Integrated Probabilistic Exposure Model for ...

    EPA Pesticide Factsheets

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirec

  12. A strategy for understanding noise-induced annoyance

    NASA Astrophysics Data System (ADS)

    Fidell, S.; Green, D. M.; Schultz, T. J.; Pearsons, K. S.

    1988-08-01

    This report provides a rationale for development of a systematic approach to understanding noise-induced annoyance. Two quantitative models are developed to explain: (1) the prevalence of annoyance due to residential exposure to community noise sources; and (2) the intrusiveness of individual noise events. Both models deal explicitly with the probabilistic nature of annoyance, and assign clear roles to acoustic and nonacoustic determinants of annoyance. The former model provides a theoretical foundation for empirical dosage-effect relationships between noise exposure and community response, while the latter model differentiates between the direct and immediate annoyance of noise intrusions and response bias factors that influence the reporting of annoyance. The assumptions of both models are identified, and the nature of the experimentation necessary to test hypotheses derived from the models is described.

  13. APPLICATION AND EVALUATION OF AN AGGREGATE PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL FOR QUANTIFYING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS

    EPA Science Inventory

    Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...

  14. Probabilistic Modeling of Dietary Arsenic Exposure and Dose and Evaluation with 2003-2004 NHANES Data

    EPA Science Inventory

    Dietary exposure to toxic inorganic arsenic (iAs) from food in the general U.S. population has not been well studied. The goal of this research was to quantify dietary arsenic (As) exposure and analyze the major contributors to total As (tAs) and iAs. Another objective was to com...

  15. Probabilistic simulation of the human factor in structural reliability

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1993-01-01

    A formal approach is described in an attempt to computationally simulate the probable ranges of uncertainties of the human factor in structural probabilistic assessments. A multi-factor interaction equation (MFIE) model has been adopted for this purpose. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are considered to demonstrate the concept. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Probabilistic sensitivity studies are subsequently performed to assess the suitability of the MFIE and the validity of the whole approach. Results obtained show that the uncertainties for no error range from five to thirty percent for the most optimistic case.
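
    For reference, the MFIE is commonly written as a product of normalized factor terms; a hedged rendering of that general form (the report's notation may differ) is:

    ```latex
    % Commonly cited general MFIE form (notation may differ from the report):
    \frac{P}{P_0} \;=\; \prod_{i=1}^{n}
      \left( \frac{A_{i,f} - A_i}{A_{i,f} - A_{i,0}} \right)^{a_i}
    ```

    Here P/P0 is the predicted-to-reference property ratio, A_i is the current value of factor i, A_{i,0} and A_{i,f} are its reference and limit values, and a_i is an empirical exponent fitted for each factor.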

  16. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans, with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver, were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources, including the current effort, even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.

  17. Aggregate exposure modelling of zinc pyrithione in rinse-off personal cleansing products using a person-orientated approach with market share refinement.

    PubMed

    Tozer, Sarah A; Kelly, Seamus; O'Mahony, Cian; Daly, E J; Nash, J F

    2015-09-01

    Realistic estimates of chemical aggregate exposure are needed to ensure consumer safety. As exposure estimates are a critical part of the equation used to calculate acceptable "safe levels" and conduct quantitative risk assessments, methods are needed to produce realistic exposure estimations. To this end, a probabilistic aggregate exposure model was developed to estimate consumer exposure from several rinse-off personal cleansing products containing the anti-dandruff preservative zinc pyrithione (ZnPT). The model incorporates large habits-and-practices surveys containing data on frequency of use, amount applied, and co-use, along with market share, and combines these data at the level of the individual, based on subject demographics, to better estimate exposure. The daily applied exposure (i.e., amount applied to the skin) was 3.79 mg/kg/day for the 95th percentile consumer. The estimated internal dose for the 95th percentile exposure ranged from 0.01-1.29 μg/kg/day after accounting for retention following rinsing and dermal penetration of ZnPT. This probabilistic aggregate exposure model can be used in the human safety assessment of ingredients in multiple rinse-off technologies (e.g., shampoo, bar soap, body wash, and liquid hand soap). In addition, this model may be used in other situations where refined exposure assessment is required to support a chemical risk assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
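
    A minimal person-oriented sketch of such an aggregate model: each simulated individual draws product use (gated by market share and co-use), per-use amounts, and body weight, and applied doses are summed across products. All parameter values are hypothetical, not the surveyed habits-and-practices data:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=11)
    n = 100_000
    bw = rng.normal(70.0, 12.0, n).clip(40.0, 120.0)   # body weight, kg (hypothetical)

    # Hypothetical product parameters: (use probability reflecting market
    # share and co-use, uses/day, grams/use, ingredient weight fraction).
    products = [
        ("shampoo",   0.30, 1.0, 10.0, 0.010),
        ("body wash", 0.15, 1.0,  8.0, 0.005),
        ("bar soap",  0.10, 2.0,  3.0, 0.005),
    ]

    applied = np.zeros(n)  # applied dose, mg/kg/day
    for _name, p_user, uses, grams, frac in products:
        user = rng.random(n) < p_user                         # market-share gate
        amount = rng.lognormal(np.log(grams), 0.4, n)         # per-use amount, g
        applied += user * uses * amount * frac * 1000.0 / bw  # g -> mg

    retention, penetration = 0.01, 0.02  # rinse-off retention, dermal penetration (hypothetical)
    internal = applied * retention * penetration
    print(f"applied P95 = {np.quantile(applied, 0.95):.2f} mg/kg/day; "
          f"internal P95 = {np.quantile(internal, 0.95) * 1000.0:.2f} ug/kg/day")
    ```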

  18. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  19. A practical approach to assess depression risk and to guide risk reduction strategies in later life.

    PubMed

    Almeida, Osvaldo P; Alfonso, Helman; Pirkis, Jane; Kerse, Ngaire; Sim, Moira; Flicker, Leon; Snowdon, John; Draper, Brian; Byrne, Gerard; Goldney, Robert; Lautenschlager, Nicola T; Stocks, Nigel; Scazufca, Marcia; Huisman, Martijn; Araya, Ricardo; Pfaff, Jon

    2011-03-01

    Many factors have been associated with the onset and maintenance of depressive symptoms in later life, although this knowledge is yet to be translated into significant health gains for the population. This study gathered information about common modifiable and non-modifiable risk factors for depression with the aim of developing a practical probabilistic model of depression that can be used to guide risk reduction strategies. A cross-sectional study was undertaken of 20,677 community-dwelling Australians aged 60 years or over in contact with their general practitioner during the preceding 12 months. Prevalent depression (minor or major) according to the Patient Health Questionnaire (PHQ-9) assessment was the main outcome of interest. Other measured exposures included self-reported age, gender, education, loss of mother or father before age 15 years, physical or sexual abuse before age 15 years, marital status, financial stress, social support, smoking and alcohol use, physical activity, obesity, diabetes, hypertension, and prevalent cardiovascular diseases, chronic respiratory diseases and cancer. The mean age of participants was 71.7 ± 7.6 years and 57.9% were women. Depression was present in 1665 (8.0%) of our subjects. Multivariate logistic regression showed depression was independently associated with age older than 75 years, childhood adverse experiences, adverse lifestyle practices (smoking, risk alcohol use, physical inactivity), intermediate health hazards (obesity, diabetes and hypertension), comorbid medical conditions (clinical history of coronary heart disease, stroke, asthma, chronic obstructive pulmonary disease, emphysema or cancers), and social or financial strain. We stratified the exposures to build a matrix that showed that the probability of depression increased progressively with the accumulation of risk factors, from less than 3% for those with no adverse factors to more than 80% for people reporting the maximum number of risk factors. Our probabilistic matrix can be used to estimate depression risk and to guide the introduction of risk reduction strategies. Future studies should now aim to clarify whether interventions designed to mitigate the impact of risk factors can change the prevalence and incidence of depression in later life.
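
    The reported endpoints (under 3% with no risk factors, over 80% with the maximum number) are well approximated by a logistic curve in the number of accumulated factors; a sketch with hypothetical coefficients chosen only to reproduce those endpoints:

    ```python
    import numpy as np

    # Hypothetical logistic approximation to the matrix described above;
    # coefficients are chosen only to reproduce the reported endpoints
    # (under 3% with no risk factors, over 80% with the maximum number).
    def p_depression(k, beta0=-3.9, beta=0.45):
        """Probability of prevalent depression given k accumulated risk factors."""
        return 1.0 / (1.0 + np.exp(-(beta0 + beta * k)))

    for k in (0, 4, 8, 12):
        print(f"{k:2d} risk factors -> P(depression) = {p_depression(k):.1%}")
    ```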

  20. Probabilistic estimation of residential air exchange rates for ...

    EPA Pesticide Factsheets

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER measurements. An algorithm for probabilistically estimating AER was developed based on the Lawrence Berkeley National Laboratory Infiltration model, utilizing housing characteristics and meteorological data with adjustment for window-opening behavior. The algorithm was evaluated by comparing modeled and measured AERs in four US cities (Los Angeles, CA; Detroit, MI; Elizabeth, NJ; and Houston, TX), inputting study-specific data. The impact on the modeled AER of using publicly available housing data representative of the region for each city was also assessed. Finally, modeled AERs based on region-specific inputs were compared with those estimated using literature-based distributions. While modeled AERs were similar in magnitude to the measured AERs, they were consistently lower for all cities except Houston. AERs estimated using region-specific inputs were lower than those using study-specific inputs due to differences in window-opening probabilities. The algorithm produced more spatially and temporally variable AERs compared with literature-based distributions, reflecting within- and between-city differences and helping reduce error in estimates of air pollutant exposure. Published in the Journal of
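
    A hedged sketch of the infiltration relation underlying such an algorithm, in the basic LBL/ASHRAE form where airflow scales with leakage area and the square root of stack and wind driving terms; the coefficients below are typical published one-story values, not the study's calibrated inputs:

    ```python
    import numpy as np

    def aer_lbl(leakage_area_cm2, volume_m3, dT_K, wind_ms,
                c_stack=0.000145, c_wind=0.000104):
        """Basic LBL/ASHRAE infiltration form (hedged sketch):
        Q [L/s] = A_L [cm^2] * sqrt(Cs*|dT| + Cw*U^2); AER = Q / V.
        Cs and Cw vary with house height and wind shielding; the defaults
        above are typical one-story values, used here illustratively."""
        q_ls = leakage_area_cm2 * np.sqrt(c_stack * abs(dT_K) + c_wind * wind_ms ** 2)
        return q_ls * 3.6 / volume_m3  # L/s -> m^3/h, then air changes per hour

    # Example: 500 cm^2 leakage area, 300 m^3 house, 10 K indoor-outdoor
    # temperature difference, 4 m/s wind.
    print(f"AER = {aer_lbl(500, 300, dT_K=10, wind_ms=4):.2f} per hour")
    ```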

  1. Estimates of Dietary Exposure to Bisphenol A (BPA) from Light Metal Packaging using Food Consumption and Packaging usage Data: A Refined Deterministic Approach and a Fully Probabilistic (FACET) Approach

    PubMed Central

    Oldring, P.K.T.; Castle, L.; O'Mahony, C.; Dixon, J.

    2013-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and the construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19–64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005–0.012 mg dm−2. The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg−1 body weight day−1 for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg−1 body weight day−1. These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the lowest and the highest estimates from the refined deterministic calculations. Since this is what should be expected when a fully probabilistic approach is compared with a deterministic one, it is concluded that the FACET tool has been verified in this example. A recent EFSA draft opinion on exposure to BPA from different sources showed that canned foods were a major contributor and compared results from various models, including those from FACET. The results from FACET were overall conservative. PMID:24405320

  2. Probabilistic simulation of the human factor in structural reliability

    NASA Astrophysics Data System (ADS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-09-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).

  3. Probabilistic Simulation of the Human Factor in Structural Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-01-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).

  4. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. 15% of crew time may be spent on EVA, with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.

  5. A Meta-Analysis of Children's Object-to-Mouth Frequency Data for Estimating Non-Dietary Ingestion Exposure

    EPA Science Inventory

    To improve estimates of non-dietary ingestion in probabilistic exposure modeling, a meta-analysis of children's object-to-mouth frequency was conducted using data from seven available studies representing 438 participants and ~ 1500 h of behavior observation. The analysis repres...

  6. A PROBABILISTIC ANALYSIS TO DETERMINE ECOLOGICAL RISK DRIVERS, 10TH VOLUME ASTM STP 1403

    EPA Science Inventory

    A probabilistic analysis of exposure and effect data was used to identify chemicals most likely responsible for ecological risk. The mean and standard deviation of the natural log-transformed chemical data were used to estimate the probability of exposure for an area of concern a...

  7. Prediction of Asbestos Exposure Resulting From Asbestos Aerosolization Determined Using the Releasable Asbestos Field Sampler (RAFS)

    EPA Science Inventory

    Activity-based sampling (ABS), used to evaluate breathing zone exposure to a contaminant present in soil resulting from various activities, involves breathing zone sampling for contaminants while that activity is performed. A probabilistic model based upon aerosol physics and flui...

  8. Probabilistic finite elements for fracture mechanics

    NASA Technical Reports Server (NTRS)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near-crack-tip singular strain embedded in the element is used. Probabilistic descriptors, such as the expectation, covariance, and correlation of the stress intensity factors, are calculated for random load, random material, and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or reliability.

  9. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic-flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The study shows that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  11. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  12. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; so far, however, epistemic location uncertainty has received little research focus. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
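
    A minimal sketch of the core idea (not the authors' framework): items geocoded only to zone level are assigned fresh random coordinates within their zone on every simulation run, so location uncertainty propagates into the portfolio loss distribution. The portfolio, hazard field, and damage function below are entirely made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def losses_with_location_uncertainty(n_sims=1_000):
        """Monte Carlo sketch: resample unknown risk-item coordinates per run."""
        known_xy = rng.uniform(0.0, 100.0, size=(50, 2))        # items with exact coordinates, km
        values = rng.lognormal(np.log(1e6), 1.0, size=80)       # insured values for all 80 items
        losses = np.empty(n_sims)
        for i in range(n_sims):
            unknown_xy = rng.uniform(0.0, 100.0, size=(30, 2))  # 30 zone-level items, resampled each run
            xy = np.vstack([known_xy, unknown_xy])
            dist = np.linalg.norm(xy - 50.0, axis=1)            # distance to a toy epicentre at (50, 50)
            mdr = 0.3 * np.exp(-dist / 20.0)                    # made-up mean damage ratio
            losses[i] = np.sum(values * mdr)
        return losses

    loss = losses_with_location_uncertainty()
    print(f"mean loss = {loss.mean():.3e}, CoV = {loss.std() / loss.mean():.2f}")
    ```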

  13. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined effects of uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  14. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  15. Quantifying Children's Aggregate (Dietary and Residential) Exposure and Dose to Permethrin: Application and Evaluation of EPA's Probabilistic SHEDS-Multimedia Model

    EPA Science Inventory

    Reliable, evaluated human exposure and dose models are important for understanding the health risks from chemicals. A case study focusing on permethrin was conducted because of this insecticide’s widespread use and potential health effects. SHEDS-Multimedia was applied to estimat...

  16. August 2007 FIFRA Scientific Advisory Panel Recommendations for SHEDS-Dietary and SHEDS-Residential Modules (Summarized) and EPA Responses

    EPA Science Inventory

    Over the past ten years, the Agency has requested the Panel to review several probabilistic dietary exposure software models. These have included DEEM-FCID™, Calendex-FCID, CARES™, LifeLine™, and an earlier (specialized) version of SHEDS (SHEDS-Wood) designed to assess exposure...

  17. A MODELING FRAMEWORK FOR ESTIMATING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS VIA DERMAL RESIDUE CONTACT AND NON-DIETARY INGESTION

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based probabilistic model (Residential Stochastic Human Exposure and Dose Simulation Model for Pesticides; Residential-SHEDS) has been developed to quantify and analyze dermal and non-dietary ingestion exposu...

  18. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    PubMed

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g−1. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic assessments of both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10−6), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of the TCR, while inhalation intake is negligible. The three PAHs contributing most to the TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
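
    The Monte Carlo risk calculation described here can be sketched with the standard soil-ingestion dose equation; all distributions and constants below are illustrative placeholders, not the study's parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def soil_ingestion_cancer_risk(n=100_000):
        """Toy Monte Carlo for incremental lifetime cancer risk from soil ingestion.

        ILCR = C * IngR * CF * EF * ED * CSF / (BW * AT); every distribution
        and constant is an illustrative placeholder, not a study value.
        """
        C = rng.lognormal(np.log(2000.0), 0.8, size=n)   # BaP-equivalent conc., ng/g soil
        IngR = rng.normal(100.0, 25.0, size=n).clip(20)  # soil ingestion rate, mg/day
        EF = rng.triangular(180, 350, 365, size=n)       # exposure frequency, days/yr
        ED = rng.uniform(20, 40, size=n)                 # exposure duration, yr
        BW = rng.normal(60.0, 10.0, size=n).clip(35)     # body weight, kg
        AT = 70.0 * 365.0                                # averaging time, days
        CSF = 7.3                                        # oral slope factor for BaP, (mg/kg-day)^-1
        CF = 1e-9                                        # unit factor: (ng/g)*(mg/day) -> mg/day
        dose = C * IngR * CF * EF * ED / (BW * AT)       # lifetime average daily dose, mg/kg-day
        return dose * CSF

    risk = soil_ingestion_cancer_risk()
    print(f"P(ILCR > 1e-6) = {(risk > 1e-6).mean():.2f}, mean = {risk.mean():.2e}")
    ```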

  19. Cancer risk from incidental ingestion exposures to PAHs associated with coal-tar-sealed pavement

    USGS Publications Warehouse

    Williams, E. Spencer; Mahler, Barbara J.; Van Metre, Peter C.

    2012-01-01

    Recent (2009-10) studies documented significantly higher concentrations of polycyclic aromatic hydrocarbons (PAHs) in settled house dust in living spaces and soil adjacent to parking lots sealed with coal-tar-based products. To date, no studies have examined the potential human health effects of PAHs from these products in dust and soil. Here we present the results of an analysis of potential cancer risk associated with incidental ingestion exposures to PAHs in settings near coal-tar-sealed pavement. Exposures to benzo[a]pyrene equivalents were characterized across five scenarios. The central tendency estimate of excess cancer risk resulting from lifetime exposures to soil and dust from nondietary ingestion in these settings exceeded 1 × 10−4, as determined using deterministic and probabilistic methods. Soil was the primary driver of risk, but according to probabilistic calculations, reasonable maximum exposure to affected house dust in the first 6 years of life was sufficient to generate an estimated excess lifetime cancer risk of 6 × 10−5. Our results indicate that the presence of coal-tar-based pavement sealants is associated with significant increases in estimated excess lifetime cancer risk for nearby residents. Much of this calculated excess risk arises from exposures to PAHs in early childhood (i.e., 0–6 years of age).

  20. Assessment of global flood exposures - developing an appropriate approach

    NASA Astrophysics Data System (ADS)

    Millinship, Ian; Booth, Naomi

    2015-04-01

    Increasingly complex probabilistic catastrophe models have become the standard for quantitative flood risk assessments by re/insurance companies. Probabilistic modelling of this nature is extremely useful, since a large range of risk metrics can be output; however, such models can be time-consuming and computationally expensive to develop and run, and levels of uncertainty remain persistently high despite, or perhaps because of, attempts to increase resolution and complexity. A cycle of dependency between modelling companies and re/insurers has developed whereby available models are purchased, models run, and both portfolio and model data 'improved' every year. This can lead to potential exposures in perils and territories that are not currently modelled being largely overlooked by companies, who may then face substantial and unexpected losses when large events occur in these areas. We present here an approach to assessing global flood exposures which reduces the scale and complexity of the analysis and begins with the identification of hotspots where there is a significant exposure to flood risk. The method comprises four stages: i) compile consistent exposure information; ii) apply reinsurance terms and conditions to calculate values exposed; iii) assess the potential hazard using a global set of flood hazard maps; and iv) identify potential risk 'hotspots', taking into account spatially and/or temporally clustered historical events and local flood defences. This global exposure assessment is designed as a scoping exercise, and reveals areas or cities where the potential for accumulated loss is of significant interest to a reinsurance company, and for which there is no existing catastrophe model. These regions are then candidates for the development of deterministic scenarios or probabilistic models. The key advantages of this approach will be discussed. These include simplicity and the ability of business leaders to understand results, as well as the ease and speed of analysis and the advantages this offers for monitoring changing exposures over time. Significantly, in many areas of the world, this increase in exposure is likely to have more of an impact on increasing catastrophe losses than potential anthropogenically driven changes in weather extremes.

  1. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible, but not yet observed, flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).

  2. PBPK-Based Probabilistic Risk Assessment for Total Chlorotriazines in Drinking Water

    PubMed Central

    Breckenridge, Charles B.; Campbell, Jerry L.; Clewell, Harvey J.; Andersen, Melvin E.; Valdez-Flores, Ciriaco; Sielken, Robert L.

    2016-01-01

    The risk of human exposure to total chlorotriazines (TCT) in drinking water was evaluated using a physiologically based pharmacokinetic (PBPK) model. Daily TCT (atrazine, deethylatrazine, deisopropylatrazine, and diaminochlorotriazine) chemographs were constructed for 17 frequently monitored community water systems (CWSs) using linear interpolation and Krieg estimates between observed TCT values. Synthetic chemographs were created using a conservative bias factor of 3 to generate intervening peaks between measured values. Drinking water consumption records from 24-h diaries were used to calculate daily exposure. Plasma TCT concentrations were updated every 30 minutes using the PBPK model output for each simulated calendar year from 2006 to 2010. Margins of exposure (MOEs) were calculated (MOE = [Human Plasma TCTPOD] ÷ [Human Plasma TCTEXP]) based on the toxicological point of departure (POD) and the drinking water-derived exposure to TCT. MOEs were determined based on 1, 2, 3, 4, 7, 14, 28, or 90 days of rolling average exposures and plasma TCT Cmax, or the area under the curve (AUC). Distributions of MOE were determined and the 99.9th percentile was used for risk assessment. MOEs for all 17 CWSs were >1000 at the 99.9th percentile. The 99.9th percentile of the MOE distribution was 2.8-fold less when the 3-fold synthetic chemograph bias factor was used. MOEs were insensitive to interpolation method, the consumer’s age, the water consumption database used and the duration of time over which the rolling average plasma TCT was calculated, for up to 90 days. MOEs were sensitive to factors that modified the toxicological no-observed-effects level (NOEL), including rat strain, endpoint used, method of calculating the NOEL, and the pharmacokinetics of elimination, as well as the magnitude of exposure (CWS, calendar year, and use of bias factors). PMID:26794141
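
    A compact sketch of the MOE logic, with invented numbers standing in for the PBPK model outputs (the full PBPK model is not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # MOE = plasma TCT at the point of departure / simulated plasma TCT exposure.
    pod_plasma = 500.0                                         # hypothetical POD plasma conc., ng/mL
    exposure = rng.lognormal(np.log(0.05), 1.0, size=100_000)  # toy simulated plasma TCT, ng/mL
    moe = pod_plasma / exposure
    moe_at_999 = np.percentile(moe, 0.1)  # MOE at the 99.9th-percentile exposure (low tail of MOE)
    print(f"MOE at 99.9th-percentile exposure = {moe_at_999:.0f} (flag if < 1000: {moe_at_999 < 1000})")
    ```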

  3. PBPK-Based Probabilistic Risk Assessment for Total Chlorotriazines in Drinking Water.

    PubMed

    Breckenridge, Charles B; Campbell, Jerry L; Clewell, Harvey J; Andersen, Melvin E; Valdez-Flores, Ciriaco; Sielken, Robert L

    2016-04-01

    The risk of human exposure to total chlorotriazines (TCT) in drinking water was evaluated using a physiologically based pharmacokinetic (PBPK) model. Daily TCT (atrazine, deethylatrazine, deisopropylatrazine, and diaminochlorotriazine) chemographs were constructed for 17 frequently monitored community water systems (CWSs) using linear interpolation and Krieg estimates between observed TCT values. Synthetic chemographs were created using a conservative bias factor of 3 to generate intervening peaks between measured values. Drinking water consumption records from 24-h diaries were used to calculate daily exposure. Plasma TCT concentrations were updated every 30 minutes using the PBPK model output for each simulated calendar year from 2006 to 2010. Margins of exposure (MOEs) were calculated (MOE = [Human Plasma TCTPOD] ÷ [Human Plasma TCTEXP]) based on the toxicological point of departure (POD) and the drinking water-derived exposure to TCT. MOEs were determined based on 1, 2, 3, 4, 7, 14, 28, or 90 days of rolling average exposures and plasma TCT Cmax, or the area under the curve (AUC). Distributions of MOE were determined and the 99.9th percentile was used for risk assessment. MOEs for all 17 CWSs were >1000 at the 99.9th percentile. The 99.9th percentile of the MOE distribution was 2.8-fold less when the 3-fold synthetic chemograph bias factor was used. MOEs were insensitive to interpolation method, the consumer's age, the water consumption database used and the duration of time over which the rolling average plasma TCT was calculated, for up to 90 days. MOEs were sensitive to factors that modified the toxicological no-observed-effects level (NOEL), including rat strain, endpoint used, method of calculating the NOEL, and the pharmacokinetics of elimination, as well as the magnitude of exposure (CWS, calendar year, and use of bias factors). © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  4. Estimates of dietary exposure to bisphenol A (BPA) from light metal packaging using food consumption and packaging usage data: a refined deterministic approach and a fully probabilistic (FACET) approach.

    PubMed

    Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J

    2014-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and the construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging were available at a detailed level and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the lowest and the highest estimates from the refined deterministic calculations. Since this is the expected relationship between a fully probabilistic approach and a deterministic one, it is concluded that the FACET tool has been verified in this example. A recent EFSA draft opinion on exposure to BPA from different sources showed that canned foods were a major contributor and compared results from various models, including those from FACET. The results from FACET were overall conservative.

  5. MODELING OF HUMAN EXPOSURE TO IN-VEHICLE PM2.5 FROM ENVIRONMENTAL TOBACCO SMOKE

    PubMed Central

    Cao, Ye; Frey, H. Christopher

    2012-01-01

    Environmental tobacco smoke (ETS) is estimated to be a significant contributor to in-vehicle human exposure to fine particulate matter of 2.5 µm or smaller (PM2.5). A critical assessment was conducted of a mass balance model for estimating PM2.5 concentration with smoking in a motor vehicle. Recommendations for the range of inputs to the mass balance model are given based on a literature review. Sensitivity analysis was used to determine which inputs should be prioritized for data collection. The air exchange rate (ACH) and the deposition rate have wider relative ranges of variation than other inputs, representing inter-individual variability in operations and inter-vehicle variability in performance, respectively. Cigarette smoking and emission rates, and vehicle interior volume, are also key inputs. The in-vehicle ETS mass balance model was incorporated into the Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) model to quantify the potential magnitude and variability of in-vehicle exposures to ETS. The in-vehicle exposure also takes into account the near-road incremental PM2.5 concentration from on-road emissions. Results of the probabilistic study indicate that ETS is a key contributor to in-vehicle average and high-end exposure. Factors that mitigate in-vehicle ambient PM2.5 exposure lead to higher in-vehicle ETS exposure, and vice versa. PMID:23060732

  6. Prevalence, risk factors and underdiagnosis of asthma and wheezing in adults 40 years and older: A population-based study.

    PubMed

    Gonzalez-Garcia, Mauricio; Caballero, Andres; Jaramillo, Claudia; Maldonado, Dario; Torres-Duque, Carlos A

    2015-10-01

    There are differences in the prevalence and risk factors of asthma around the world. The epidemiological situation of adults 40 years and older is not well established. Our aim was to determine the prevalence, underdiagnosis and risk factors of asthma and wheezing in adults in Colombia. A cross-sectional, population-based study was conducted, including 5539 subjects from 40 to 93 years of age selected by a probabilistic sampling technique in five cities, using a respiratory symptoms and risk factors questionnaire and spirometry. Definitions were: (a) Wheezing: affirmative answer to the question "have you ever had two or more attacks of 'wheezes' causing you to feel short of breath?"; (b) Asthma: wheezing definition and post-bronchodilator FEV1/FVC ≥ 70%; (c) Underdiagnosis: asthma definition without a physician diagnosis. Logistic regression was used for exploring risk factors. Prevalence of asthma was 9.0% (95% CI: 8.3-9.8) and of wheezing 11.9% (95% CI: 11.0-12.8). Asthma underdiagnosis was 69.9% and increased to 79.0% in subjects 64 years or older. The risk factors related to asthma and/or wheezing were: living in Bogota or Medellin, female gender, first-degree relative with asthma, respiratory disease before 16 years of age, obesity, no education, indoor wood smoke exposure, and occupational exposure to dust particles, gases or fumes. We describe the epidemiologic situation of asthma in adults 40 years and older in Colombia. In addition to some recognized risk factors, our data support the association of indoor wood smoke and occupational exposures with asthma and wheezing. Underdiagnosis of asthma in adults was high, particularly in older subjects.

  7. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.

  8. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods was constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code using MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
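
    One simple way to realize the Bayesian-plus-Monte-Carlo propagation described here is importance resampling of a prior intake distribution against a bioassay measurement. The sketch below is generic, with invented numbers; it is not the MATLAB code described in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def posterior_intake(measured_bq, m_frac=0.01, gsd_meas=1.3, n=200_000):
        """Sketch: Bayesian intake estimate from one bioassay, via Monte Carlo.

        Prior intake (lognormal) -> predicted measurement = intake * m_frac,
        where m_frac is an assumed fixed biokinetic retention fraction.
        Weights follow a lognormal measurement-error model.  All numbers invented.
        """
        intake = rng.lognormal(np.log(1000.0), 1.5, size=n)  # prior intake sample, Bq
        predicted = intake * m_frac                          # predicted bioassay result, Bq
        z = (np.log(measured_bq) - np.log(predicted)) / np.log(gsd_meas)
        w = np.exp(-0.5 * z**2)                              # lognormal likelihood weights
        w /= w.sum()
        posterior = intake[rng.choice(n, size=n, p=w)]       # importance resampling
        return np.percentile(posterior, [2.5, 50, 97.5])

    lo, med, hi = posterior_intake(measured_bq=12.0)
    print(f"intake (Bq): 2.5% {lo:.0f}, median {med:.0f}, 97.5% {hi:.0f}")
    ```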

  9. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationship between safety factors and reliability. Results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1 - Random Actual Stress and Deterministic Yield Stress; Part 2 - Deterministic Actual Stress and Random Yield Stress; Part 3 - Both Actual Stress and Yield Stress Are Random.
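
    As one concrete instance of the interrelation (a standard textbook result for the Part 1 case, not an excerpt from the report): for a normally distributed actual stress with mean mu and standard deviation s, a deterministic yield stress sigma_Y, and central safety factor SF = sigma_Y / mu, the reliability is

    ```latex
    % Part 1 case: random (normal) actual stress, deterministic yield stress.
    % V = s/\mu is the coefficient of variation of the actual stress.
    R \;=\; P(\sigma \le \sigma_Y)
      \;=\; \Phi\!\left(\frac{\sigma_Y - \mu}{s}\right)
      \;=\; \Phi\!\left(\frac{SF - 1}{V}\right)
    ```

    Inverting the standard normal CDF turns a required reliability level directly into a safety factor, which is the kind of correspondence the report formalizes.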

  10. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
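
    A toy illustration of components 1 and 2 above (with arbitrary factor values): the joint distribution over three binary variables is written as a product of two local factors, and conditioning plus renormalisation yields a posterior marginal:

    ```python
    import itertools

    # Joint over three binary variables as a product of local factors:
    # p(a, b, c) = f1(a, b) * f2(b, c) / Z.  Factor values are arbitrary.
    f1 = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
    f2 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

    joint = {(a, b, c): f1[a, b] * f2[b, c]
             for a, b, c in itertools.product((0, 1), repeat=3)}
    Z = sum(joint.values())                      # normalising constant

    # Condition on an observation a = 1, then renormalise to get p(c | a = 1).
    cond = {k: v for k, v in joint.items() if k[0] == 1}
    p_c_given_a1 = {c: sum(v for k, v in cond.items() if k[2] == c) / sum(cond.values())
                    for c in (0, 1)}
    print(f"p(a=1) = {sum(cond.values()) / Z:.3f}, p(c | a=1) = {p_c_given_a1}")
    ```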

  11. A Probabilistic Model of Meter Perception: Simulating Enculturation.

    PubMed

    van der Weij, Bastiaan; Pearce, Marcus T; Honing, Henkjan

    2017-01-01

    Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms.

  12. Risk of DDT residue in maize consumed by infants as complementary diet in southwest Ethiopia.

    PubMed

    Mekonen, Seblework; Lachat, Carl; Ambelu, Argaw; Steurbaut, Walter; Kolsteren, Patrick; Jacxsens, Liesbeth; Wondafrash, Mekitie; Houbraken, Michael; Spanoghe, Pieter

    2015-04-01

    Infants in Ethiopia consume food items such as maize as a complementary diet. However, this may expose infants to toxic contaminants like DDT. Maize samples were collected from the households visited during a consumption survey and from markets in Jimma zone, southwestern Ethiopia. The residues of total DDT and its metabolites were analyzed using the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method combined with dispersive solid phase extraction cleanup (d-SPE). Deterministic and probabilistic methods of analysis were applied to determine the consumer exposure of infants to total DDT. The results from the exposure assessment were compared with the health-based guidance value, in this case the provisional tolerable daily intake (PTDI). All maize samples (n=127) were contaminated by DDT, with a mean concentration of 1.770 mg/kg, which was far above the maximum residue limit (MRL). The mean and 97.5th percentile (P97.5) estimated daily intakes of total DDT for consumers were respectively 0.011 and 0.309 mg/kg bw/day for the deterministic and 0.011 and 0.083 mg/kg bw/day for the probabilistic exposure assessment. For the total infant population (consumers and non-consumers), the 97.5th percentile estimated daily intakes were 0.265 and 0.032 mg/kg bw/day from the deterministic and probabilistic exposure assessments, respectively. Health risk estimation revealed that the mean and 97.5th percentile estimated daily intakes for consumers, and the 97.5th percentile estimated daily intake of total DDT for the total population, were above the PTDI. Therefore, in Ethiopia, the use of maize as complementary food for infants may pose a health risk due to DDT residue. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Health risk assessment of ochratoxin A for all age-sex strata in a market economy.

    PubMed

    Kuiper-Goodman, T; Hilts, C; Billiard, S M; Kiparissis, Y; Richard, I D K; Hayward, S

    2010-02-01

    In order to manage risk of ochratoxin A (OTA) in foods, we re-evaluated the tolerable daily intake (TDI), derived the negligible cancer risk intake (NCRI), and conducted a probabilistic risk assessment. A new approach was developed to derive 'usual' probabilistic exposure in the presence of highly variable occurrence data, such as encountered with low levels of OTA. Canadian occurrence data were used for various raw food commodities or finished foods and were combined with US Department of Agriculture (USDA) food consumption data, which included data on infants and young children. Both variability and uncertainty in input data were considered in the resulting exposure estimates for various age/sex strata. Most people were exposed to OTA on a daily basis. Mean adjusted exposures for all age-sex groups were generally below the NCRI of 4 ng OTA kg bw(-1), except for 1-4-year-olds as a result of their lower body weight. For children, the major contributors of OTA were wheat-based foods followed by oats, rice, and raisins. Beer, coffee, and wine also contributed to total OTA exposure in older individuals. Predicted exposure to OTA decreased when European Commission maximum limits were applied to the occurrence data. The impact on risk for regular eaters of specific commodities was also examined.

  14. Methyl Mercury Exposure from Fish Consumption in Vulnerable Racial/Ethnic Populations: Probabilistic SHEDS-Dietary Model Analyses Using 1999-2006 NHANES and 1990-2002 TDS Data

    EPA Science Inventory

    NHANES subjects self-identified as “Asian, Pacific Islander, Native American, or multiracial” (A/P/N/M) have higher levels of blood organic mercury than other racial/ethnic groups; however, the reasons for this have been unclear. This research uses exposure modeling to determine ...

  15. Regional probabilistic risk assessment of heavy metals in different environmental media and land uses: An urbanization-affected drinking water supply area

    NASA Astrophysics Data System (ADS)

    Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping

    2016-11-01

    In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks of exposing residents to heavy metals in different environmental media and land uses. The mean and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks of cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure risks during outdoor recreation, farming and swimming along the Shunde Waterway are theoretically safe. Arsenic and cadmium were identified as the priority pollutants that contribute the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.

  16. Exposure assessment for trihalomethanes in municipal drinking water and risk reduction strategy.

    PubMed

    Chowdhury, Shakhawat

    2013-10-01

    Lifetime exposure to disinfection byproducts (DBPs) in municipal water may pose risks to human health. Current approaches to exposure assessment use DBP levels in cold water during showering, while warming of chlorinated water during showering may increase trihalomethane (THM) formation in the presence of free residual chlorine. Further, DBP exposure through dermal contact during showering is typically estimated assuming a steady-state condition between the DBPs in shower water and the skin exposed to shower water. The lag times to achieve a steady-state condition between DBPs in shower water and human skin can vary in the range of 9.8-391.2 min, while shower duration is often less than the lag times. Assessments of exposure that do not incorporate these factors may have misestimated DBP exposure in some previous studies. In this study, exposure to THMs through ingestion was estimated using cold water THMs, while THM exposure through inhalation and dermal contact during showering was estimated using THMs in warm water. Inhalation of THMs was estimated using THM partitioning into the shower air, while dermal uptake was estimated by incorporating lag times (e.g., unsteady and steady-state phases of exposure) during showering. A probabilistic approach was followed to incorporate uncertainty in the assessment. Inhalation and dermal contact during showering contributed 25-60% of total exposure. Exposure to THMs during showering can be controlled by varying shower stall volume, shower duration and air exchange rate following power-law equations. The findings might be useful in understanding exposure to THMs, and can be extended to other volatile compounds in municipal water. © 2013 Elsevier B.V. All rights reserved.
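
    The inhalation side of the showering pathway can be sketched with a simple well-mixed mass balance on shower-stall air, dC/dt = S/V - AER*C; all default parameter values below are illustrative, not the study's inputs. Varying the stall volume, shower duration, and air exchange rate in this sketch shows qualitatively how the three controls named above shape the end-of-shower concentration:

    ```python
    def shower_air_thm(c_water_ug_L=50.0, minutes=10, vol_m3=2.0, aer_per_h=2.0,
                       water_L_min=8.0, transfer=0.6):
        """Sketch: well-mixed mass balance for THM in shower-stall air.

        dC/dt = S/V - AER*C, with volatilization source S = transfer * Cw * Qw.
        All default parameter values are illustrative assumptions.
        """
        dt_h = 1.0 / 60.0                                               # 1-minute time step, h
        source = transfer * c_water_ug_L * water_L_min * 60.0 / vol_m3  # ug/m3 per h
        c = 0.0
        for _ in range(int(minutes)):
            c += dt_h * (source - aer_per_h * c)                        # explicit Euler step
        return c                                                        # ug/m3 at end of shower

    print(f"end-of-shower air THM ~ {shower_air_thm():.0f} ug/m3")
    ```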

  17. Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.

    PubMed

    Zhao, Yuchao; Frey, H Christopher

    2004-11-01

    Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
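
    For a single source category with uncensored data, the bootstrap step can be sketched as follows (the study's censored-data MLE and distribution-fitting choices are omitted; all inputs are synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def bootstrap_inventory(ef_data, activity_mean, activity_cv=0.2, n_boot=5_000):
        """Bootstrap sketch of emission-inventory uncertainty for one category.

        Resamples the log emission-factor data, refits a lognormal each time,
        and multiplies the fitted mean EF by an uncertain activity factor.
        Illustrative only; censored-data handling is omitted.
        """
        log_ef = np.log(ef_data)
        n = len(ef_data)
        totals = np.empty(n_boot)
        for b in range(n_boot):
            resample = rng.choice(log_ef, size=n, replace=True)
            mu, sd = resample.mean(), resample.std(ddof=1)
            ef_mean = np.exp(mu + 0.5 * sd**2)                  # mean of fitted lognormal
            activity = rng.normal(activity_mean, activity_cv * activity_mean)
            totals[b] = ef_mean * activity
        return np.percentile(totals, [2.5, 50, 97.5])

    ef = rng.lognormal(np.log(0.5), 0.8, size=25)               # synthetic EF data, g/unit
    lo, med, hi = bootstrap_inventory(ef, activity_mean=1e6)    # activity in units/yr
    print(f"inventory g/yr: 2.5% {lo:.3e}, median {med:.3e}, 97.5% {hi:.3e}")
    ```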

  18. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  19. Physiologically-Based Toxicokinetic Modeling of Zearalenone and Its Metabolites: Application to the Jersey Girl Study

    PubMed Central

    Mukherjee, Dwaipayan; Royce, Steven G.; Alexander, Jocelyn A.; Buckley, Brian; Isukapalli, Sastry S.; Bandera, Elisa V.; Zarbl, Helmut; Georgopoulos, Panos G.

    2014-01-01

    Zearalenone (ZEA), a fungal mycotoxin, and its metabolite zeranol (ZAL) are known estrogen agonists in mammals, and are found as contaminants in food. Zeranol, which is more potent than ZEA and comparable in potency to estradiol, is also added as a growth additive in beef in the US and Canada. This article presents the development and application of a Physiologically-Based Toxicokinetic (PBTK) model for ZEA and ZAL and their primary metabolites, zearalenol, zearalanone, and their conjugated glucuronides, for rats and for human subjects. The PBTK modeling study explicitly simulates critical metabolic pathways in the gastrointestinal and hepatic systems. Metabolic events such as dehydrogenation and glucuronidation of the chemicals, which have direct effects on the accumulation and elimination of the toxic compounds, have been quantified. The PBTK model considers urinary and fecal excretion and biliary recirculation and compares the predicted biomarkers of blood, urinary and fecal concentrations with published in vivo measurements in rats and human subjects. Additionally, the toxicokinetic model has been coupled with a novel probabilistic dietary exposure model and applied to the Jersey Girl Study (JGS), which involved measurement of mycoestrogens as urinary biomarkers, in a cohort of young girls in New Jersey, USA. A probabilistic exposure characterization for the study population has been conducted and the predicted urinary concentrations have been compared to measurements considering inter-individual physiological and dietary variability. The in vivo measurements from the JGS fall within the high and low predicted distributions of biomarker values corresponding to dietary exposure estimates calculated by the probabilistic modeling system. The work described here is the first of its kind to present a comprehensive framework developing estimates of potential exposures to mycotoxins and linking them with biologically relevant doses and biomarker measurements, including a systematic characterization of uncertainties in exposure and dose estimation for a vulnerable population. PMID:25474635

  20. Health risk assessment and source study of PAHs from roadside soil dust of a heavy mining area in India.

    PubMed

    Tarafdar, Abhrajyoti; Sinha, Alok

    2018-02-26

    The total concentrations of 13 detected polycyclic aromatic hydrocarbons (PAHs) in different traffic soil samples of the Dhanbad heavy mining area, India, were between 8.256 and 12.562 µg/g and were dominated by four-ring PAHs (44%). A diagnostic ratio study revealed that fossil fuel burning and vehicular pollution are the most prominent sources of the PAHs in roadside soil, even in a heavy coal mining area. The 90th-percentile cancer risks determined by probabilistic health risk assessment (Monte Carlo simulation) for both age groups (children and adults) were above the tolerable limit (>1.00E-06) according to USEPA. The simulated mean cancer risk was 1.854E-05 for children and 1.823E-05 for adults. Among the exposure pathways, dermal contact was observed to be the major pathway, with an exposure load of 74% for children and 85% for adults. Sensitivity analysis demonstrated that the relative skin adherence factor for soil (AF) is the most influential parameter of the simulation, followed by exposure duration (ED).

  1. PM2.5 Population Exposure in New Delhi Using a Probabilistic Simulation Framework.

    PubMed

    Saraswat, Arvind; Kandlikar, Milind; Brauer, Michael; Srivastava, Arun

    2016-03-15

    This paper presents a Geographical Information System (GIS) based probabilistic simulation framework to estimate PM2.5 population exposure in New Delhi, India. The framework integrates PM2.5 output from spatiotemporal LUR models and trip distribution data using a Gravity model based on zonal data for population, employment and enrollment in educational institutions. Time-activity patterns were derived from a survey of randomly sampled individuals (n = 1012) and in-vehicle exposure was estimated using microenvironmental monitoring data based on field measurements. We simulated population exposure for three different scenarios to capture stay-at-home populations (Scenario 1), working population exposed to near-road concentrations during commutes (Scenario 2), and the working population exposed to on-road concentrations during commutes (Scenario 3). Simulated annual average levels of PM2.5 exposure across the entire city were very high, and particularly severe in the winter months: ∼200 μg m(-3) in November, roughly four times higher compared to the lower levels in the monsoon season. Mean annual exposures ranged from 109 μg m(-3) (IQR: 97-120 μg m(-3)) for Scenario 1, to 121 μg m(-3) (IQR: 110-131 μg m(-3)) and 125 μg m(-3) (IQR: 114-136 μg m(-3)) for Scenarios 2 and 3, respectively. Ignoring the effects of mobility causes the average annual PM2.5 population exposure to be underestimated by only 11%.

  2. Exposure chamber measurements of mass transfer and partitioning at the plant/air interface.

    PubMed

    Maddalena, Randy L; McKone, Thomas E; Kado, Norman Y

    2002-08-15

    Dynamic measures of air and vegetation concentrations in an exposure chamber and a two-box mass balance model are used to quantify factors that control the rate and extent of chemical partitioning between vegetation and the atmosphere. A continuous stirred flow-through exposure chamber was used to investigate the gas-phase transfer of pollutants between air and plants. A probabilistic two-compartment mass balance model of plant/air exchange within the exposure chamber was developed and used with measured concentrations from the chamber to simultaneously evaluate partitioning (Kpa), overall mass transfer across the plant/air interface (Upa), and loss rates in the atmosphere (Ra) and aboveground vegetation (Rp). The approach is demonstrated using mature Capsicum annuum (bell pepper) plants exposed to phenanthrene (PH), anthracene (AN), fluoranthene (FL) and pyrene (PY). Measured values of log Kpa (V[air]/V[fresh plant]) were 5.7, 5.7, 6.0, and 6.2 for PH, AN, FL, and PY, respectively. Values of Upa (m d(-1)) under the conditions of this study ranged from 42 for PH to 119 for FL. After correcting for wall effects, the estimated reaction half-lives in air were 3, 9, and 25 h for AN, FL and PY. Reaction half-lives in the plant compartment were 17, 6, 17, and 5 d for PH, AN, FL, and PY, respectively. The combined use of exposure chamber measurements and models provides a robust tool for simultaneously measuring several different transfer factors that are important for modeling the uptake of pollutants into vegetation.
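
    The two-box balance can be written as a pair of coupled ODEs: chamber air is ventilated at flow Q and exchanges with the plant compartment through an interface of area A with mass transfer coefficient Upa and partition coefficient Kpa, while first-order losses act in each box. A minimal forward-Euler sketch of a plausible form of this model; the parameter values are rough assumptions, not the paper's fitted values.

        import numpy as np

        Q, Va, Vp = 0.5, 1.0, 0.01           # flow (m^3/h), air and plant volumes (m^3), assumed
        A, Upa    = 0.2, 60.0 / 24.0         # interface area (m^2), mass transfer (m/h)
        Kpa       = 10**5.7                  # plant/air partition coefficient (V/V)
        ka        = np.log(2) / 9.0          # air-phase loss rate, 1/h (9 h half-life)
        kp        = np.log(2) / (17 * 24.0)  # plant-phase loss rate, 1/h (17 d half-life)
        Cin       = 1.0                      # inlet air concentration, arbitrary units

        dt, t_end = 0.01, 200.0
        Ca = Cp = 0.0
        for _ in range(int(t_end / dt)):
            flux = Upa * A * (Ca - Cp / Kpa)  # net air-to-plant flux
            Ca += dt * (Q * (Cin - Ca) / Va - flux / Va - ka * Ca)
            Cp += dt * (flux / Vp - kp * Cp)
        # log10(Cp/Ca) climbs toward log10(Kpa) only slowly; this run stops well short.
        print(f"Ca={Ca:.3f}, Cp={Cp:.1f}, log10(Cp/Ca)={np.log10(Cp / Ca):.2f}")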

  3. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is no longer simple when it involves multiple uncertain variables. Uncertainties in risk assessment arise mainly from (1) lack of knowledge about the input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach is proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, was used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) over various time horizons. The LandSim-simulated groundwater concentrations of six heavy metals were then used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are treated as probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered as non-deterministic for the estimation of uncertainty in the risk output. The Hazard Index (HI), the sum of the hazard quotients of the heavy metals Co, Cu, Mn, Ni, Zn and Fe, was quantified for the male and female populations and found to be high (HI>1) for all the considered time horizons, which indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
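
    Setting the fuzzy treatment aside, the purely probabilistic core of the hazard-index calculation is compact. A sketch with invented concentrations, intake rates and reference doses, not the study's inputs:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000
        # (mean groundwater conc. mg/L, oral RfD mg/kg-d) -- illustrative assumptions
        metals = {"Mn": (0.40, 0.024), "Ni": (0.05, 0.020), "Zn": (1.20, 0.300)}

        ir = rng.triangular(1.0, 2.0, 3.0, n)    # drinking water intake, L/d
        ef, ed = 350.0, 30.0                     # exposure frequency (d/yr), duration (yr)
        bw = rng.normal(60.0, 10.0, n).clip(30)  # body weight, kg
        at = ed * 365.0                          # non-cancer averaging time, d

        hi = np.zeros(n)
        for c_mean, rfd in metals.values():
            c = rng.lognormal(np.log(c_mean), 0.5, n)  # sampled concentration, mg/L
            hi += c * ir * ef * ed / (bw * at * rfd)   # hazard quotient per metal
        print(f"P(HI > 1) = {(hi > 1.0).mean():.2f}")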

  4. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. CalTOX (registered trademark), A multimedia total exposure model spreadsheet user's guide. Version 4.0(Beta)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, T.E.; Enoch, K.G.

    2002-08-01

    CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
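
    Describing every input by a mean and a coefficient of variation implies a simple sampling recipe, e.g. a lognormal matched to those two moments. A minimal sketch; the parameter names and values are illustrative, not CalTOX's:

        import numpy as np

        def lognormal_from_mean_cv(mean, cv, size, rng):
            """Sample a lognormal with a given arithmetic mean and CV."""
            sigma2 = np.log(1.0 + cv**2)      # from CV^2 = exp(sigma^2) - 1
            mu = np.log(mean) - 0.5 * sigma2  # from mean = exp(mu + sigma^2/2)
            return rng.lognormal(mu, np.sqrt(sigma2), size)

        rng = np.random.default_rng(0)
        half_life = lognormal_from_mean_cv(30.0, 0.8, 100_000, rng)  # soil half-life, d (assumed)
        print(half_life.mean(), half_life.std() / half_life.mean())  # recovers ~30 and ~0.8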

  6. Evaluation of the risk of perchlorate exposure in a population of late-gestation pregnant women in the United States: Application of probabilistic biologically-based dose response modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumen, A, E-mail: Annie.Lumen@fda.hhs.gov

    The risk of ubiquitous perchlorate exposure and the dose-response on thyroid hormone levels in pregnant women in the United States (U.S.) have yet to be characterized. In the current work, we integrated a previously developed perchlorate submodel into a recently developed population-based pregnancy model to predict reductions in maternal serum free thyroxine (fT4) levels for late-gestation pregnant women in the U.S. Our findings indicated no significant difference in geometric mean estimates of fT4 when perchlorate exposure from food only was compared to no perchlorate exposure. The reduction in maternal fT4 levels reached statistical significance when an added contribution from drinking water (i.e., 15 μg/L, 20 μg/L, or 24.5 μg/L) was assumed in addition to the 90th percentile of food intake for pregnant women (0.198 μg/kg/day). We determined that a daily intake of 0.45 to 0.50 μg/kg/day of perchlorate was necessary to produce results that were significantly different than those obtained from no perchlorate exposure. Adjusting for this food intake dose, the relative source contribution of perchlorate from drinking water (or other non-dietary sources) was estimated to range from 0.25–0.3 μg/kg/day. Assuming a drinking water intake rate of 0.033 L/kg/day, the drinking water concentration allowance for perchlorate equates to 7.6–9.2 μg/L. In summary, we have demonstrated the utility of a probabilistic biologically-based dose-response model for perchlorate risk assessment in a sensitive life-stage at a population level; however, there is a need for continued monitoring in regions of the U.S. where perchlorate exposure may be higher. - Highlights: • Probabilistic risk assessment for perchlorate in U.S. pregnant women was conducted. • No significant change in maternal fT4 predicted due to perchlorate from food alone. • Drinking water concentration allowance for perchlorate estimated as 7.6–9.2 μg/L.

  7. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2014-09-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible (but not yet observed) flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and for loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To account for at least 90 % of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Réassurance) claim database have shown that approximately 45 % of insured flood losses are located inside floodplains and 45 % outside; the remaining 10 % is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: the generation of synthetic river flows based on the historical records of the river gauge network, and the generation of synthetic rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).

  8. Electromagnetic Compatibility (EMC) in Microelectronics.

    DTIC Science & Technology

    1983-02-01

    Fault Tree Analysis", System Saftey Symposium, June 8-9, 1965, Seattle: The Boeing Company . 12. Fussell, J.B., "Fault Tree Analysis-Concepts and...procedure for assessing EMC in microelectronics and for applying DD, 1473 EOiTO OP I, NOV6 IS OESOL.ETE UNCLASSIFIED SECURITY CLASSIFICATION OF THIS...CRITERIA 2.1 Background 2 2.2 The Probabilistic Nature of EMC 2 2.3 The Probabilistic Approach 5 2.4 The Compatibility Factor 6 3 APPLYING PROBABILISTIC

  9. Probabilistic framework for the estimation of the adult and child toxicokinetic intraspecies uncertainty factors.

    PubMed

    Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S

    2003-12-01

    Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population, which is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and child intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
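
    The paper's central metric, the ratio of the 95th to the 50th percentile of target tissue dose, can be illustrated with a one-compartment stand-in for the full toxicokinetic model; the clearance distribution below is an invented placeholder:

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000
        dose_rate = 1.0                                 # applied dose, mg/kg-d (fixed)
        clearance = rng.lognormal(np.log(0.5), 0.4, n)  # population PDF, L/h/kg (assumed)

        # Steady-state average concentration for a linear one-compartment model
        c_tissue = dose_rate / (24.0 * clearance)       # mg/L
        uf_tk = np.percentile(c_tissue, 95) / np.percentile(c_tissue, 50)
        print(f"UF_H-TK ~ {uf_tk:.2f}  (the default toxicokinetic half of 10 is 3.2)")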

  10. Percentiles of the product of uncertainty factors for establishing probabilistic reference doses.

    PubMed

    Gaylor, D W; Kodell, R L

    2000-04-01

    Exposure guidelines for potentially toxic substances are often based on a reference dose (RfD) that is determined by dividing a no-observed-adverse-effect-level (NOAEL), lowest-observed-adverse-effect-level (LOAEL), or benchmark dose (BD) corresponding to a low level of risk, by a product of uncertainty factors. The uncertainty factors for animal-to-human extrapolation, variable sensitivities among humans, extrapolation from measured subchronic effects to unknown results for chronic exposures, and extrapolation from a LOAEL to a NOAEL can be thought of as random variables that vary from chemical to chemical. Selected databases are examined that provide distributions across chemicals of inter- and intraspecies effects, ratios of LOAELs to NOAELs, and differences in acute and chronic effects, to illustrate the determination of percentiles for uncertainty factors. The distributions of uncertainty factors tend to be approximately lognormal. The logarithm of the product of independent uncertainty factors is approximately distributed as the sum of normally distributed variables, making it possible to estimate percentiles for the product. Hence, the size of the product of uncertainty factors can be selected to provide adequate safety for a large percentage (e.g., approximately 95%) of RfDs. For the databases used to describe the distributions of uncertainty factors, values of 10 appear to be reasonable and conservative. For the databases examined, the following simple "Rule of 3s" is suggested, which exceeds the estimated 95th percentile of the product of uncertainty factors: if only a single uncertainty factor is required, use 33; for any two uncertainty factors, use 3 x 33 ≈ 100; for any three uncertainty factors, use a combined factor of 3 x 100 = 300; and if all four uncertainty factors are needed, use a total factor of 3 x 300 = 900. If coverage near the 99th percentile is desired, apply another factor of 3. An additional factor may be needed for inadequate data, or a modifying factor for other uncertainties (e.g., different routes of exposure) not covered above.
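
    The lognormal algebra behind these percentiles is short: if each log-UF is normal, the log of the product is normal with summed means and variances. A sketch with invented medians and geometric standard deviations:

        from math import exp, log, sqrt

        # (median, geometric standard deviation) per uncertainty factor -- assumed values
        ufs = {"interspecies": (3.0, 2.0), "intraspecies": (3.0, 2.0),
               "subchronic-to-chronic": (2.0, 2.5), "LOAEL-to-NOAEL": (3.0, 2.0)}

        mu = sum(log(m) for m, _ in ufs.values())             # mean of the log-product
        sd = sqrt(sum(log(g) ** 2 for _, g in ufs.values()))  # SD of the log-product
        p95 = exp(mu + 1.645 * sd)                            # 95th percentile of the product
        print(f"95th percentile of the UF product ~ {p95:.0f}")  # cf. the Rule-of-3s total of 900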

  11. Mosquito control insecticides: a probabilistic ecological risk assessment on drift exposures of naled, dichlorvos (naled metabolite) and permethrin to adult butterflies.

    PubMed

    Hoang, T C; Rand, G M

    2015-01-01

    A comprehensive probabilistic terrestrial ecological risk assessment (ERA) was conducted to characterize the potential risk of mosquito control insecticide usage (naled, its metabolite dichlorvos, and permethrin) to adult butterflies in south Florida, by comparing the probability distributions of environmental exposure concentrations following actual mosquito control applications at labeled rates from ten field monitoring studies with the probability distributions of butterfly species response (effects) data from our laboratory acute toxicity studies. The overlap of these distributions was used as a measure of risk to butterflies. The long-term viability (survival) of adult butterflies following topical (thorax/wings) exposures was the environmental value we wanted to protect. Laboratory acute toxicity studies (24-h LD50) included topical exposures (thorax and wings) of five adult butterfly species and preparation of species sensitivity distributions (SSDs). The ERA indicated that the assessment endpoint of protection of at least 90% of the species, 90% of the time (i.e., the 10th percentile of the acute SSDs) from acute naled and permethrin exposures is most likely not being met when considering topical exposures to adults. Although the surface area available for adulticide exposure is greater for the wings, exposures to the thorax provide the highest potential for risk (i.e., the SSD 10th percentile is lowest) for adult butterflies. Dichlorvos appeared to present no risk. The results of this ERA can be applied to other areas of the world where these insecticides are used and where butterflies may be exposed. Since there are other sources (e.g., agriculture) of pesticides in the environment where butterfly exposures will occur, the ERA may underestimate the potential risks under real-world conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
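
    The overlap measure can be sketched in a few lines: fit a lognormal species sensitivity distribution to the acute LD50s, take its 10th percentile (the protection benchmark), and compute the fraction of the exposure distribution that exceeds it. All values below are invented, not the study's data:

        import numpy as np

        rng = np.random.default_rng(5)
        ld50 = np.array([0.5, 1.1, 2.3, 4.0, 7.5])  # 24-h LD50s, ug/butterfly (assumed)
        mu, sd = np.log(ld50).mean(), np.log(ld50).std(ddof=1)
        hd10 = np.exp(mu - 1.2816 * sd)             # 10th percentile of the lognormal SSD

        exposure = rng.lognormal(np.log(0.2), 1.0, 100_000)  # topical dose, ug (assumed)
        print(f"HD10 = {hd10:.2f} ug; P(exposure > HD10) = {(exposure > hd10).mean():.1%}")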

  12. Object-based attention: strength of object representation and attentional guidance.

    PubMed

    Shomstein, Sarah; Behrmann, Marlene

    2008-01-01

    Two or more features belonging to a single object are identified more quickly and more accurately than are features belonging to different objects--a finding attributed to sensory enhancement of all features belonging to an attended or selected object. However, several recent studies have suggested that this "single-object advantage" may be a product of probabilistic and configural strategic prioritizations rather than of object-based perceptual enhancement per se, challenging the underlying mechanism that is thought to give rise to object-based attention. In the present article, we further explore constraints on the mechanisms of object-based selection by examining the contribution of the strength of object representations to the single-object advantage. We manipulated factors such as exposure duration (i.e., preview time) and salience of configuration (i.e., objects). Varying preview time changes the magnitude of the object-based effect, so that if there is ample time to establish an object representation (i.e., preview time of 1,000 msec), then both probability and configuration (i.e., objects) guide attentional selection. If, however, insufficient time is provided to establish a robust object-based representation, then only probabilities guide attentional selection. Interestingly, at a short preview time of 200 msec, when the two objects were sufficiently different from each other (i.e., different colors), both configuration and probability guided attention selection. These results suggest that object-based effects can be explained both in terms of strength of object representations (established at longer exposure durations and by pictorial cues) and probabilistic contingencies in the visual environment.

  13. Health risk assessment of ochratoxin A for all age-sex strata in a market economy

    PubMed Central

    Kuiper-Goodman, T.; Hilts, C.; Billiard, S.M.; Kiparissis, Y.; Richard, I.D.K.; Hayward, S.

    2009-01-01

    In order to manage risk of ochratoxin A (OTA) in foods, we re-evaluated the tolerable daily intake (TDI), derived the negligible cancer risk intake (NCRI), and conducted a probabilistic risk assessment. A new approach was developed to derive ‘usual’ probabilistic exposure in the presence of highly variable occurrence data, such as encountered with low levels of OTA. Canadian occurrence data were used for various raw food commodities or finished foods and were combined with US Department of Agriculture (USDA) food consumption data, which included data on infants and young children. Both variability and uncertainty in input data were considered in the resulting exposure estimates for various age/sex strata. Most people were exposed to OTA on a daily basis. Mean adjusted exposures for all age-sex groups were generally below the NCRI of 4 ng OTA kg bw(-1), except for 1–4-year-olds as a result of their lower body weight. For children, the major contributors of OTA were wheat-based foods followed by oats, rice, and raisins. Beer, coffee, and wine also contributed to total OTA exposure in older individuals. Predicted exposure to OTA decreased when European Commission maximum limits were applied to the occurrence data. The impact on risk for regular eaters of specific commodities was also examined. PMID:20013446

  14. Health impact assessment of a skin sensitizer: Analysis of potential policy measures aimed at reducing geraniol concentrations in personal care products and household cleaning products.

    PubMed

    Jongeneel, W P; Delmaar, J E; Bokkers, B G H

    2018-06-08

    A methodology to assess the health impact of skin sensitizers is introduced, which consists of comparing the probabilistic aggregated exposure with a probabilistic (individual) human sensitization or elicitation induction dose. The health impact of potential policy measures aimed at reducing the concentration of a fragrance allergen, geraniol, in consumer products is analysed in a simulated population derived from multiple product use surveys. Our analysis shows that current dermal exposure to geraniol from personal care and household cleaning products leads to new cases of contact allergy and induces clinical symptoms in those already sensitized. We estimate that this exposure results yearly in 34 new cases of geraniol contact allergy per million consumers in Western and Northern Europe, mainly due to exposure to household cleaning products. About twice as many consumers (60 per million) are projected to suffer from clinical symptoms due to re-exposure to geraniol. Policy measures restricting geraniol concentrations to <0.01% would noticeably reduce new cases of sensitization and decrease both the number of people with clinical symptoms and the frequency of occurrence of these clinical symptoms. The estimated numbers should be interpreted with caution and provide only a rough indication of the health impact. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  16. PROBABILISTIC AQUATIC EXPOSURE ASSESSMENT FOR PESTICIDES 1: FOUNDATIONS

    EPA Science Inventory

    Models that capture underlying mechanisms and processes are necessary for reliable extrapolation of laboratory chemical data to field conditions. For validation, these models require a major revision of the conventional model testing paradigm to better recognize the conflict betw...

  17. Metal uptake by homegrown vegetables - the relative importance in human health risk assessments at contaminated sites.

    PubMed

    Augustsson, Anna L M; Uddh-Söderberg, Terese E; Hogmalm, K Johan; Filipsson, Monika E M

    2015-04-01

    Risk assessments of contaminated land often involve the use of generic bioconcentration factors (BCFs), which express contaminant concentrations in edible plant parts as a function of the concentration in soil, in order to assess the risks associated with consumption of homegrown vegetables. This study aimed to quantify variability in BCFs and evaluate the implications of this variability for human exposure assessments, focusing on cadmium (Cd) and lead (Pb) in lettuce and potatoes sampled around 22 contaminated glassworks sites. In addition, risks associated with measured Cd and Pb concentrations in soil and vegetable samples were characterized and a probabilistic exposure assessment was conducted to estimate the likelihood of local residents exceeding tolerable daily intakes. The results show that concentrations in vegetables were only moderately elevated despite high concentrations in soil, and most samples complied with applicable foodstuff legislation. Still, the daily intake of Cd (but not Pb) was assessed to exceed toxicological thresholds for about a fifth of the study population. Bioconcentration factors were found to vary more than indicated by previous studies, but decreasing BCFs with increasing metal concentrations in the soil can explain why the calculated exposure is only moderately affected by the choice of BCF value when generic soil guideline values are exceeded and the risk may be unacceptable. Copyright © 2015 Elsevier Inc. All rights reserved.
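
    The probabilistic exposure step reduces to sampling a soil concentration, a BCF, a consumption rate and a body weight, and counting exceedances of a tolerable daily intake. A sketch with invented distributions; the TDI used is likewise an assumption for illustration:

        import numpy as np

        rng = np.random.default_rng(9)
        n = 100_000
        c_soil = rng.lognormal(np.log(5.0), 0.7, n)   # Cd in soil, mg/kg (assumed)
        bcf    = rng.lognormal(np.log(0.05), 0.9, n)  # fresh-weight BCF, unitless (assumed)
        intake = rng.lognormal(np.log(0.1), 0.5, n)   # homegrown vegetable intake, kg/d
        bw     = rng.normal(75.0, 12.0, n).clip(40)   # body weight, kg

        dose = c_soil * bcf * intake / bw * 1000.0    # ug Cd/kg bw/d
        tdi  = 0.36                                   # ug/kg bw/d (assumed, ~weekly TWI / 7)
        print(f"P(dose > TDI) = {(dose > tdi).mean():.2f}")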

  18. Regularizing Unpredictable Variation: Evidence from a Natural Language Setting

    ERIC Educational Resources Information Center

    Hendricks, Alison Eisel; Miller, Karen; Jackson, Carrie N.

    2018-01-01

    While previous sociolinguistic research has demonstrated that children faithfully acquire probabilistic input constrained by sociolinguistic and linguistic factors (e.g., gender and socioeconomic status), research suggests children regularize inconsistent input-probabilistic input that is not sociolinguistically constrained (e.g., Hudson Kam &…

  19. A Probabilistic Risk Assessment for Deployed Military Personnel After the Implementation of the Leishmaniasis Control Program at Tallil Air Base, Iraq

    DTIC Science & Technology

    2009-01-01

    used ADE = FE x (SA_female / SA_male) [equation 4], where ADE is the adjusted dermal exposure (mg/lb [AI]), FE is the flagger exposure, SA_female is the surface area of an adult woman as estimated by equation 3, and SA_male is the surface area of an adult man as estimated by equation 3. We assumed a triangular

  20. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
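
    Alpha-cut propagation, the core of the fuzzy approach, amounts to interval arithmetic at each membership level: the closer alpha is to 1, the narrower the input intervals and hence the output interval. A toy sketch with invented triangular fuzzy numbers:

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
            a, m, b = tri
            return a + alpha * (m - a), b - alpha * (b - m)

        population = (8_000, 10_000, 13_000)  # exposed population (fuzzy, assumed)
        risk_pp    = (1e-4, 3e-4, 6e-4)       # morbidity cases per person-year (fuzzy, assumed)

        for alpha in (0.0, 0.5, 1.0):
            p_lo, p_hi = alpha_cut(population, alpha)
            r_lo, r_hi = alpha_cut(risk_pp, alpha)
            # both quantities are positive, so the product interval is [lo*lo, hi*hi]
            print(f"alpha={alpha}: impact in [{p_lo * r_lo:.1f}, {p_hi * r_hi:.1f}] cases/yr")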

  1. A probabilistic risk assessment for deployed military personnel after the implementation of the "Leishmaniasis Control Program" at Tallil Air Base, Iraq.

    PubMed

    Schleier, Jerome J; Davis, Ryan S; Barber, Loren M; Macedo, Paula A; Peterson, Robert K D

    2009-05-01

    Leishmaniasis has been of concern to the U.S. military and has re-emerged in importance because of recent deployments to the Middle East. We conducted a retrospective probabilistic risk assessment for military personnel potentially exposed to insecticides during the "Leishmaniasis Control Plan" (LCP) undertaken in 2003 at Tallil Air Base, Iraq. We estimated acute and subchronic risks from resmethrin, malathion, piperonyl butoxide (PBO), and pyrethrins applied using a truck-mounted ultra-low-volume (ULV) sprayer and lambda-cyhalothrin, cyfluthrin, bifenthrin, chlorpyrifos, and cypermethrin used for residual sprays. We used the risk quotient (RQ) method for our risk assessment (estimated environmental exposure/toxic endpoint) and set the RQ level of concern (LOC) at 1.0. Acute RQs for truck-mounted ULV and residual sprays ranged from 0.00007 to 33.3 at the 95th percentile. Acute exposure to lambda-cyhalothrin, bifenthrin, and chlorpyrifos exceeded the RQ LOC. Subchronic RQs for truck-mounted ULV and residual sprays ranged from 0.00008 to 32.8 at the 95th percentile. Subchronic exposures to lambda-cyhalothrin and chlorpyrifos exceeded the LOC. However, estimated exposures to lambda-cyhalothrin, bifenthrin, and chlorpyrifos did not exceed their respective no observed adverse effect levels.
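
    The RQ method itself is a one-line calculation once an exposure distribution is in hand; a minimal sketch with invented exposure and endpoint values:

        import numpy as np

        rng = np.random.default_rng(13)
        exposure = rng.lognormal(np.log(0.05), 0.9, 100_000)  # dose, mg/kg bw/d (assumed)
        endpoint = 0.1                                        # toxic endpoint, mg/kg bw/d (assumed)

        rq95 = np.percentile(exposure / endpoint, 95)
        print(f"RQ at the 95th percentile = {rq95:.2f}; exceeds LOC of 1.0: {rq95 > 1.0}")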

  2. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach

    USGS Publications Warehouse

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A.; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W.; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences among laboratory and field studies. The organisms evaluated in the simulations (mayfly larvae, deposit-feeding polychaetes, yellow perch, and little owl) represented a range of ecological conditions and biotransformation capacity. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption efficiency limitations, due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation assessment is reduced most by improved identification of food sources as well as by accounting for chemical bioavailability in food components. Improvements in the accuracy of aqueous exposure appear to be less relevant for moderately to highly hydrophobic compounds, because this route contributes only marginally to total uptake. Determining chemical bioavailability, and improving the understanding and quantification of the role of sediment components (black carbon, labile organic matter, and the like) in chemical absorption efficiencies, have been identified as key next steps.

  3. Probabilistic modeling of bifurcations in single-cell gene expression data using a Bayesian mixture of factor analyzers.

    PubMed

    Campbell, Kieran R; Yau, Christopher

    2017-03-15

    Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic, non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of the genes driving the bifurcation process. We additionally propose an Empirical Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.

  4. Cancer Risk Assessment of Polycyclic Aromatic Hydrocarbons in the Soils and Sediments of India: A Meta-Analysis.

    PubMed

    Tarafdar, Abhrajyoti; Sinha, Alok

    2017-10-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons in soils and sediments was conducted using the probabilistic approach from a national perspective. Published monitoring data of polycyclic aromatic hydrocarbons present in soils and sediments at different study points across India were collected and converted to their corresponding BaP equivalent concentrations. These BaP equivalent concentrations were used to evaluate comprehensive cancer risk for two different age groups. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of risk estimation. The analysis denotes 90% cancer risk value of 1.770E-5 for children and 3.156E-5 for adults at heavily polluted site soils. Overall carcinogenic risks of polycyclic aromatic hydrocarbons in soils of India were mostly in acceptance limits. However, the food ingestion exposure route for sediments leads them to a highly risked zone. The 90% risk values from sediments are 7.863E-05 for children and 3.999E-04 for adults. Sensitivity analysis reveals exposure duration and relative skin adherence factor for soil as the most influential parameter of the assessment, followed by BaP equivalent concentration of polycyclic aromatic hydrocarbons. For sediments, biota to sediment accumulation factor of fish in terms of BaP is most sensitive on the total outcome, followed by BaP equivalent and exposure duration. Individual exposure route analysis showed dermal contact for soils and food ingestion for sediments as the main exposure pathway. Some specific locations such as surrounding areas of Bhavnagar, Raniganj, Sunderban, Raipur, and Delhi demand potential strategies of carcinogenic risk management and reduction. The current study is probably the first attempt to provide information on the carcinogenic risk of polycyclic aromatic hydrocarbons in soil and sediments across India.

  6. Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry

    2014-01-01

    Command File Errors (CFEs) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, composed of the hardware, software and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root causes and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.

  7. Uncertainty in exposure to air pollution

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer; Helle, Kristina; Stasch, Christoph; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce

    2013-04-01

    To assess exposure to air pollution for a person or a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at those times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the uncertainties in these, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution and the errors in it. We used the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population of persons. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web. Some of the tools and protocols are shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations. This difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow for inputs other than the simulated activity schedules and air quality at other resolutions. However, due to this flexibility, the Web services require standardized formats, and the overlay algorithm is not optimized for the specific use case, resulting in data and processing overhead. Hence, we implemented the full analysis in parallel in R for this specific case, as the model web solution had difficulties with massive data.

  8. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are simulated using probabilistic theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.

  9. CES_EHP_Figure_2

    EPA Pesticide Factsheets

    The increasing number of chemicals for which SHEDS probabilistic exposure assessment has been performed over the years. This dataset is associated with the following publication: Egeghy, P., L. Sheldon, K. Isaacs, H. Ozkaynak, M. Goldsmith, J. Wambaugh, R. Judson, and T. Buckley. Computational Exposure Science: An Emerging Discipline to Support 21st-Century Risk Assessment. ENVIRONMENTAL HEALTH PERSPECTIVES. National Institute of Environmental Health Sciences (NIEHS), Research Triangle Park, NC, USA, 124(6): 697–702, (2016).

  10. PROBABILISTIC RISK ASSESSMENT FOR THE EFFECTS OF SOLAR ULTRAVIOLET RADIATION ON AMPHIBIANS

    EPA Science Inventory

    Several studies have demonstrated that exposure to solar ultraviolet (UV) radiation can cause elevated mortality and an increased prevalence of eye and limb malformations in developing amphibian larvae. From these observations scientists have hypothesized that recent increases in...

  11. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.

    2006-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.

  13. Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.; Ponomarev, A.; Ren, L.; Shavers, M. R.; Wu, H.

    2005-01-01

    Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.

  14. Meta-Analysis of Lead (Pb) in Multiple Environmental Media in the United States

    EPA Science Inventory

    Introduction: The U.S. Environmental Protection Agency, Office of Research and Development, conducts probabilistic multimedia lead (Pb) exposure modeling to inform the development of health-based benchmarks for Pb in the environment. For this modeling, robust Pb concentration dat...

  15. Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds

    EPA Science Inventory

    When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...

  16. Assessing doses to terrestrial wildlife at a radioactive waste disposal site: inter-comparison of modelling approaches.

    PubMed

    Johansen, M P; Barnett, C L; Beresford, N A; Brown, J E; Černe, M; Howard, B J; Kamboj, S; Keum, D-K; Smodiš, B; Twining, J R; Vandenhove, H; Vives i Batlle, J; Wood, M D; Yu, C

    2012-06-15

    Radiological doses to terrestrial wildlife were examined in this model inter-comparison study that emphasised factors causing variability in dose estimation. The study participants used varying modelling approaches and information sources to estimate dose rates and tissue concentrations for a range of biota types exposed to soil contamination at a shallow radionuclide waste burial site in Australia. Results indicated that the dominant factor causing variation in dose rate estimates (up to three orders of magnitude on mean total dose rates) was the soil-to-organism transfer of radionuclides that included variation in transfer parameter values as well as transfer calculation methods. Additional variation was associated with other modelling factors including: how participants conceptualised and modelled the exposure configurations (two orders of magnitude); which progeny to include with the parent radionuclide (typically less than one order of magnitude); and dose calculation parameters, including radiation weighting factors and dose conversion coefficients (typically less than one order of magnitude). Probabilistic approaches to model parameterisation were used to encompass and describe variable model parameters and outcomes. The study confirms the need for continued evaluation of the underlying mechanisms governing soil-to-organism transfer of radionuclides to improve estimation of dose rates to terrestrial wildlife. The exposure pathways and configurations available in most current codes are limited when considering instances where organisms access subsurface contamination through rooting, burrowing, or using different localised waste areas as part of their habitual routines. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  17. Environmental prediction, risk assessment and extreme events: adaptation strategies for the developing world

    PubMed Central

    Webster, Peter J.; Jian, Jun

    2011-01-01

    The uncertainty associated with predicting extreme weather events has serious implications for the developing world, owing to the greater societal vulnerability to such events. Continual exposure to unanticipated extreme events is a contributing factor for the descent into perpetual and structural rural poverty. We provide two examples of how probabilistic environmental prediction of extreme weather events can support dynamic adaptation. In the current climate era, we describe how short-term flood forecasts have been developed and implemented in Bangladesh. Forecasts of impending floods with horizons of 10 days are used to change agricultural practices and planning, store food and household items and evacuate those in peril. For the first time in Bangladesh, floods were anticipated in 2007 and 2008, with broad actions taking place in advance of the floods, grossing agricultural and household savings measured in units of annual income. We argue that probabilistic environmental forecasts disseminated to an informed user community can reduce poverty caused by exposure to unanticipated extreme events. Second, it is also realized that not all decisions in the future can be made at the village level and that grand plans for water resource management require extensive planning and funding. Based on imperfect models and scenarios of economic and population growth, we further suggest that flood frequency and intensity will increase in the Ganges, Brahmaputra and Yangtze catchments as greenhouse-gas concentrations increase. However, irrespective of the climate-change scenario chosen, the availability of fresh water in the latter half of the twenty-first century seems to be dominated by population increases that far outweigh climate-change effects. Paradoxically, fresh water availability may become more critical if there is no climate change. PMID:22042897

  18. Consumption of fruits and vegetables and probabilistic assessment of the cumulative acute exposure to organophosphorus and carbamate pesticides of schoolchildren in Slovenia.

    PubMed

    Blaznik, Urška; Yngve, Agneta; Eržen, Ivan; Hlastan Ribič, Cirila

    2016-02-01

    Adequate consumption of fruits and vegetables is a part of recommendations for a healthy diet. The aim of the present study was to assess acute cumulative dietary exposure to organophosphorus and carbamate pesticides via fruit and vegetable consumption by a population of schoolchildren aged 11-12 years, and the level of risk for their health. A cumulative probabilistic risk assessment methodology with the index compound approach was applied. The setting was primary schools in Slovenia; participants were schoolchildren (n 1145) from thirty-one primary schools. The children were part of the PRO GREENS study 2009/10, which assessed 11-year-olds' consumption of fruit and vegetables in ten European countries. The cumulative acute exposure amounted to 8.3 (95% CI 7.7, 10.6) % of the acute reference dose (ARfD) for acephate as the index compound (100 µg/kg body weight per d) at the 99.9th percentile for daily intake, and to 4.5 (95% CI 3.5, 4.7) % of the ARfD at the 99.9th percentile for intakes during school time and at lunch. Apples, bananas, oranges and lettuce contributed most to the total acute pesticide intake. The estimates showed that acute dietary exposure to organophosphorus and carbamate pesticides is not a health concern for schoolchildren with the assessed dietary patterns of fruit and vegetable consumption.
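
    The cumulative calculation described here can be sketched as a Monte Carlo simulation in which each residue is converted into acephate equivalents with a relative potency factor and the summed exposure is compared with the ARfD. The sketch below is illustrative only: the potency factors, residue distributions, and intake distribution are hypothetical placeholders, not values from the study.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical relative potency factors (RPFs) expressing each residue
        # in acephate equivalents; acephate is the index compound (RPF = 1.0).
        rpf = {"acephate": 1.0, "chlorpyrifos": 12.0, "carbaryl": 0.2}

        # Hypothetical residue levels (mg/kg food) and daily fruit/vegetable
        # intake (g food per kg body weight per day).
        n = 100_000
        residue = {c: rng.lognormal(mean=-4.0, sigma=1.0, size=n) for c in rpf}
        intake = rng.lognormal(mean=1.0, sigma=0.5, size=n)

        # mg/kg food equals ug/g food, so (ug/g) * (g/kg bw) = ug/kg bw per day.
        total_equiv = sum(rpf[c] * residue[c] for c in rpf)   # acephate equivalents
        exposure = total_equiv * intake                       # ug/kg bw per day

        arfd = 100.0   # ARfD for acephate, ug/kg bw per day
        p999 = np.percentile(exposure, 99.9)
        print(f"99.9th percentile: {p999:.1f} ug/kg bw/d = {100 * p999 / arfd:.1f}% of ARfD")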

  19. Subsea release of oil from a riser: an ecological risk assessment.

    PubMed

    Nazir, Muddassir; Khan, Faisal; Amyotte, Paul; Sadiq, Rehan

    2008-10-01

    This study illustrates a newly developed methodology, as a part of the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater release of oil and gas. It combines the hydrodynamics of underwater blowout, weathering algorithms, and multimedia fate and transport to measure the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties are accounted for in multimedia input parameters in the analysis. The 95th percentile of the exposure concentration (EC(95%)) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is utilized to characterize EC(95%) and associated uncertainty. The toxicity data of 19 species available in the literature are used to calculate the 5th percentile of the predicted no observed effect concentration (PNEC(5%)) by employing the bootstrapping method. The risk is characterized by transforming the risk quotient (RQ), which is the ratio of EC(95%) to PNEC(5%), into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision-making viewpoints. Two case studies of underwater oil and gas mixture release, and oil release with no gaseous mixture are used to show the systematic implementation of the methodology, elements of ERA, and the probabilistic method in assessing and characterizing the risk.
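
    A minimal sketch of the bootstrapping step: resample the exposure and toxicity data, take the 95th percentile of exposure (EC95%) and the 5th percentile of the species sensitivity data (PNEC5%) in each resample, and form the risk quotient distribution. The data below are synthetic stand-ins, not the study's modelled naphthalene concentrations or its 19 species toxicity values.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical data: simulated exposure concentrations (ug/L) from a
        # fate-and-transport model, and chronic toxicity values for 19 species.
        exposure = rng.lognormal(mean=0.0, sigma=0.8, size=500)
        toxicity = rng.lognormal(mean=4.0, sigma=1.0, size=19)

        def boot_percentile(data, q, n_boot=2000):
            """Bootstrap the q-th percentile of `data`; return the resampled values."""
            idx = rng.integers(0, len(data), size=(n_boot, len(data)))
            return np.percentile(data[idx], q, axis=1)

        ec95 = boot_percentile(exposure, 95)   # 95th %ile exposure concentration
        pnec5 = boot_percentile(toxicity, 5)   # 5th %ile of species sensitivity

        rq = ec95 / pnec5                      # risk quotient distribution
        print(f"EC95%: {ec95.mean():.1f} ug/L "
              f"(90% CI {np.percentile(ec95, 5):.1f}-{np.percentile(ec95, 95):.1f})")
        print(f"P(RQ > 1) = {(rq > 1).mean():.3f}")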

  20. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 5: Auxiliary shuttle risk analyses

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.

  1. Probabilistic Assessment of Radiation Risk for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2008-01-01

    For long-duration missions outside of the protection of the Earth's magnetic field, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon and Earth-to-Mars transit. The large majority (90%) of SPEs have small or no health consequences because the doses are low and the particles do not penetrate to organ depths. However, there is an operational challenge in responding to events of unknown size and duration. We have developed a probabilistic approach to SPE risk assessment in support of mission design and operational planning. Using the historical database of proton measurements during the past 5 solar cycles, the functional form of the hazard function for SPE occurrence per cycle was found for a nonhomogeneous Poisson model. A typical hazard function was defined as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions of particle fluences for a specified mission period were simulated, ranging from the 5th to the 95th percentile. Organ doses from large SPEs were assessed using NASA's baryon transport model, BRYNTRN. The SPE risk was analyzed with the organ dose distribution for the given particle fluences during a mission period. In addition to the total particle fluences of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the cancer risk associated with energetic particles for large events. The probability of exceeding the NASA 30-day limit on blood forming organ (BFO) dose inside a typical spacecraft was calculated for various SPE sizes. This probabilistic approach to SPE protection will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks in future work.
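
    A nonhomogeneous Poisson process with a time-varying hazard, as described above, can be simulated by thinning a homogeneous process whose rate bounds the hazard. The sketch below assumes a made-up Gaussian-bump hazard over a 4000-day cycle; the functional form and rate values are illustrative, not the fitted hazard from the paper.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical hazard function for SPE occurrence within a 4000-day
        # solar cycle: the rate peaks near solar maximum (a scaled Gaussian bump).
        def hazard(t, peak_rate=0.05, t_max=2000.0, width=700.0):
            """Expected SPEs per day at cycle day t."""
            return peak_rate * np.exp(-0.5 * ((t - t_max) / width) ** 2)

        def simulate_spe_times(t_end=4000.0, lam_max=0.05):
            """Simulate SPE occurrence times by thinning a homogeneous Poisson
            process with rate lam_max >= hazard(t) for all t."""
            times, t = [], 0.0
            while True:
                t += rng.exponential(1.0 / lam_max)
                if t > t_end:
                    return np.array(times)
                if rng.random() < hazard(t) / lam_max:   # accept with prob h(t)/lam_max
                    times.append(t)

        counts = [len(simulate_spe_times()) for _ in range(1000)]
        print(f"mean SPEs per simulated cycle: {np.mean(counts):.1f}")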

  2. Reliability, Risk and Cost Trade-Offs for Composite Designs

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1996-01-01

    Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (design parameter) in the longitudinal direction.

  3. Spatial variability versus parameter uncertainty in freshwater fate and exposure factors of chemicals.

    PubMed

    Nijhof, Carl O P; Huijbregts, Mark A J; Golsteijn, Laura; van Zelm, Rosalie

    2016-04-01

    We compared the influence of spatial variability in environmental characteristics and of uncertainty in measured substance properties of seven chemicals on freshwater fate factors (FFs), representing the residence time in the freshwater environment, and on exposure factors (XFs), representing the dissolved fraction of a chemical. The influence of spatial variability was quantified using the SimpleBox model, in which Europe was divided into 100 × 100 km regions, nested in a regional (300 × 300 km) and a supra-regional (500 × 500 km) scale. Uncertainty in substance properties was quantified by means of probabilistic modelling. Spatial variability and parameter uncertainty were expressed by the ratio k of the 95%ile and the 5%ile of the FF and XF. Our analysis shows that the spatial variability range in FFs of persistent chemicals that partition predominantly into one environmental compartment was up to 2 orders of magnitude larger than the uncertainty. For the other (less persistent) chemicals, uncertainty in the FF was up to 1 order of magnitude larger than spatial variability. Variability and uncertainty in freshwater XFs of the seven chemicals were negligible (k < 1.5). We found that, depending on the chemical and emission scenario, accounting for region-specific environmental characteristics in multimedia fate modelling, as well as accounting for parameter uncertainty, can have a significant influence on freshwater fate factor predictions. Therefore, we conclude that fate factors should account not only for parameter uncertainty but also for spatial variability, as this further increases the reliability of ecotoxicological impact estimates in LCA. Copyright © 2016 Elsevier Ltd. All rights reserved.
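
    The paper's spread metric is the ratio k of the 95th to the 5th percentile of a fate or exposure factor. A minimal sketch, assuming a toy first-order fate factor FF = 1/(k_deg + k_adv) with lognormal parameter uncertainty in the degradation rate; all values are hypothetical, not the paper's substance data.

        import numpy as np

        rng = np.random.default_rng(3)

        def k_ratio(samples):
            """Spread metric used in the paper: 95th over 5th percentile."""
            return np.percentile(samples, 95) / np.percentile(samples, 5)

        # Toy fate factor: residence time set by degradation and advective
        # removal rate constants (1/d); only k_deg carries uncertainty here.
        k_deg = rng.lognormal(mean=np.log(0.01), sigma=0.6, size=100_000)
        k_adv = 0.005                       # fixed flushing rate, 1/d
        ff = 1.0 / (k_deg + k_adv)          # fate factor, d

        print(f"k (parameter uncertainty in FF) = {k_ratio(ff):.1f}")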

  4. Cloud immersion building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-12-01

    This paper presents validated building shielding factors designed for contemporary US housing stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing units.

  5. Elasto-limited plastic analysis of structures for probabilistic conditions

    NASA Astrophysics Data System (ADS)

    Movahedi Rad, M.

    2018-06-01

    By applying plastic analysis and design methods, significant savings in material can be obtained. However, alongside this benefit, excessive plastic deformations and large residual displacements might develop, which in turn might lead to unserviceability and collapse of the structure. In this study, for the deterministic problem, the residual deformation of structures is limited by considering a constraint on the complementary strain energy of the residual forces. For the probabilistic problem, the constraint on the complementary strain energy of the residual forces is given randomly and the critical stresses are updated during the iterations. Limit curves are presented for the plastic limit load factors. The results show that these constraints have significant effects on the load factors. The formulations of the deterministic and probabilistic problems lead to mathematical programming problems, which are solved by use of a nonlinear algorithm.

  6. DISCOUNTING OF DELAYED AND PROBABILISTIC LOSSES OVER A WIDE RANGE OF AMOUNTS

    PubMed Central

    Green, Leonard; Myerson, Joel; Oliveira, Luís; Chang, Seo Eun

    2014-01-01

    The present study examined delay and probability discounting of hypothetical monetary losses over a wide range of amounts (from $20 to $500,000) in order to determine how amount affects the parameters of the hyperboloid discounting function. In separate conditions, college students chose between immediate payments and larger, delayed payments and between certain payments and larger, probabilistic payments. The hyperboloid function accurately described both types of discounting, and amount of loss had little or no systematic effect on the degree of discounting. Importantly, the amount of loss also had little systematic effect on either the rate parameter or the exponent of the delay and probability discounting functions. The finding that the parameters of the hyperboloid function remain relatively constant across a wide range of amounts of delayed and probabilistic loss stands in contrast to the robust amount effects observed with delayed and probabilistic rewards. At the individual level, the degree to which delayed losses were discounted was uncorrelated with the degree to which probabilistic losses were discounted, and delay and probability loaded on two separate factors, similar to what is observed with delayed and probabilistic rewards. Taken together, these findings argue that although delay and probability discounting involve fundamentally different decision-making mechanisms, nevertheless the discounting of delayed and probabilistic losses share an insensitivity to amount that distinguishes it from the discounting of delayed and probabilistic gains. PMID:24745086
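
    The hyperboloid discounting function referred to above is V = A/(1 + kX)^s, where V is the subjective value of an outcome of amount A and X is the delay (or the odds against receipt, for probabilistic outcomes). A minimal fitting sketch with hypothetical indifference-point data; the amounts, delays, and values below are invented for illustration, not the study's data.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hyperboloid discounting function: V = A / (1 + k*X)**s.
        def hyperboloid(x, k, s, amount=1000.0):
            return amount / (1.0 + k * x) ** s

        # Hypothetical median indifference points for a $1,000 delayed loss.
        delays = np.array([7, 30, 90, 180, 365, 1825], dtype=float)   # days
        values = np.array([950, 880, 780, 700, 590, 380], dtype=float)

        # With p0 of length 2, curve_fit fits only k and s; amount stays fixed.
        (k, s), _ = curve_fit(hyperboloid, delays, values, p0=[0.01, 1.0])
        print(f"fitted rate k = {k:.4f}, exponent s = {s:.2f}")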

  7. Monitoring and exposure assessment of pesticide residues in cowpea (Vigna unguiculata L. Walp) from five provinces of southern China.

    PubMed

    Huan, Zhibo; Xu, Zhi; Luo, Jinhui; Xie, Defang

    2016-11-01

    Residues of 14 pesticides were determined in 150 cowpea samples collected in five southern Chinese provinces in 2013 and 2014. One or more residues were detected in 70% of the samples; 61.3% of the samples were illegal, mainly because of the detection of unauthorized pesticides; and 14.0% of the samples contained more than three pesticides. Deterministic and probabilistic methods were used to assess the chronic and acute risk of pesticides in cowpea to eight subgroups of people. The deterministic assessment showed that the estimated short-term intakes (ESTIs) of carbofuran were 1199.4%-2621.9% of the acute reference dose (ARfD), while the rates were 985.9%-4114.7% using the probabilistic assessment. The probabilistic assessment showed that 4.2%-7.8% of subjects (especially children) may face unacceptable acute risk from carbofuran-contaminated cowpeas from the five provinces. However, undue concern is unwarranted, because all the estimates are based on conservative assumptions. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
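
    The fault-tree combination step can be illustrated with two gates for independent events: an OR gate (any of several protective layers fails) and an AND gate (a release occurs and protection fails). All component probabilities below are hypothetical placeholders, not values from the paper.

        import numpy as np

        # Minimal fault-tree combination rules for independent events.
        def or_gate(*p):
            """P(at least one event occurs) for independent events."""
            return 1.0 - np.prod(1.0 - np.asarray(p))

        def and_gate(*p):
            """P(all events occur) for independent events."""
            return float(np.prod(p))

        # Hypothetical component failure probabilities (per demand).
        p_filter_fail = 0.02    # HEPA filter degraded
        p_damper_fail = 0.01    # isolation damper fails to close
        p_sensor_fail = 0.05    # contaminant sensor misses the release
        p_source = 0.10         # a release actually occurs

        # Contamination requires a release AND a protection failure; the
        # protection fails if ANY of the three layers fails (layers in series).
        p_protection_fails = or_gate(p_filter_fail, p_damper_fail, p_sensor_fail)
        p_contamination = and_gate(p_source, p_protection_fails)
        print(f"P(building contamination) = {p_contamination:.4f}")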

  9. Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion

    NASA Astrophysics Data System (ADS)

    Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.

    2017-03-01

    Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
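
    The marginal-likelihood comparison described above can be sketched for the simplest case: per-epoch flux estimates with Gaussian errors, a constant-flux object hypothesis with a flat flux prior, and a noise-only alternative. The prior range, error level, and epoch count below are assumptions for illustration; the paper's full model also matches estimated source directions, which this sketch omits.

        import numpy as np

        rng = np.random.default_rng(11)

        def log_bayes_factor(fluxes, sigma, f_max=100.0, n_grid=4096):
            """log marginal-likelihood ratio for 'one constant-flux object'
            vs 'noise only', with Gaussian flux errors and a flat prior on
            the object flux over [0, f_max]."""
            F = np.linspace(0.0, f_max, n_grid)
            # log likelihood of all epochs at each trial flux F
            ll = -0.5 * np.sum((fluxes[:, None] - F[None, :]) ** 2, axis=0) / sigma**2
            ll0 = -0.5 * np.sum(fluxes**2) / sigma**2       # noise-only hypothesis
            # marginalize F over the flat prior (rectangle rule, log-sum-exp)
            dF = F[1] - F[0]
            log_marg = ll.max() + np.log(np.exp(ll - ll.max()).sum() * dF / f_max)
            return log_marg - ll0

        # A faint source at 1.5 sigma per epoch, accumulated over 10 epochs.
        sigma, true_flux = 1.0, 1.5
        fluxes = true_flux + rng.normal(0.0, sigma, size=10)
        print(f"log Bayes factor (object vs noise): {log_bayes_factor(fluxes, sigma):.1f}")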

  10. Exposures to diesel exhaust in the International Brotherhood of Teamsters, 1950-1990.

    PubMed

    Bailey, Chad R; Somers, Joseph H; Steenland, Kyle

    2003-01-01

    A prior case-control study found a positive, monotonic exposure-response relationship between exposure to diesel exhaust and lung cancer among decedents of the Central States Conference of the International Brotherhood of Teamsters. In response to critiques of the Teamsters' exposure estimates by the Health Effects Institute's Diesel Epidemiology Panel, historical exposures and associated uncertainties are investigated here. Historic diesel exhaust exposures are predicted as a function of heavy-duty diesel truck emissions, increasing use of diesel engines, and occupational elemental carbon (EC) measurements taken during the late 1980s and early 1990s. EC from diesel and nondiesel sources is distinguished in light of recent studies indicating a substantial contribution of gasoline vehicles to ambient EC. Monte Carlo sampling is used to characterize exposure distributions. The methodology used in this article, a probabilistic model for historical exposure assessment, is novel.

  11. Probabilistic Modeling of Childhood Multimedia Lead Exposures: Examining the Soil Ingestion Pathway

    EPA Science Inventory

    BACKGROUND: Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)’s National Drinking Water Advis...

  12. Developing probabilistic models to predict amphibian site occupancy in a patchy landscape

    Treesearch

    R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison

    2003-01-01

    Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...

  13. Retrospective and current risks of mercury to panthers in the Florida Everglades.

    PubMed

    Barron, Mace G; Duvall, Stephanie E; Barron, Kyle J

    2004-04-01

    Florida panthers are an endangered species inhabiting south Florida. Hg has been suggested as a causative factor for low population numbers and some reported panther deaths, but a quantitative assessment of risks had never been performed. This study quantitatively evaluated retrospective (pre-1992) and current (2002) risks of chronic dietary Hg exposure to panthers in the Florida Everglades. A probabilistic assessment of Hg risks was performed using a dietary exposure model and Latin Hypercube sampling that incorporated the variability and uncertainty in ingestion rate, diet, body weight, and mercury exposure of panthers. Hazard quotients (HQs) for retrospective risks ranged from less than 0.1 to 20, with a 46% probability of exceeding chronic dietary thresholds for methylmercury. Retrospective risks of developing clinical symptoms, including ataxia and convulsions, had an HQ range of <0.1-5.4, with a 17% probability of exceeding an HQ of 1. Current risks were substantially lower (4% probability of exceedance; HQ range <0.1-3.5) because of an estimated 70-90% decline in Hg exposure to panthers over the last decade. Under worst-case conditions of panthers consuming only raccoons from the most contaminated area of the Everglades, the current risk of developing clinical symptoms that may lead to death was 4.6%. The current risk of mercury poisoning for panthers with a diversified diet was 0.1% (HQ range <0.1-1.4). The results of this assessment indicate that past Hg exposures likely adversely affected panthers in the Everglades, but current risks of Hg are low.
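
    The Latin Hypercube step can be sketched with scipy's quasi-Monte Carlo sampler: draw stratified uniforms, map them through the marginal distributions of ingestion rate, prey concentration, diet composition, and body weight, and form the hazard quotient distribution. Every distribution and threshold below is a hypothetical placeholder, not a value from the assessment.

        import numpy as np
        from scipy.stats import qmc, lognorm

        # Latin Hypercube sample of four uncertain inputs (all hypothetical).
        n = 10_000
        u = qmc.LatinHypercube(d=4, seed=0).random(n)    # shape (n, 4) in [0, 1)

        ingestion = lognorm(s=0.3, scale=1.1).ppf(u[:, 0])     # kg prey/day
        prey_hg = lognorm(s=0.9, scale=0.3).ppf(u[:, 1])       # mg Hg/kg prey
        frac_raccoon = u[:, 2] * 0.5                           # diet fraction, 0-0.5
        body_wt = lognorm(s=0.15, scale=45.0).ppf(u[:, 3])     # kg

        # Daily dose (mg/kg-bw/d); raccoons assumed 3x more contaminated here.
        dose = ingestion * prey_hg * (1.0 + 2.0 * frac_raccoon) / body_wt
        trv = 0.02   # hypothetical chronic dietary threshold, mg/kg-bw/d
        hq = dose / trv
        print(f"P(HQ > 1) = {(hq > 1).mean():.2f}, HQ range {hq.min():.2f}-{hq.max():.1f}")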

  14. Longitudinal modelling of the exposure of young UK patients with PKU to acesulfame K and sucralose.

    PubMed

    O'Sullivan, Aaron J; Pigat, Sandrine; O'Mahony, Cian; Gibney, Michael J; McKevitt, Aideen I

    2017-11-01

    Artificial sweeteners are used in protein substitutes intended for the dietary management of inborn errors of metabolism (phenylketonuria, PKU) to improve the variety of medical foods available to patients and ensure adherence to the prescribed course of dietary management. These patients can be exposed to artificial sweeteners from the combination of free and prescribed foods. Young children have a higher risk of exceeding acceptable daily intakes (ADIs) for additives than adults, due to higher food intakes per kg body weight. Young patients with PKU aged 1-3 years can be exposed to higher levels of artificial sweeteners from these dual sources than normal healthy children and are at a higher risk of exceeding the ADI. Standard intake assessment methods are not adequate to assess the additive exposure of young patients with PKU. The aim of this study was to estimate the combined intake of artificial sweeteners from both sources and the impact of the introduction of new provisions for an artificial sweetener (sucralose, E955) on the exposure of patients with PKU, using a validated probabilistic model. Food consumption data were derived from the food consumption survey data of healthy young children in the United Kingdom from the National Diet and Nutrition Survey (NDNS, 1992-2012). Specially formulated protein substitutes as foods for special medical purposes (FSMPs) were included in the exposure model to replace restricted foods. Inclusion of these protein substitutes is based on recommendations to ensure adequate protein intake in these patients. The exposure assessment results indicated that the availability of sucralose for use in FSMPs for PKU leads to changes in intake among young patients. These data further support the viability of probabilistic modelling as a means to estimate food additive exposure in patients consuming medical nutrition products.

  15. Exposure to School and Community Based Prevention Programs and Reductions in Cigarette Smoking among Adolescents in the United States, 2000–08

    PubMed Central

    Chen, Xinguang; Ren, Yuanjing; Lin, Feng; MacDonell, Karen; Jiang, Yifan

    2011-01-01

    Smoking remains prevalent among U.S. youth despite decades of antismoking efforts. Effects from exposure to prevention programs at national level may provide informative and compelling data supporting better planning and strategy for tobacco control. A national representative sample of youth 12–17 years of age from the National Survey on Drug Use and Health was analyzed. A 3-stage model was devised to estimate smoking behavior transitions using cross-sectional data and the Probabilistic Discrete Event System method. Cigarette smoking measures (prevalence rates and odds ratios) were compared between exposed and non-exposed youth. More than 95% of the sample was exposed to prevention programs. Exposure was negatively associated with lifetime smoking and past 30-day smoking with a dose-response relation. Reduction in smoking was related to increased quitting in 2000–02, to increased quitting and declined initiation in 2003–05, and to initiation, quitting and relapse in 2005–08. Findings of this analysis suggest that intervention programs in the United States can reduce cigarette smoking among youth. Quitting smoking was most responsive to program exposure and relapse was most sensitive to funding cuts since 2003. Health policy and decision makers should consider these factors in planning and revising tobacco control strategies. PMID:22410164

  16. Two forms of persistence in visual information processing.

    PubMed

    Di Lollo, Vincent; Dixon, Peter

    1988-11-01

    Iconic memory, which was initially regarded as a unitary phenomenon, has since been subdivided into several components. In the present work we examined the joint effects of two such components (visible persistence and the visual analog representation) on performance in a partial report task. The display consisted of 15 alphabetic characters arranged around the perimeter of an imaginary circle on the face of an oscilloscope. The observer named the character singled out by a bar-probe. Two factors were varied: exposure duration of the array (10, 50, 100, 150, 200, 300, 400 or 500 ms) and duration of blank period (interstimulus interval, ISI) between the termination of the array and the onset of the probe (0, 50, 100, 150, or 200 ms). Performance was progressively impaired as both exposure duration and ISI were increased. The results were explained in terms of a probabilistic combinatorial model in which the timecourses of visible persistence and of the visual analog representation are regarded as time-locked to the onset and to the end of stimulation, respectively. The impairing effect of exposure duration was attributed to the relatively high spatial demands of the task that could be met optimally by information in visible persistence (which declines as a function of exposure duration), but less adequately by information in the visual analog representation. A second experiment, employing a task with lesser spatial demands, confirmed this interpretation.

  17. A lifestyle-based scenario for U.S. buildings: Implications for energy use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diamond, Rick

    Dynamic measures of air and vegetation concentrations in an exposure chamber and a two-box mass balance model are used to quantify factors that control the rate and extent of chemical partitioning between vegetation and the atmosphere. A continuous stirred flow-through exposure chamber was used to investigate the gas-phase transfer of pollutants between air and plants. A probabilistic two-compartment mass-balance model of plant/air exchange within the exposure chamber was developed and used with measured concentrations from the chamber to simultaneously evaluate partitioning (K_pa), overall mass transfer across the plant/air interface (U_pa), and loss rates in the atmosphere (R_a) and aboveground vegetation (R_p). The approach is demonstrated using mature Capsicum annuum (bell pepper) plants exposed to phenanthrene (PH), anthracene (AN), fluoranthene (FL) and pyrene (PY). Measured values of log K_pa (V_air/V_fresh plant) were 5.7, 5.7, 6.0 and 6.2 for PH, AN, FL and PY, respectively. Values of U_pa (m d^-1) under the conditions of this study ranged from 42 for PH to 119 for FL. After correcting for wall effects, the estimated reaction half-lives in air were 3, 9 and 25 hours for AN, FL and PY. Reaction half-lives in the plant compartment were 17, 6, 17 and 5 days for PH, AN, FL and PY. The combined use of exposure chamber measurements and models provides a robust tool for simultaneously measuring several different transfer factors that are important for modeling the uptake of pollutants into vegetation.
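
    A two-compartment chamber model of the kind described can be written as a pair of coupled mass balances for the air and plant compartments and integrated numerically. The sketch below is a toy configuration: the chamber geometry, flow, and rate constants are invented for illustration, with only the order of magnitude of K_pa taken from the abstract.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical stirred flow-through chamber; all values illustrative.
        Q, V_a, V_p = 0.5, 1.0, 0.005    # flow (m3/h), air / plant volumes (m3)
        A = 0.4                          # exposed leaf area (m2)
        U_pa = 2.0                       # overall mass-transfer coefficient (m/h)
        K_pa = 10 ** 5.8                 # plant/air partition coefficient (-)
        R_a, R_p = 0.08, 0.003           # first-order loss rates, air / plant (1/h)
        C_in = 1.0                       # inlet air concentration (ug/m3)

        def dcdt(t, y):
            c_a, c_p = y                             # air, plant concentrations
            flux = U_pa * A * (c_a - c_p / K_pa)     # net air -> plant flux (ug/h)
            dca = (Q * (C_in - c_a) - flux) / V_a - R_a * c_a
            dcp = flux / V_p - R_p * c_p
            return [dca, dcp]

        sol = solve_ivp(dcdt, (0.0, 200.0), [0.0, 0.0], method="LSODA")
        c_a, c_p = sol.y[:, -1]
        print(f"plant/air ratio after 200 h: {c_p / c_a:.3g} "
              f"(equilibrium K_pa = {K_pa:.3g})")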

  18. Metal uptake by homegrown vegetables – The relative importance in human health risk assessments at contaminated sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustsson, Anna L.M., E-mail: anna.augustsson@lnu.se; Uddh-Söderberg, Terese E.; Hogmalm, K. Johan

    Risk assessments of contaminated land often involve the use of generic bioconcentration factors (BCFs), which express contaminant concentrations in edible plant parts as a function of the concentration in soil, in order to assess the risks associated with consumption of homegrown vegetables. This study aimed to quantify variability in BCFs and evaluate the implications of this variability for human exposure assessments, focusing on cadmium (Cd) and lead (Pb) in lettuce and potatoes sampled around 22 contaminated glassworks sites. In addition, risks associated with measured Cd and Pb concentrations in soil and vegetable samples were characterized, and a probabilistic exposure assessment was conducted to estimate the likelihood of local residents exceeding tolerable daily intakes. The results show that concentrations in vegetables were only moderately elevated despite high concentrations in soil, and most samples complied with applicable foodstuff legislation. Still, the daily intake of Cd (but not Pb) was assessed to exceed toxicological thresholds for about a fifth of the study population. Bioconcentration factors were found to vary more than indicated by previous studies, but decreasing BCFs with increasing metal concentrations in the soil can explain why the calculated exposure is only moderately affected by the choice of BCF value when generic soil guideline values are exceeded and the risk may be unacceptable. Highlights: • Uptake of Cd and Pb by lettuce and potatoes increased with soil contamination. • Consumption of homegrown vegetables may lead to a daily Cd intake above TDIs. • The variability in the calculated BCFs is high when compared to previous studies. • Exposure assessments are most sensitive to the choice of BCFs at low contamination.

  19. Exposure assessment of dioxins and dioxin-like PCBs in pasteurised bovine milk using probabilistic modelling.

    PubMed

    Adekunte, Adefunke O; Tiwari, Brijesh K; O'Donnell, Colm P

    2010-09-01

    Quantitative exposure assessment is a useful technique to investigate the risk from contaminants in the food chain. The objective of this study was to develop a probabilistic exposure assessment model for dioxins (PCDD/Fs) and dioxin-like PCBs (DL-PCBs) in pasteurised bovine milk. Mean dioxin and DL-PCB (non-ortho and mono-ortho PCB) concentrations in bovine milk were estimated as 0.06 ± 0.07 pg WHO-TEQ g(-1) for dioxins and 0.08 ± 0.07 pg WHO-TEQ g(-1) for DL-PCBs using Monte Carlo simulation. The simulated model estimated mean exposures of 0.19 ± 0.29 and 0.14 ± 0.22 pg WHO-TEQ kg(-1) bw d(-1) for dioxins, and 0.25 ± 0.30 and 0.19 ± 0.22 pg WHO-TEQ kg(-1) bw d(-1) for DL-PCBs, for men and women, respectively. This study showed that the mean dioxin and DL-PCB exposure from consumption of pasteurised bovine milk is below the provisional maximum tolerable monthly intake of 70 pg TEQ kg(-1) bw month(-1) (equivalent to 2.3 pg TEQ kg(-1) bw d(-1)) recommended by the Joint FAO/WHO Expert Committee on Food Additives (JECFA). Results from this study also showed that the estimated dioxin and DL-PCB concentrations in pasteurised bovine milk are comparable to those reported in previous studies. Copyright © 2010 Elsevier Ltd. All rights reserved.
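
    The exposure simulation amounts to sampling concentration, consumption, and body weight distributions and forming exposure = concentration x consumption / body weight. The distributions below are hypothetical stand-ins chosen only so the mean lands near the reported values; they are not the study's fitted inputs.

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical input distributions (illustrative parameters only).
        n = 100_000
        conc = rng.lognormal(mean=np.log(0.06), sigma=0.8, size=n)   # pg WHO-TEQ/g milk
        milk = rng.lognormal(mean=np.log(150.0), sigma=0.5, size=n)  # g milk/day
        bw = rng.normal(loc=75.0, scale=10.0, size=n).clip(40)       # kg body weight

        exposure = conc * milk / bw    # pg WHO-TEQ per kg bw per day

        pmtdi = 2.3  # pg TEQ/kg bw/day (tolerable monthly intake of 70 / ~30 days)
        print(f"mean exposure: {exposure.mean():.2f} pg TEQ/kg bw/d; "
              f"P(> {pmtdi}) = {(exposure > pmtdi).mean():.4f}")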

  20. Anemia risk in relation to lead exposure in lead-related manufacturing.

    PubMed

    Hsieh, Nan-Hung; Chung, Shun-Hui; Chen, Szu-Chieh; Chen, Wei-Yu; Cheng, Yi-Hsien; Lin, Yi-Jun; You, Su-Han; Liao, Chung-Min

    2017-05-05

    Lead-exposed workers may suffer adverse health effects under the currently regulated blood lead (BPb) levels. However, a probabilistic assessment of lead exposure-associated anemia risk is lacking. The goal of this study was to examine the association between lead exposure and anemia risk among factory workers in Taiwan. We first collated BPb and hematopoietic function indicator data via health examination records covering 533 male and 218 female lead-exposed workers between 2012 and 2014. We used benchmark dose (BMD) modeling to estimate the critical effect doses for detection of abnormal indicators. A risk-based probabilistic model was used to characterize the potential hazard of lead poisoning for job-specific workers by hazard index (HI). We applied Bayesian decision analysis to determine whether the BMD could serve as a suitable BPb standard. Our results indicated that the HI for all lead-exposed workers was 0.78 (95% confidence interval: 0.50-1.26), with a risk occurrence probability of 11.1%. The risk of abnormal anemia indicators for male and female workers could be reduced by 67-77% and 86-95%, respectively, by adopting the suggested BPb standards of 25 and 15 μg/dL. We conclude that cumulative exposure to lead in the workplace was significantly associated with anemia risk. This study suggests that the current BPb standard should be re-examined for protecting lead-exposed populations in different scenarios, and could inform a new standard for health management. Low-level lead exposure is an occupational and public health problem that deserves more attention.

  1. Use of adjoint methods in the probabilistic finite element approach to fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted

    1988-01-01

    The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.
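
    The efficiency argument can be made concrete for a linear system: with objective J = c^T u and K(theta) u = b, the direct method needs one extra linear solve per random variable, while the adjoint method needs a single solve of K^T lam = c regardless of how many random variables there are. A minimal numerical sketch under that assumption; all matrices are synthetic, not a fracture-mechanics model.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy problem: J(theta) = c^T u with K(theta) u = b, m random variables.
        m, ndof = 50, 200
        K0 = (np.diag(2.0 * np.ones(ndof))
              + np.diag(-np.ones(ndof - 1), 1)
              + np.diag(-np.ones(ndof - 1), -1))                     # baseline stiffness
        dK = [rng.normal(size=(ndof, ndof)) * 1e-3 for _ in range(m)]  # dK/dtheta_i
        b = np.ones(ndof)
        c = np.zeros(ndof); c[ndof // 2] = 1.0                       # single objective

        u = np.linalg.solve(K0, b)

        # Adjoint method: one solve of K^T lam = c, then
        # dJ/dtheta_i = -lam^T (dK/dtheta_i) u for every i.
        lam = np.linalg.solve(K0.T, c)
        grad_adjoint = np.array([-lam @ (dKi @ u) for dKi in dK])

        # Direct method for comparison: du/dtheta_i = -K^{-1} (dK/dtheta_i) u,
        # i.e. one extra linear solve per random variable.
        grad_direct = np.array([-c @ np.linalg.solve(K0, dKi @ u) for dKi in dK])

        print("max |adjoint - direct| =", np.abs(grad_adjoint - grad_direct).max())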

  2. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, φ. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model fit to the historical database of measurements of protons with energy > 30 MeV, Φ30. The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPEs and GCR is in support of mission design and operational planning to manage radiation risks for space exploration.

  3. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation and control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.

  4. A Targeted Health Risk Assessment Following the Deepwater Horizon Oil Spill: Polycyclic Aromatic Hydrocarbon Exposure in Vietnamese-American Shrimp Consumers

    PubMed Central

    Frickel, Scott; Nguyen, Daniel; Bui, Tap; Echsner, Stephen; Simon, Bridget R.; Howard, Jessi L.; Miller, Kent; Wickliffe, Jeffrey K.

    2014-01-01

    Background: The Deepwater Horizon oil spill of 2010 prompted concern about health risks among seafood consumers exposed to polycyclic aromatic hydrocarbons (PAHs) via consumption of contaminated seafood. Objective: The objective of this study was to conduct population-specific probabilistic health risk assessments based on consumption of locally harvested white shrimp (Litopenaeus setiferus) among Vietnamese Americans in southeast Louisiana. Methods: We conducted a survey of Vietnamese Americans in southeast Louisiana to evaluate shrimp consumption, preparation methods, and body weight among shrimp consumers in the disaster-impacted region. We also collected and chemically analyzed locally harvested white shrimp for 81 individual PAHs. We combined the PAH levels (with accepted reference doses) found in the shrimp with the survey data to conduct Monte Carlo simulations for probabilistic noncancer health risk assessments. We also conducted probabilistic cancer risk assessments using relative potency factors (RPFs) to estimate cancer risks from the intake of PAHs from white shrimp. Results: Monte Carlo simulations were used to generate hazard quotient distributions for noncancer health risks, reported as mean ± SD, for naphthalene (1.8 × 10–4 ± 3.3 × 10–4), fluorene (2.4 × 10–5 ± 3.3 × 10–5), anthracene (3.9 × 10–6 ± 5.4 × 10–6), pyrene (3.2 × 10–5 ± 4.3 × 10–5), and fluoranthene (1.8 × 10–4 ± 3.3 × 10–4). A cancer risk distribution, based on RPF-adjusted PAH intake, was also generated (2.4 × 10–7 ± 3.9 × 10–7). Conclusions: The risk assessment results show no acute health risks or excess cancer risk associated with consumption of shrimp containing the levels of PAHs detected in our study, even among frequent shrimp consumers. Citation: Wilson MJ, Frickel S, Nguyen D, Bui T, Echsner S, Simon BR, Howard JL, Miller K, Wickliffe JK. 2015. A targeted health risk assessment following the Deepwater Horizon Oil Spill: polycyclic aromatic hydrocarbon exposure in Vietnamese-American shrimp consumers. Environ Health Perspect 123:152–159; http://dx.doi.org/10.1289/ehp.1408684 PMID:25333566

  5. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach.

    PubMed

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences among laboratory and field studies. The organisms evaluated in simulations, including mayfly larvae, deposit-feeding polychaetes, yellow perch, and little owl, represented a range of ecological conditions and biotransformation capacities. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption efficiency limitations due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition, and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation assessment is reduced most by improved identification of food sources as well as by accounting for chemical bioavailability in food components. Improvements in the accuracy of aqueous exposure appear to be less relevant when applied to moderately to highly hydrophobic compounds, because this route contributes only marginally to total uptake. The determination of chemical bioavailability and an increased understanding of the role of sediment components (black carbon, labile organic matter, and the like) in chemical absorption efficiencies have been identified as key next steps. Copyright © 2011 SETAC.

  6. Risk assessment for furan contamination through the food chain in Belgian children.

    PubMed

    Scholl, Georges; Huybrechts, Inge; Humblet, Marie-France; Scippo, Marie-Louise; De Pauw, Edwin; Eppe, Gauthier; Saegerman, Claude

    2012-08-01

    Young, old, pregnant and immuno-compromised persons are of great concern for risk assessors as they represent the sub-populations most at risk. The present paper focuses on risk assessment linked to furan exposure in children. Only the Belgian population was considered, because the individual contamination and consumption data required for accurate risk assessment were available for Belgian children only. Two risk assessment approaches, the so-called deterministic and probabilistic approaches, were applied and the results were compared for the estimation of daily intake. A significant difference between the average Estimated Daily Intakes (EDIs) was found between the deterministic (419 ng kg⁻¹ body weight (bw) day⁻¹) and the probabilistic (583 ng kg⁻¹ bw day⁻¹) approaches, which results from the mathematical treatment of the null consumption and contamination data. The risk was characterised in two ways: (1) the classical approach, by comparison of the EDI to a reference dose (RfD(chronic-oral)); and (2) the more recent Margin of Exposure (MoE) approach. Both reached similar conclusions: the risk level is not of major concern, but neither is it negligible. In the first approach, only 2.7 or 6.6% (in the deterministic and probabilistic modes, respectively) of the studied population presented an EDI above the RfD(chronic-oral). In the second approach, the percentage of children displaying a MoE above 10,000 was 3% in the deterministic mode and 0% in the probabilistic mode, and the percentage below 100 was 20% and 0.01%, respectively. In addition, children were compared to adults and significant differences between the contamination patterns were highlighted. While the major contamination was linked to coffee consumption in adults (55%), no single item predominantly contributed to the contamination in children; the most important were soups (19%), dairy products (17%), pasta and rice (11%), and fruit and potatoes (9% each).
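
    The MoE calculation itself is simple: MoE = BMDL10 / EDI, with the fractions of the population above 10,000 (commonly read as low concern for genotoxic carcinogens) and below 100 then reported. A probabilistic sketch with an assumed lognormal intake distribution; the benchmark dose and intake parameters below are illustrative placeholders, not the study's values.

        import numpy as np

        rng = np.random.default_rng(9)

        # Illustrative benchmark dose lower bound (ng/kg bw/day).
        bmdl10 = 960_000.0

        # Hypothetical lognormal distribution of estimated daily intakes.
        n = 50_000
        edi = rng.lognormal(mean=np.log(400.0), sigma=0.9, size=n)  # ng/kg bw/day

        moe = bmdl10 / edi
        print(f"P(MoE < 100)    = {(moe < 100).mean():.4f}")
        print(f"P(MoE > 10,000) = {(moe > 10_000).mean():.4f}")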

  7. Estimated lead (Pb) exposures for a population of urban community gardeners.

    PubMed

    Spliethoff, Henry M; Mitchell, Rebecca G; Shayler, Hannah; Marquez-Bravo, Lydia G; Russell-Anelli, Jonathan; Ferenz, Gretchen; McBride, Murray

    2016-08-01

    Urban community gardens provide affordable, locally grown, healthy foods and many other benefits. However, urban garden soils can contain lead (Pb) that may pose risks to human health. To help evaluate these risks, we measured Pb concentrations in soil, vegetables, and chicken eggs from New York City community gardens, and we asked gardeners about vegetable consumption and time spent in the garden. We then estimated Pb intakes deterministically and probabilistically for adult gardeners, children who spend time in the garden, and adult (non-gardener) household members. Most central tendency Pb intakes were below provisional total tolerable intake (PTTI) levels. High contact intakes generally exceeded PTTIs. Probabilistic estimates showed approximately 40 % of children and 10 % of gardeners exceeding PTTIs. Children's exposure came primarily from dust ingestion and exposure to higher Pb soil between beds. Gardeners' Pb intakes were comparable to children's (in µg/day) but were dominated by vegetable consumption. Adult household members ate less garden-grown produce than gardeners and had the lowest Pb intakes. Our results suggest that healthy gardening practices to reduce Pb exposure in urban community gardens should focus on encouraging cultivation of lower Pb vegetables (i.e., fruits) for adult gardeners and on covering higher Pb non-bed soils accessible to young children. However, the common practice of replacement of root-zone bed soil with clean soil (e.g., in raised beds) has many benefits and should also continue to be encouraged.

  8. Probabilistic health risk assessment for ingestion of seafood farmed in arsenic-contaminated groundwater in Taiwan.

    PubMed

    Liang, Ching-Ping; Jang, Cheng-Shin; Chen, Jui-Sheng; Wang, Sheng-Wei; Lee, Jin-Jing; Liu, Chen-Wuing

    2013-08-01

    Seafood farmed in arsenic (As)-contaminated areas is a major exposure pathway for the ingestion of inorganic As by individuals in the southwestern part of Taiwan. This study presents a probabilistic risk assessment using limited data for inorganic As intake through the consumption of the seafood by local residents in these areas. The As content and the consumption rate are both treated as probability distributions, taking into account the variability of the amount in the seafood and individual consumption habits. The Monte Carlo simulation technique is utilized to conduct an assessment of exposure due to the daily intake of inorganic As from As-contaminated seafood. Exposure is evaluated according to the provisional tolerable weekly intake (PTWI) established by the FAO/WHO and the target risk based on the US Environmental Protection Agency guidelines. The assessment results show that inorganic As intake from five types of fish (excluding mullet) and shellfish fall below the PTWI threshold values for the 95th percentiles, but exceed the target cancer risk of 10(-6). The predicted 95th percentile for inorganic As intake and lifetime cancer risks obtained in the study are both markedly higher than those obtained in previous studies in which the consumption rate of seafood considered is a deterministic value. This study demonstrates the importance of the individual variability of seafood consumption when evaluating a high exposure sub-group of the population who eat higher amounts of fish and shellfish than the average Taiwanese.

  9. Ecological risk assessment of zinc from stormwater runoff to an aquatic ecosystem.

    PubMed

    Brix, Kevin V; Keithly, James; Santore, Robert C; DeForest, David K; Tobiason, Scott

    2010-03-15

    Zinc (Zn) risks from stormwater runoff to an aquatic ecosystem were studied. Monitoring data on waterborne, porewater, and sediment Zn concentrations collected at 20 stations throughout a stormwater collection/detention facility consisting of forested wetlands, a retention pond and first order stream were used to conduct the assessment. Bioavailability in the water column was estimated using biotic ligand models for invertebrates and fish while bioavailability in the sediment was assessed using acid volatile sulfide-simultaneously extracted metal (AVS-SEM). The screening level assessment indicated no significant risks were posed to benthic organisms from Zn concentrations in sediments and pore water. As would be expected for stormwater, Zn concentrations were temporally quite variable within a storm event, varying by factors of 2 to 4. Overall, probabilistic assessment indicated low (5-10% of species affected) to negligible risks in the system, especially at the discharge to the first order stream. Moderate to high risks (10-50% of species affected) were identified at sampling locations most upgradient in the collection system. The largest uncertainty with the assessment is associated with how best to estimate chronic exposure/risks from time-varying exposure concentrations. Further research on pulse exposure metal toxicity is clearly needed to assess stormwater impacts on the environment.

  10. Probabilistic modeling of discourse-aware sentence processing.

    PubMed

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  12. An In Vitro Assessment of Bioaccessibility of Arsenicals in Rice and the Use of this Estimate within a Probabilistic Exposure Model

    EPA Science Inventory

    In this study, an in vitro synthetic gastrointestinal extraction protocol was used to estimate bioaccessibility of different arsenicals present in seventeen rice samples of various grain types that were collected across the US. The across matrix average for total arsenic was 209...

  13. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    PubMed Central

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  14. [Causation in the court: the complex case of malignant mesothelioma].

    PubMed

    Lageard, Giovanni

    2011-01-01

    The aim of this paper is to carry out an analysis of the legal evolution in Italy of the assessment of causation, i.e., cause and effect, in oncological diseases, a question taken into consideration by the High Court almost exclusively with reference to pleural mesothelioma. The most debated question when defining the causal association between asbestos exposure and mesothelioma is the possible role that multiple potentially causative exposures could assume in the induction and development of the disease, and in particular the role of asbestos exposure over successive employment periods. Indeed, this is a subject on which, to date, no agreement has been reached in scientific doctrine: these divergences bear important practical significance from a legal point of view, since sustaining one thesis or another may be determinative when ascertaining the responsibility of individuals who, in the past, held decisional positions in the workplace. From the early 2000s, High Court jurisprudence took an oscillating position on this question, divided between those who sustained the thesis that asbestos exposure in any of the successive employment periods is causally relevant and those of a different opinion, i.e., that only the first exposure period has a relevant causative effect. The point under discussion concerns, in particular, whether a merely probabilistic law is adequate to govern such a question. An important turning point came in 2010, when two judgments of the High Court reiterated, in strict compliance with the principles affirmed by the United Sections in 2002, that a judge cannot, and must not, be satisfied with general causation, but must rather reach a verdict on the basis of individual causation. In particular, not only did the second of these two judgments recognise the multifactorial nature of mesothelioma, something which had almost always been denied in past jurisprudence, but it also established some very clear principles of law. Essentially, when ascertaining causation, a judge should verify whether or not there is a sufficiently well-established scientific law covering the question and whether such a law is universal or probabilistic. Should the latter be the case, it is then necessary to establish whether an accelerating effect occurred in the case in question, on the basis of the factual findings. We must now wait for the concrete application of these principles by the courts.

  15. Probabilistic teleportation via multi-parameter measurements and partially entangled states

    NASA Astrophysics Data System (ADS)

    Wei, Jiahua; Shi, Lei; Han, Chen; Xu, Zhiyan; Zhu, Yu; Wang, Gang; Wu, Hao

    2018-04-01

    In this paper, a novel scheme for probabilistic teleportation is presented with multi-parameter measurements via a non-maximally entangled state. This contrasts with most previous schemes, in which the kinds of measurement used for quantum teleportation are fixed in advance. The detailed implementation procedures for our proposal are given using appropriate local unitary operations. Moreover, the total success probability and the classical information cost of the proposal are calculated. It is demonstrated that the success probability and classical cost vary with the multi-measurement parameters and the entanglement factor of the quantum channel. Our scheme could broaden the scope of research on probabilistic teleportation.

  16. Probabilistic risk assessment of the effect of acidified seawater on development stages of sea urchin (Strongylocentrotus droebachiensis).

    PubMed

    Chen, Wei-Yu; Lin, Hsing-Chieh

    2018-05-01

    Growing evidence indicates that ocean acidification has a significant impact on calcifying marine organisms. However, there is a lack of exposure risk assessments for aquatic organisms under future environmentally relevant ocean acidification scenarios. The objective of this study was to investigate the probabilistic effects of acidified seawater on the life-stage response dynamics of fertilization, larvae growth, and larvae mortality of the green sea urchin (Strongylocentrotus droebachiensis). We incorporated the regulation of primary body cavity (PBC) pH in response to seawater pH into the assessment by constructing an explicit model to assess effective life-stage response dynamics to seawater or PBC pH levels. The likelihood of exposure to ocean acidification was also evaluated by addressing the uncertainties of the risk characterization. For unsuccessful fertilization, the estimated 50% effect level of seawater acidification (EC50,SW) was 0.55 ± 0.014 (mean ± SE) pH units. This life stage was more sensitive than growth inhibition and mortality, for which the EC50 values were 1.13 and 1.03 pH units, respectively. The estimated 50% effect levels of PBC pH (EC50,PBC) were 0.99 ± 0.05 and 0.88 ± 0.006 pH units for growth inhibition and mortality, respectively. We also predicted the probability distributions for seawater and PBC pH levels in 2100. The level of unsuccessful fertilization had 50 and 90% probability risks of 5.07-24.51% (95% CI) and 0-6.95%, respectively. We conclude that this probabilistic risk analysis model is parsimonious enough to quantify the multiple vulnerabilities of the green sea urchin while addressing the systemic effects of ocean acidification. This study found a high potential risk of acidification affecting the fertilization of the green sea urchin, whereas there was no evidence for adverse effects on growth and mortality resulting from exposure to the predicted acidified environment.

  17. Effect of Cyclic Thermo-Mechanical Loads on Fatigue Reliability in Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Murthy, P. L. N.; Chamis, C. C.

    1996-01-01

    A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multi-factor interaction relationship developed at NASA Lewis Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a [0/±45/90]s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  18. Probabilistic modelling to assess exposure to three artificial sweeteners of young Irish patients aged 1-3 years with PKU and CMPA.

    PubMed

    O'Sullivan, Aaron J; Pigat, Sandrine; O'Mahony, Cian; Gibney, Michael J; McKevitt, Aideen I

    2016-11-01

    The choice of suitable normal foods is limited for individuals with particular medical conditions, e.g., inborn errors of metabolism (phenylketonuria - PKU) or severe cow's milk protein allergy (CMPA). Patients may have dietary restrictions and exclusive or partial replacement of specific food groups with specially formulated products to meet particular nutrition requirements. Artificial sweeteners are used to improve the appearance and palatability of such food products to avoid food refusal and ensure dietary adherence. Young children have a higher risk of exceeding acceptable daily intakes for additives than adults due to higher food intakes kg⁻¹ body weight. The Budget Method and EFSA's Food Additives Intake Model (FAIM) are not equipped to assess partial dietary replacement with special formulations, as they are built on data from dietary surveys of consumers without special medical requirements impacting the diet. The aim of this study was to explore dietary exposure modelling as a means of estimating the intake of artificial sweeteners by young PKU and CMPA patients aged 1-3 years. An adapted validated probabilistic model (FACET) was used to assess patients' exposure to artificial sweeteners. Food consumption data were derived from the food consumption survey data of healthy young children in Ireland from the National Preschool and Nutrition Survey (NPNS, 2010-11). Specially formulated foods for special medical purposes were included in the exposure model to replace restricted foods. Inclusion was based on recommendations for adequate protein intake and dietary adherence data. Exposure assessment results indicated that young children with PKU and CMPA have higher relative average intakes of artificial sweeteners than healthy young children. The reliability and robustness of the model in the estimation of patient additive exposures was further investigated and provides the first exposure estimates for these special populations.

  19. Metal induced inhalation exposure in urban population: A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Widziewicz, Kamila; Loska, Krzysztof

    2016-03-01

    The paper was aimed at assessing the health risk in the populations of three Silesian cities, Bielsko-Biała, Częstochowa and Katowice, exposed to the inhalation intake of cadmium, nickel and arsenic present in airborne particulate matter. In order to establish how the exposure parameters affect risk, a probabilistic risk assessment framework was used. The risk model was based on the results of annual measurements of As, Cd and Ni concentrations in PM2.5 and the sets of data on the concentrations of those elements in PM10 collected by the Voivodship Inspectorate of Environmental Protection over the 2012-2013 period. The risk was calculated as an incremental lifetime risk of cancer (ILCR) in particular age groups (infants, children, adults) following a Monte Carlo approach. To depict the effect that the variability of exposure parameters exerts on the risk, the initial parameters of the risk model (metal concentrations, their infiltration into the indoor environment, exposure duration, exposure frequency, lung deposition efficiency, daily lung ventilation and body weight) were modeled as random variables. The distribution of inhalation cancer risk due to exposure to ambient metal concentrations was LN (1.80 × 10⁻⁶ ± 2.89 × 10⁻⁶) and LN (6.17 × 10⁻⁷ ± 1.08 × 10⁻⁶) for PM2.5- and PM10-bound metals, respectively, and did not exceed the permissible limit of acceptable risk. The highest probability of contracting cancer was observed for Katowice residents exposed to PM2.5: LN (2.01 × 10⁻⁶ ± 3.24 × 10⁻⁶). Across the tested age groups, adults were at approximately one order of magnitude higher risk than infants. Sensitivity analysis showed that exposure duration (ED) and body weight (BW) were the two variables that contributed the most to the ILCR.
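    A minimal rendering of the ILCR calculation described above might look like the following. The distributions are illustrative assumptions rather than the study's inputs; only the arsenic inhalation unit risk (4.3 × 10⁻³ per µg/m³, from US EPA IRIS) is a standard reference value.

```python
# Minimal Monte Carlo sketch of an incremental lifetime cancer risk (ILCR)
# calculation for an inhaled metal, in the spirit of the approach above.
# All distributions are illustrative assumptions, not the study's inputs.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

conc = rng.lognormal(np.log(2.0), 0.5, n)   # ng/m^3 As in PM2.5 (assumed)
ef = rng.uniform(330, 365, n)               # exposure frequency, days/yr (assumed)
ed = rng.triangular(6, 30, 70, n)           # exposure duration, yr (assumed)
iur = 4.3e-3                                # inhalation unit risk for As, per ug/m^3 (US EPA)

at = 70 * 365                               # averaging time, days
ilcr = conc * 1e-3 * iur * (ef * ed) / at   # incremental lifetime cancer risk

print(f"mean ILCR: {ilcr.mean():.2e}")
print(f"95th pct:  {np.percentile(ilcr, 95):.2e}")
print(f"P(ILCR > 1e-6): {(ilcr > 1e-6).mean():.3f}")
```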

  20. Protocol and Demonstrations of Probabilistic Reliability Assessment for Structural Health Monitoring Systems (Preprint)

    DTIC Science & Technology

    2011-11-01

    assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the...critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model-assisted evaluation...Lindgren, E. A., Buynak, C. F., Steffes, G., Derriso, M., "Model-assisted Probabilistic Reliability Assessment for Structural Health Monitoring

  1. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211 to 351% over the period 2000-2030 (5th and 95th percentiles). Mainly driven by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.

  2. Stratified probabilistic bias analysis for BMI-related exposure misclassification in postmenopausal women.

    PubMed

    Banack, Hailey R; Stokes, Andrew; Fox, Matthew P; Hovey, Kathleen M; Cespedes-Feliciano, Elizabeth M; LeBlanc, Erin; Bird, Chloe; Caan, Bette J; Kroenke, Candyce H; Allison, Matthew A; Going, Scott B; Snetslaar, Linda; Cheng, Ting-Yuan David; Chlebowski, Rowan T; Stefanick, Marcia L; LaMonte, Michael J; Wactawski-Wende, Jean

    2018-06-01

    There is widespread concern about the use of body mass index (BMI) to define obesity status in postmenopausal women because it may not accurately represent an individual's true obesity status. The objective of the present study is to examine and adjust for exposure misclassification bias from using an indirect measure of obesity (BMI) compared with a direct measure of obesity (percent body fat). We used data from postmenopausal non-Hispanic black and non-Hispanic white women in the Women's Health Initiative (WHI; n=126,459). Within the WHI, a sample of 11,018 women were invited to participate in a sub-study involving dual-energy x-ray absorptiometry (DXA) scans. We examined indices of validity comparing BMI-defined obesity (≥30 kg/m²) with obesity defined by percent body fat. We then used probabilistic bias analysis models stratified by age and race to explore the effect of exposure misclassification on the obesity-mortality relationship. Validation analyses highlight that using a BMI cutpoint of 30 kg/m² to define obesity in postmenopausal women is associated with poor validity. There were notable differences in sensitivity by age and race. Results from the stratified bias analysis demonstrated that failing to adjust for exposure misclassification bias results in attenuated estimates of the obesity-mortality relationship. For example, in non-Hispanic white women age 50-59, the conventional risk difference was 0.017 (95% CI 0.01, 0.023) and the bias-adjusted risk difference was 0.035 (95% SI 0.028, 0.043). These results demonstrate the importance of using quantitative bias analysis techniques to account for non-differential exposure misclassification of BMI-defined obesity.
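    For readers unfamiliar with the mechanics, a toy probabilistic bias analysis for exposure misclassification follows. The 2x2 counts and the beta distributions for sensitivity and specificity are hypothetical, chosen only to show how the matrix-method correction is sampled; they are not the WHI data.

```python
# Sketch of a simple probabilistic bias analysis for non-differential exposure
# misclassification in a 2x2 table. Cell counts and the sensitivity/specificity
# distributions are hypothetical, chosen only to illustrate the mechanics.
import numpy as np

rng = np.random.default_rng(1)
n_sim = 50_000

# Observed counts (hypothetical): deaths/survivors by BMI-defined obesity
deaths_exp, deaths_unexp = 520, 310
surv_exp, surv_unexp = 9480, 9690

se = rng.beta(80, 20, n_sim)   # sensitivity of BMI vs. %body-fat obesity (assumed)
sp = rng.beta(95, 5, n_sim)    # specificity (assumed)

def true_exposed(obs_exp, obs_unexp, se, sp):
    """Matrix-method correction within an outcome stratum."""
    n = obs_exp + obs_unexp
    return (obs_exp - n * (1 - sp)) / (se + sp - 1)

A = true_exposed(deaths_exp, deaths_unexp, se, sp)   # true obese deaths
B = true_exposed(surv_exp, surv_unexp, se, sp)       # true obese survivors
C = (deaths_exp + deaths_unexp) - A                  # true non-obese deaths
D = (surv_exp + surv_unexp) - B                      # true non-obese survivors

rd = A / (A + B) - C / (C + D)                       # bias-adjusted risk difference
ok = (A > 0) & (B > 0) & (C > 0) & (D > 0)           # drop impossible draws
rd = rd[ok]

print(f"conventional RD: {deaths_exp/1e4 - deaths_unexp/1e4:.4f}")
print(f"median adjusted RD: {np.median(rd):.4f}")
print(f"95% simulation interval: {np.percentile(rd, [2.5, 97.5])}")
```

    As in the abstract, the adjusted risk difference is typically larger than the conventional one, because non-differential misclassification attenuates the estimate toward the null.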

  3. Alternate Methods in Refining the SLS Nozzle Plug Loads

    NASA Technical Reports Server (NTRS)

    Burbank, Scott; Allen, Andrew

    2013-01-01

    Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
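    The contrast between stacked worst-case loads and a probabilistic percentile can be sketched as below; the load model and input distributions are purely hypothetical illustrations, not the SLS analysis inputs.

```python
# Illustrative Monte Carlo of a combined environmental pressure load: sample
# pressures and temperatures from assumed distributions and take the 3-sigma
# (~1-in-370) percentile rather than stacking worst-case values. Everything
# here, including the load model, is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

pressure = rng.normal(101.3, 0.8, n)        # ambient pressure, kPa (assumed)
delta_t = rng.normal(12.0, 4.0, n)          # diurnal temperature swing, K (assumed)

# Hypothetical load model: pressure differential scales with both inputs
load = 0.15 * pressure * (delta_t / 288.0)  # kPa across the nozzle plug

p3sigma = 100 * (1 - 1 / 370.0)             # 3-sigma exceedance percentile
worst_case = 0.15 * (101.3 + 3 * 0.8) * ((12.0 + 3 * 4.0) / 288.0)

print(f"3-sigma probabilistic load: {np.percentile(load, p3sigma):.4f} kPa")
print(f"stacked worst-case load:    {worst_case:.4f} kPa")
```

    The stacked worst case exceeds the probabilistic 3-sigma load because it assumes all inputs sit at their extremes simultaneously, which is the conservatism the abstract describes eliminating.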

  4. A probabilistic approach to radiative energy loss calculations for optically thick atmospheres - Hydrogen lines and continua

    NASA Technical Reports Server (NTRS)

    Canfield, R. C.; Ricchiazzi, P. J.

    1980-01-01

    An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.

  5. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    PubMed

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy best explains decision behavior. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be compared directly using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Discounting of food, sex, and money.

    PubMed

    Holt, Daniel D; Newquist, Matthew H; Smits, Rochelle R; Tiry, Andrew M

    2014-06-01

    Discounting is a useful framework for understanding choice involving a range of delayed and probabilistic outcomes (e.g., money, food, drugs), but relatively few studies have examined how people discount other commodities (e.g., entertainment, sex). Using a novel discounting task, where the length of a line represented the value of an outcome and was adjusted using a staircase procedure, we replicated previous findings showing that individuals discount delayed and probabilistic outcomes in a manner well described by a hyperbola-like function. In addition, we found strong positive correlations between discounting rates of delayed, but not probabilistic, outcomes. This suggests that discounting of delayed outcomes may be relatively predictable across outcome types but that discounting of probabilistic outcomes may depend more on specific contexts. The generality of delay discounting and potential context dependence of probability discounting may provide important information regarding factors contributing to choice behavior.
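    The "hyperbola-like function" referred to above is commonly written V = A / (1 + kX)^s, where X is the delay or, for probabilistic outcomes, the odds against receipt, X = (1 - p)/p. A small sketch with arbitrary illustrative parameter values:

```python
# Sketch of the hyperbola-like discounting function commonly fitted to such
# data: V = A / (1 + k*X)^s, where X is delay or odds-against ((1-p)/p for
# probabilistic outcomes). Parameter values are arbitrary illustrations.
def discounted_value(amount, x, k=0.05, s=1.0):
    """Subjective value of an outcome at delay/odds-against x."""
    return amount / (1.0 + k * x) ** s

# Delayed $100 at increasing delays (days)
for delay in (0, 30, 180, 365):
    print(f"delay {delay:>3} d -> ${discounted_value(100, delay):.2f}")

# Probabilistic $100: odds against receipt = (1-p)/p
for p in (0.9, 0.5, 0.1):
    odds = (1 - p) / p
    print(f"p = {p:.1f} -> ${discounted_value(100, odds, k=1.0):.2f}")
```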

  7. Quantifying and Adjusting for Disease Misclassification Due to Loss to Follow-Up in Historical Cohort Mortality Studies.

    PubMed

    Scott, Laura L F; Maldonado, George

    2015-10-15

    The purpose of this analysis was to quantify and adjust for disease misclassification from loss to follow-up in a historical cohort mortality study of workers where exposure was categorized as a multi-level variable. Disease classification parameters were defined using 2008 mortality data for the New Zealand population and the proportions of known deaths observed for the cohort. The probability distributions for each classification parameter were constructed to account for potential differences in mortality due to exposure status, gender, and ethnicity. Probabilistic uncertainty analysis (bias analysis), which uses Monte Carlo techniques, was then used to sample each parameter distribution 50,000 times, calculating adjusted odds ratios (OR_DM-LTF) that compared the mortality of workers with the highest cumulative exposure to those considered never-exposed. The geometric mean OR_DM-LTF ranged between 1.65 (certainty interval (CI): 0.50-3.88) and 3.33 (CI: 1.21-10.48), and the geometric mean of the disease-misclassification error factor (ε_DM-LTF), which is the ratio of the observed odds ratio to the adjusted odds ratio, had a range of 0.91 (CI: 0.29-2.52) to 1.85 (CI: 0.78-6.07). Only when workers in the highest exposure category were more likely than the never-exposed to be misclassified as non-cases did the OR_DM-LTF frequency distributions shift further away from the null. The application of uncertainty analysis to historical cohort mortality studies with multi-level exposures can provide valuable insight into the magnitude and direction of study error resulting from losses to follow-up.
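    A conceptual sketch of this kind of uncertainty analysis follows: each simulation draws a plausible number of deaths among workers lost to follow-up and recomputes the odds ratio. All counts and distributions are hypothetical placeholders, not those of the New Zealand cohort.

```python
# Conceptual sketch: workers lost to follow-up have unknown vital status, so
# each simulation draws unobserved deaths per exposure group from assumed
# mortality probabilities, then recomputes the odds ratio. Numbers are
# hypothetical, not the study's data.
import numpy as np

rng = np.random.default_rng(3)
n_sim = 50_000

# Observed data (hypothetical): [deaths, alive, lost] per group
high_exp = np.array([40, 400, 60])
never_exp = np.array([40, 900, 60])

# Mortality probability among the lost, allowed to differ by exposure status
p_death_high = rng.beta(8, 72, n_sim)    # assumed, centered near 0.10
p_death_never = rng.beta(5, 95, n_sim)   # assumed, centered near 0.05

extra_d_high = rng.binomial(high_exp[2], p_death_high)
extra_d_never = rng.binomial(never_exp[2], p_death_never)

d1 = high_exp[0] + extra_d_high
a1 = high_exp[1] + (high_exp[2] - extra_d_high)
d0 = never_exp[0] + extra_d_never
a0 = never_exp[1] + (never_exp[2] - extra_d_never)

or_adj = (d1 / a1) / (d0 / a0)
or_obs = (high_exp[0] / high_exp[1]) / (never_exp[0] / never_exp[1])
err_factor = or_obs / or_adj             # analogous to the paper's epsilon

print(f"observed OR: {or_obs:.2f}")
print(f"adjusted OR geometric mean: {np.exp(np.log(or_adj).mean()):.2f}")
print(f"error factor 95% interval: {np.percentile(err_factor, [2.5, 97.5])}")
```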

  8. Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.

    Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.

  9. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, Elijah; Hamby, David; Eckerman, Keith

    2017-10-10

    This paper presents validated building shielding factors for contemporary US housing stock under an idealized, yet realistic, exposure scenario of contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground, as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, and for a single-wide manufactured housing unit. © 2017 IOP Publishing Ltd.

  10. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M; Eckerman, K F

    2015-06-01

    This paper presents validated building shielding factors for contemporary US housing stock under an idealized, yet realistic, exposure scenario of contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground, as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, and for a single-wide manufactured housing unit.

  11. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CAA - TREATED PLAYSETS AND DECKS: PART 1. MODEL METHODOLOGY, VARIABILITY RESULTS, AND MODEL EVALUATION

    EPA Science Inventory

    Concerns have been raised regarding the safety of young children who may contact arsenic residues while playing on and around chromated copper arsenate (CCA)-treated wood playsets and decks. Although CCA registrants voluntarily canceled the production of treated wood for residen...

  12. COMPARISON OF FIELD MEASUREMENTS FROM A CHILDREN'S PESTICIDE STUDY AGAINST PREDICTIONS FROM A PHYSICALLY BASED PROBABILISTIC MODEL FOR ESTIMATING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS

    EPA Science Inventory

    Semi-volatile pesticides, such as chlorpyrifos, can move about within a home environment after an application due to physical/chemical processes, resulting in concentration loadings in and on objects and surfaces. Children can be particularly susceptible to the effects of pest...

  13. U.S. EPA TEAM STUDY OF INHALABLE PARTICLES (PM-10): STUDY DESIGN, RESPONSE RATE, AND SAMPLER PERFORMANCE

    EPA Science Inventory

    The US EPA studied the exposures of 175 residents of Riverside, CA to inhalable particles (<10 µm diameter) in the early fall of 1990. Participants were probabilistically selected to represent most of the Riverside nonsmoking population over the age of 10. They wore a newly-design...

  14. User's Manual for RESRAD-OFFSITE Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; Gnanapragasam, E.; Biwer, B. M.

    2007-09-05

    The RESRAD-OFFSITE code is an extension of the RESRAD (onsite) code, which has been widely used for calculating doses and risks from exposure to radioactively contaminated soils. The development of RESRAD-OFFSITE started more than 10 years ago, but new models and methodologies have been developed, tested, and incorporated since then. Some of the new models have been benchmarked against other independently developed (international) models. The databases used have also expanded to include all the radionuclides (more than 830) contained in the International Commission on Radiological Protection (ICRP) 38 database. This manual provides detailed information on the design and application of the RESRAD-OFFSITE code. It describes in detail the new models used in the code, such as the three-dimensional dispersion groundwater flow and radionuclide transport model, the Gaussian plume model for atmospheric dispersion, and the deposition model used to estimate the accumulation of radionuclides in offsite locations and in foods. Potential exposure pathways and exposure scenarios that can be modeled by the RESRAD-OFFSITE code are also discussed. A user's guide is included in Appendix A of this manual. The default parameter values and parameter distributions are presented in Appendix B, along with a discussion on the statistical distributions for probabilistic analysis. A detailed discussion on how to reduce run time, especially when conducting probabilistic (uncertainty) analysis, is presented in Appendix C of this manual.

  15. Revised methods for estimating potential reentry exposure associated with indoor crack and crevice and perimeter application.

    PubMed

    Driver, Jeffrey H; Ross, John H; Pandian, Muhilan; Selim, Sami; Sharp, Janice K

    2013-01-01

    Surface deposition of insecticides applied as indoor residential foggers, baseboard or perimeter sprays, spot sprays, and crack-and-crevice (C&C) sprays represents a pathway of unintentional, postapplication exposure for children and adults. Estimation of the magnitude of this exposure following an application event is subject to uncertainty due to many factors, including (1) surface residue deposition and distribution, (2) access to and the nature of contact with treated surfaces based on the time-activity patterns of residents, and (3) the role of residue removal mechanisms such as cleaning treated surfaces, pesticide degradation or redistribution, and hand washing and bathing following contact. A comparative spatial deposition study was conducted involving broadcast, perimeter, and C&C application methods. Residues measured using a spatial grid of deposition dosimeters on floor surfaces demonstrated significantly lower residue concentrations in readily accessible areas following C&C and perimeter applications versus broadcast treatment. Analyses of other monitoring studies support this finding. The implications of these findings are discussed for both screening-level and higher tier probabilistic postapplication residential exposure assessment. The U.S. Environmental Protection Agency's (EPA) current guidance on interpretation of deposition following C&C application is supported by data in this study and others, which indicate a 10:1 ratio of deposition for broadcast versus C&C application. However, the perimeter deposition data are quite similar to C&C deposition and do not support the 70/30 default relative to broadcast recommended by the U.S. EPA (2012).

  16. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
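    The "100-year" and "500-year" language above corresponds directly to annual exceedance probabilities. Assuming Poisson event occurrence (a standard, though simplifying, assumption in such analyses), the conversion to an exceedance probability over an exposure window is:

```python
# Back-of-envelope relation between return period and exceedance probability
# used in PTHA-style statements (the "100-year" tsunami has a 1% annual
# exceedance probability), assuming Poisson event occurrence.
import math

def prob_exceed(return_period_yr, exposure_yr):
    """Probability of at least one exceedance during an exposure window."""
    rate = 1.0 / return_period_yr
    return 1.0 - math.exp(-rate * exposure_yr)

for rp in (100, 500):
    print(f"{rp}-yr tsunami: annual P = {1/rp:.1%}, "
          f"P(exceed in 50 yr) = {prob_exceed(rp, 50):.1%}")
```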

  17. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  18. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  19. Probabilistic finite elements for fracture and fatigue analysis

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. Performance and accuracy of the method are demonstrated on a classical Mode I fatigue problem.
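    The stated performance function defines a limit state g = N_f - N_s, with failure when g < 0. A minimal Monte Carlo estimate of the failure probability, assuming an illustrative lognormal fatigue life and a fixed service life (neither from the paper), is:

```python
# Minimal Monte Carlo rendering of the stated performance criterion: failure
# occurs when fatigue life N_f falls short of service life N_s. The lognormal
# fatigue life and fixed service life are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(11)
n = 1_000_000

n_fatigue = rng.lognormal(mean=np.log(2.0e5), sigma=0.35, size=n)  # cycles (assumed)
n_service = 1.0e5                                                  # cycles (assumed)

g = n_fatigue - n_service          # limit state: g < 0 means failure
pf = (g < 0).mean()
print(f"probability of fatigue failure: {pf:.2e}")
```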

  20. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  1. When experience meets language statistics: Individual variability in processing English compound words.

    PubMed

    Falkauskas, Kaitlin; Kuperman, Victor

    2015-11-01

    Statistical patterns of language use demonstrably affect language comprehension and language production. This study set out to determine whether the variable amount of exposure to such patterns leads to individual differences in reading behavior as measured via eye-movements. Previous studies have demonstrated that more proficient readers are less influenced by distributional biases in language (e.g., frequency, predictability, transitional probability) than poor readers. We hypothesized that a probabilistic bias that is characteristic of written but not spoken language would preferentially affect readers with greater exposure to printed materials in general and to the specific pattern engendering the bias. Readers of varying reading experience were presented with sentences including English compound words that can occur in 2 spelling formats with differing probabilities: concatenated (windowsill, used 40% of the time) or spaced (window sill, 60%). Linear mixed effects multiple regression models fitted to the eye-movement measures showed that the probabilistic bias toward the presented spelling had a stronger facilitatory effect on compounds that occurred more frequently (in any spelling) or belonged to larger morphological families, and on readers with higher scores on a test of exposure-to-print. Thus, the amount of support toward the compound's spelling is effectively exploited when reading, but only when the spelling patterns are entrenched in an individual's mental lexicon via overall exposure to print and to compounds with alternating spelling. We argue that research on the interplay of language use and structure is incomplete without proper characterization of how particular individuals, with varying levels of experience and skill, learn these language structures. (c) 2015 APA, all rights reserved.

  2. When experience meets language statistics: Individual variability in processing English compound words

    PubMed Central

    Falkauskas, Kaitlin; Kuperman, Victor

    2015-01-01

    Statistical patterns of language use demonstrably affect language comprehension and language production. This study set out to determine whether the variable amount of exposure to such patterns leads to individual differences in reading behaviour as measured via eye-movements. Previous studies have demonstrated that more proficient readers are less influenced by distributional biases in language (e.g. frequency, predictability, transitional probability) than poor readers. We hypothesized that a probabilistic bias that is characteristic of written but not spoken language would preferentially affect readers with greater exposure to printed materials in general and to the specific pattern engendering the bias. Readers of varying reading experience were presented with sentences including English compound words that can occur in two spelling formats with differing probabilities: concatenated (windowsill, used 40% of the time) or spaced (window sill, 60%). Linear mixed effects multiple regression models fitted to the eye-movement measures showed that the probabilistic bias towards the presented spelling had a stronger facilitatory effect on compounds that occurred more frequently (in any spelling) or belonged to larger morphological families, and on readers with higher scores on a test of exposure-to-print. Thus, the amount of support towards the compound’s spelling is effectively exploited when reading, but only when the spelling patterns are entrenched in an individual’s mental lexicon via overall exposure to print and to compounds with alternating spelling. We argue that research on the interplay of language use and structure is incomplete without proper characterization of how particular individuals, with varying levels of experience and skill, learn these language structures. PMID:26076328

  3. The Extravehicular Suit Impact Load Attenuation Study for Use in Astronaut Bone Fracture Prediction

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Gilkey, Kelly M.; Sulkowski, Christina M.; Samorezov, Sergey; Myers, Jerry G.

    2011-01-01

    The NASA Integrated Medical Model (IMM) assesses the risk, including the likelihood and impact of occurrence, of all credible in-flight medical conditions. Fracture of the proximal femur is a traumatic injury that would likely result in loss of mission if it were to happen during spaceflight. Low-gravity exposure causes decreases in bone mineral density, which heightens the concern. Researchers at the NASA Glenn Research Center have quantified bone fracture probability during spaceflight with a probabilistic model. It was assumed that a pressurized extravehicular activity (EVA) suit would attenuate load during a fall, but no supporting data were available. The suit impact load attenuation study was performed to collect analogous data. METHODS: A pressurized EVA suit analog test bed was used to study how the offset (defined as the gap between the suit and the astronaut's body), impact load magnitude, and suit operating pressure affect the attenuation of impact load. The attenuation data were incorporated into the probabilistic model of bone fracture as a function of these factors, replacing a load attenuation value based on commercial hip protectors. RESULTS: Load attenuation was more dependent on offset than on pressurization or load magnitude, especially at small offsets. Load attenuation factors for offsets between 0.1-1.5 cm were 0.69 +/- 0.15, 0.49 +/- 0.22 and 0.35 +/- 0.18 for mean impact forces of 4827, 6400 and 8467 N, respectively. Load attenuation factors for offsets of 2.8-5.3 cm were 0.93 +/- 0.2, 0.94 +/- 0.1 and 0.84 +/- 0.5 for the same mean impact forces. Reductions were observed in the 95th percentile confidence interval of the bone fracture probability predictions. CONCLUSIONS: The reduction in uncertainty and improved confidence in bone fracture predictions increased the fidelity and credibility of the fracture risk model and its benefit to mission design and operational decisions.

  4. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is therefore inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that depends on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual margin of safety remains unknown. In the deterministic methodology, the contingency of failure is discounted and a high factor of safety is therefore used; this approach is most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
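    The contrast drawn above can be made concrete with the classic stress-strength interference problem: for normally distributed stress and strength (the values below are illustrative assumptions), the probability of failure has a closed form, and a comfortable-looking safety factor can coexist with a non-negligible failure probability.

```python
# Classic stress-strength interference sketch contrasting the two philosophies:
# a deterministic safety factor vs. the probability of non-failure when both
# stress and strength are random. All values are illustrative assumptions.
import math

mu_strength, sd_strength = 400.0, 30.0   # MPa (assumed)
mu_stress, sd_stress = 250.0, 40.0       # MPa (assumed)

safety_factor = mu_strength / mu_stress  # deterministic view: 1.6 looks "safe"

# For normal variables, P(failure) = P(strength - stress < 0) in closed form
beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
p_fail = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))

print(f"safety factor: {safety_factor:.2f}")
print(f"reliability index beta: {beta:.2f}")
print(f"probability of failure: {p_fail:.2e}")
```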

  5. Changing population dynamics and uneven temperature emergence combine to exacerbate regional exposure to heat extremes under 1.5 °C and 2 °C of warming

    NASA Astrophysics Data System (ADS)

    Harrington, Luke J.; Otto, Friederike E. L.

    2018-03-01

    Understanding how continuing increases in global mean temperature will exacerbate societal exposure to extreme weather events is a question of profound importance. However, determining population exposure to the impacts of heat extremes at 1.5 °C and 2 °C of global mean warming requires not only (1) a robust understanding of the physical climate system response, but also consideration of (2) projected changes to overall population size, as well as (3) changes to where people will live in the future. This analysis introduces a new framework, adapted from studies of probabilistic event attribution, to disentangle the relative importance of regional climate emergence and changing population dynamics in the exposure to future heat extremes across multiple densely populated regions in Southern Asia and Eastern Africa (SAEA). Our results reveal that, when population is kept at 2015 levels, exposure to heat considered severe in the present decade across SAEA will increase by a factor of 4.1 (2.4-9.6) and 15.8 (5.0-135) under a 1.5°- and 2.0°-warmer world, respectively. Furthermore, projected population changes by the end of the century under an SSP1 and SSP2 scenario can further exacerbate these changes by a factor of 1.2 (1.0-1.3) and 1.5 (1.3-1.7), respectively. However, a large fraction of this additional risk increase is not related to absolute increases in population, but instead attributed to changes in which regions exhibit continued population growth into the future. Further, this added impact of population redistribution will be twice as significant after 2.0 °C of warming, relative to stabilisation at 1.5 °C, due to the non-linearity of increases in heat exposure. Irrespective of the population scenario considered, continued African population expansion will place more people in locations where emergent changes to future heat extremes are exceptionally severe.

  6. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented, concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and data bases and special applications.

  7. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    PubMed

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants, based on mass-transfer parameters such as diffusion and partition coefficients related to the individual materials. In most cases, mass-transfer parameters are not readily available from the literature and are therefore estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability because they grossly overestimate migration. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and applies Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with the related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts becomes possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
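    As a sketch of the functional-barrier case, the following samples a diffusion coefficient and a barrier thickness and computes the classical lag (breakthrough) time, t_lag = L²/(6D), asking how often breakthrough precedes the shelf life. All distribution parameters are illustrative assumptions, not measured data.

```python
# Probabilistic functional-barrier check: sample the barrier's diffusion
# coefficient and thickness, compute the classical lag time t_lag = L^2/(6D),
# and estimate how often breakthrough precedes the shelf life. All inputs
# are illustrative assumptions, not measured data.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Diffusion coefficient of the migrant in the barrier layer, cm^2/s (assumed)
d_coef = np.exp(rng.normal(np.log(1e-14), 0.8, n))

# Barrier thickness, cm: nominal 20 um with manufacturing scatter (assumed)
thickness = rng.normal(20e-4, 2e-4, n).clip(min=5e-4)

t_lag_s = thickness**2 / (6.0 * d_coef)
t_lag_days = t_lag_s / 86_400

shelf_life_days = 365
p_breakthrough = (t_lag_days < shelf_life_days).mean()

print(f"median lag time: {np.median(t_lag_days):.0f} days")
print(f"P(breakthrough before {shelf_life_days} d): {p_breakthrough:.3f}")
```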

  8. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI), including a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
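
    For the Weibull case mentioned above, the closed-form failure probability and a Monte Carlo check fit in a few lines; the modulus and scale values below are illustrative, not from the article.

      import numpy as np

      m, sigma0 = 10.0, 300.0      # illustrative Weibull modulus and scale (MPa)
      sigma = 250.0                # applied stress (MPa)

      # Closed-form two-parameter Weibull failure probability
      Pf_closed = 1.0 - np.exp(-(sigma / sigma0) ** m)

      # Monte Carlo check: sample component strengths, count failures
      rng = np.random.default_rng(0)
      strengths = sigma0 * rng.weibull(m, size=1_000_000)
      Pf_mc = np.mean(strengths < sigma)

      print(Pf_closed, Pf_mc)      # the two estimates should agree closely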

  9. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of selected metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply providing the multifunctional representation. The structural component is modeled with finite elements. Structural responses are obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also about 0.0001.

  10. Quantification of uncertainties in the performance of smart composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1993-01-01

    A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.

  11. Psychics, aliens, or experience? Using the Anomalistic Belief Scale to examine the relationship between type of belief and probabilistic reasoning.

    PubMed

    Prike, Toby; Arnold, Michelle M; Williamson, Paul

    2017-08-01

    A growing body of research has shown people who hold anomalistic (e.g., paranormal) beliefs may differ from nonbelievers in their propensity to make probabilistic reasoning errors. The current study explored the relationship between these beliefs and performance through the development of a new measure of anomalistic belief, called the Anomalistic Belief Scale (ABS). One key feature of the ABS is that it includes a balance of both experiential and theoretical belief items. Another aim of the study was to use the ABS to investigate the relationship between belief and probabilistic reasoning errors on conjunction fallacy tasks. As expected, results showed there was a relationship between anomalistic belief and propensity to commit the conjunction fallacy. Importantly, regression analyses on the factors that make up the ABS showed that the relationship between anomalistic belief and probabilistic reasoning occurred only for beliefs about having experienced anomalistic phenomena, and not for theoretical anomalistic beliefs. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Are engineered nano iron oxide particles safe? an environmental risk assessment by probabilistic exposure, effects and risk modeling.

    PubMed

    Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd

    2016-12-01

    Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/l. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by the PNEC, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, a better understanding of the hazards of nano-FeOX to organisms in other ecosystems (such as sediments) is needed to determine the overall risk of these particles to the environment.
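
    A hedged sketch of the risk characterization step: assuming, purely for illustration, lognormal distributions for the PEC (centred near the 28 ng/l worst-case mean reported above) and for an SSD-derived PNEC, the RCR distribution and the probability that it exceeds 1 follow directly.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # Hypothetical lognormal PEC (ng/l), centred near the reported 28 ng/l mean
      pec = rng.lognormal(mean=np.log(28.0), sigma=0.8, size=n)

      # Hypothetical PNEC (ng/l) from a probabilistic species sensitivity distribution
      pnec = rng.lognormal(mean=np.log(2.0e5), sigma=1.0, size=n)

      rcr = pec / pnec                        # risk characterization ratio
      print(rcr.mean(), np.mean(rcr > 1.0))   # mean RCR and P(RCR > 1)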

  13. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    PubMed

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property, has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgments are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is determined in part by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature of acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure: a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
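
    One acceptability measure of the kind described is the syntactic log-odds ratio (SLOR), which normalizes a language model's log probability by a unigram (lexical frequency) log probability and by sentence length; the numbers below are purely illustrative toy values, not data from the paper.

      def slor(logp_model: float, logp_unigram: float, n_tokens: int) -> float:
          """SLOR: length- and frequency-normalized sentence log probability."""
          return (logp_model - logp_unigram) / n_tokens

      # Toy example: two sentences of different length and lexical frequency
      print(slor(logp_model=-22.0, logp_unigram=-30.0, n_tokens=6))   # shorter sentence
      print(slor(logp_model=-40.0, logp_unigram=-55.0, n_tokens=12))  # longer sentence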

  14. New developments in exposure assessment: the impact on the practice of health risk assessment and epidemiological studies.

    PubMed

    Nieuwenhuijsen, Mark; Paustenbach, Dennis; Duarte-Davidson, Raquel

    2006-12-01

    The field of exposure assessment has matured significantly over the past 10-15 years. Dozens of studies have measured the concentrations of numerous chemicals in many media to which humans are exposed. Others have catalogued the various exposure pathways and identified typical values which can be used in exposure calculations for the general population, such as the amount of water or soil ingested per day or the percent of a chemical that can pass through the skin. In addition, studies of the duration of exposure for many tasks (e.g. showering, jogging, working in the office) have been conducted which allow for more general descriptions of the likely range of exposures. All of this information, as well as the development of new and better models (e.g. air dispersion or groundwater models), allows for better estimates of exposure. In addition to identifying better exposure factors, and better mathematical models for predicting the aerial distribution of chemicals, the conduct of simulation studies and dose-reconstruction studies can offer extraordinary opportunities for filling in data gaps regarding historical exposures which are critical to improving the power of epidemiology studies. The use of probabilistic techniques such as Monte Carlo analysis and Bayesian statistics has revolutionized the practice of exposure assessment and greatly enhanced the quality of risk characterization. Lastly, the field of epidemiology is about to undergo a sea change with respect to the exposure component, because each year better environmental and exposure models, statistical techniques and new biological monitoring techniques are being introduced. This paper reviews these techniques and discusses where additional research is likely to pay a significant dividend. Exposure assessment techniques are now available which can significantly improve the quality of epidemiology and health risk assessment studies and vastly improve their usefulness. As more quantitative exposure components can now be incorporated into these studies, they can be better used to identify safe levels of exposure using customary risk assessment methodologies. Examples are drawn from both environmental and occupational studies illustrating how these techniques have been used to better understand exposure to specific chemicals. Some thoughts are also presented on what lessons have been learned about conducting exposure assessments for health risk assessments and epidemiological studies.
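
    As an illustration of the probabilistic techniques discussed, a minimal Monte Carlo version of the standard average-daily-dose equation, ADD = (C × IR × EF × ED)/(BW × AT); all input distributions below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      C  = rng.lognormal(np.log(5.0), 0.6, n)   # concentration in water, ug/L (hypothetical)
      IR = rng.normal(2.0, 0.5, n).clip(0.5)    # ingestion rate, L/day
      EF = 350.0                                # exposure frequency, days/year
      ED = 30.0                                 # exposure duration, years
      BW = rng.normal(70.0, 12.0, n).clip(40)   # body weight, kg
      AT = 70.0 * 365.0                         # averaging time, days

      add = C * IR * EF * ED / (BW * AT)        # average daily dose, ug/kg-day
      print(np.percentile(add, [50, 95]))       # median and 95th-percentile dose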

  15. A probabilistic QMRA of Salmonella in direct agricultural reuse of treated municipal wastewater.

    PubMed

    Amha, Yamrot M; Kumaraswamy, Rajkumari; Ahmad, Farrukh

    2015-01-01

    Developing reliable quantitative microbial risk assessment (QMRA) procedures aids in setting recommendations for reuse applications of treated wastewater. In this study, a probabilistic QMRA was conducted to determine the risk of Salmonella infection resulting from the consumption of edible crops irrigated with treated wastewater. Quantitative polymerase chain reaction (qPCR) was used to enumerate Salmonella spp. in post-disinfection samples, which showed concentrations ranging from 90 to 1,600 cells/100 mL. The results were used to construct probabilistic exposure models for the raw consumption of three vegetables (lettuce, cabbage, and cucumber) irrigated with treated wastewater, and to estimate the disease burden using Monte Carlo analysis. The results showed an elevated median disease burden when compared with the acceptable disease burden set by the World Health Organization, which is 10⁻⁶ disability-adjusted life years per person per year. Of the three vegetables considered, lettuce showed the highest risk of infection in all scenarios considered, while cucumber showed the lowest risk. The Salmonella concentrations obtained with qPCR were compared with Escherichia coli concentrations for samples taken on the same sampling dates.
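
    A compact sketch of the exposure-to-risk chain in such a QMRA, using an approximate beta-Poisson dose-response form for Salmonella; the water-retention value and dose-response parameters are literature-style but used here only as illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      # Hypothetical exposure: pathogen concentration within the reported
      # 90-1,600 cells/100 mL range, times irrigation water retained on lettuce
      conc = rng.uniform(90.0, 1600.0, n) / 100.0        # cells/mL
      water = rng.lognormal(np.log(0.108), 0.2, n)       # mL retained per g (illustrative)
      serving = 100.0                                    # g of lettuce eaten raw

      dose = conc * water * serving                      # ingested cells per serving

      # Approximate beta-Poisson dose-response (illustrative alpha, beta)
      alpha, beta = 0.3126, 2884.0
      p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

      print(np.median(p_inf), np.percentile(p_inf, 95))  # per-serving infection risk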

  16. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of each probability level, participants were asked to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that, for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing, the rational-deliberative versus the affective-experiential, and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  17. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

    Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations in real time (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method to analyse time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data were used from a workplace study that investigated the potential for inhalation exposure during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In that workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis presented here focuses only on data obtained with two handheld condensation particle counters: one measured at the source of the released particles, while the other measured in parallel in the far field. The Bayesian probabilistic approach allows probabilistic modelling of the data series, with the observed task modelled in the form of probability distributions. The probability distributions from time-resolved data obtained at the source can be compared with those obtained in the far field, leading to a quantitative estimate of the airborne particles released at the source when the task is performed. Beyond the results obtained, this exploratory study indicates that the analysis requires specific experience in statistics.
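
    A minimal sketch of the comparison step under a simple Bayesian model assumed here for illustration (lognormal concentrations, vague priors), not necessarily the model used in the study: draw posterior samples of the mean log concentration at the source and in the far field, then examine the posterior of their difference.

      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical particle number concentrations (#/cm^3) from two CPCs
      near = np.array([5200., 6100., 4800., 7300., 5900., 6600.])
      far  = np.array([1500., 1700., 1400., 1600., 1800., 1550.])

      def posterior_mu(x, n_draws=50_000):
          """Posterior draws of the mean log-concentration (normal model, vague prior)."""
          lx = np.log(x)
          n, m, s = len(lx), lx.mean(), lx.std(ddof=1)
          # mu | data follows a scaled, shifted Student-t distribution
          return m + s / np.sqrt(n) * rng.standard_t(df=n - 1, size=n_draws)

      delta = posterior_mu(near) - posterior_mu(far)     # log ratio, source vs far field
      print(np.mean(delta > 0))                          # P(source exceeds far field)
      print(np.percentile(np.exp(delta), [5, 50, 95]))   # concentration ratio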

  18. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering Held at Vicksburg, Mississippi on 21 September 1982.

    DTIC Science & Technology

    1983-09-01

    al. (1981) was conducted on the Copper City No. 2 tailings embankment dam near Miami, Arizona. Due to the extreme topographic relief in the area of the ... mode of behavior and scale. This dependency is summarized in the factor R. For example, circular shear instability as in a copper porphyry slope ... [Report contents include: description of the probabilistic slope stability model; description of the Copper City No. 2 tailings dam; subsurface investigation.]

  19. Fracture mechanics analysis of cracked structures using weight function and neural network method

    NASA Astrophysics Data System (ADS)

    Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.

    2018-06-01

    Stress intensity factors (SIFs) due to thermal-mechanical loads have been established using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated against data from the literature and show good agreement. The SIFs can therefore be determined quickly, using the weight function obtained, when cracks are subjected to arbitrary loads, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characterization of K_I can be obtained using the developed method. The probability of failure increases with increasing load, and the relationship between them is nonlinear.
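
    A sketch of the weight function step, using the common "universal" edge-crack weight function form m(x, a) = (2/√(2π(a−x))) [1 + M1(1−x/a)^(1/2) + M2(1−x/a) + M3(1−x/a)^(3/2)] with illustrative coefficients and stress profile (not those of the paper); the substitution u = √(1 − x/a) removes the integrable singularity before numerical quadrature.

      import numpy as np

      a = 0.005                     # crack depth, m
      M1, M2, M3 = 0.6, 0.2, 0.1    # illustrative weight function coefficients

      def sigma(x):
          """Illustrative crack-plane stress profile, Pa (thermal + mechanical)."""
          return 200e6 * (1.0 - 0.5 * x / a)

      # K_I = int_0^a sigma(x) m(x, a) dx, after substituting x = a(1 - u^2)
      u = np.linspace(0.0, 1.0, 2001)
      x = a * (1.0 - u**2)
      integrand = sigma(x) * (4.0 * a / np.sqrt(2.0 * np.pi * a)) * (
          1.0 + M1 * u + M2 * u**2 + M3 * u**3)
      K_I = np.trapz(integrand, u)

      print(K_I / 1e6, "MPa*sqrt(m)")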

  20. An investigation into the probabilistic combination of quasi-static and random accelerations

    NASA Technical Reports Server (NTRS)

    Schock, R. W.; Tuell, L. P.

    1984-01-01

    The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, requires the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the randomly generated responses separately, and arithmetically add or root-sum-square them to get combined accelerations. Since the combination of a probabilistic and a deterministic function yields a probabilistic function, a viable alternative approach is to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data to select combined accelerations for the most popular percentile levels.
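
    A small numeric illustration of the point (values hypothetical): for a deterministic quasi-static load plus a zero-mean Gaussian random response, the percentile of the combined distribution can be compared with the arithmetic-sum and root-sum-square rules.

      import numpy as np
      from scipy import stats

      A_qs    = 5.0   # quasi-static acceleration, g (deterministic, hypothetical)
      sigma_r = 2.0   # RMS of the random acceleration response, g

      # 3-sigma design level for the random part alone
      A_rand = 3.0 * sigma_r

      arithmetic = A_qs + A_rand            # simple addition
      rss        = np.hypot(A_qs, A_rand)   # root sum square

      # Combined load = deterministic + Gaussian, i.e. a mean-shifted Gaussian;
      # take the percentile matching the 3-sigma level (99.865%)
      combined = stats.norm.ppf(0.99865, loc=A_qs, scale=sigma_r)

      print(arithmetic, rss, combined)   # combined equals the arithmetic sum here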

  1. Leaching of plastic additives to marine organisms.

    PubMed

    Koelmans, Albert A; Besseling, Ellen; Foekema, Edwin M

    2014-04-01

    It is often assumed that ingestion of microplastics by aquatic species leads to increased exposure to plastic additives. However, experimental data or model-based evidence is lacking. Here we assess the potential for leaching of nonylphenol (NP) and bisphenol A (BPA) in the intestinal tracts of Arenicola marina (lugworm) and Gadus morhua (North Sea cod). We use a biodynamic model that allows calculation of the relative contribution of plastic ingestion to the total exposure of aquatic species to chemicals residing in the ingested plastic. Uncertainty in the most crucial parameters is accounted for by probabilistic modeling. Our conservative analysis shows that plastic ingestion by the lugworm yields NP and BPA concentrations that stay below the lower ends of the global NP and BPA concentration ranges, and therefore is not likely to constitute a relevant exposure pathway. For cod, plastic ingestion appears to be a negligible pathway for exposure to NP and BPA. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Seismic hazard exposure for the Trans-Alaska Pipeline

    USGS Publications Warehouse

    Cluff, L.S.; Page, R.A.; Slemmons, D.B.; Grouse, C.B.; ,

    2003-01-01

    The discovery of oil on Alaska's North Slope and the construction of a pipeline to transport that oil across Alaska coincided with the National Environmental Policy Act of 1969 and a destructive Southern California earthquake in 1971, leading to stringent stipulations, state-of-the-art investigations, and innovative design for the pipeline. The magnitude 7.9 earthquake on the Denali fault in November 2002 was remarkably consistent with the design earthquake and fault displacement postulated for the Denali crossing of the Trans-Alaska Pipeline route. The pipeline maintained its integrity, and disaster was averted. Recent probabilistic studies to update previous hazard exposure conclusions suggest continuing pipeline integrity.

  3. Validation of an aggregate exposure model for substances in consumer products: a case study of diethyl phthalate in personal care products

    PubMed Central

    Delmaar, Christiaan; Bokkers, Bas; ter Burg, Wouter; Schuur, Gerlienke

    2015-01-01

    As personal care products (PCPs) are used in close contact with the person, they are a major source of consumer exposure to the chemical substances contained in them. The estimation of realistic consumer exposure to substances in PCPs is currently hampered by the lack of appropriate data and methods. To estimate the aggregate exposure of consumers to substances contained in PCPs, a person-oriented consumer exposure model has been developed (the Probabilistic Aggregate Consumer Exposure Model, PACEM). The model simulates daily exposure in a population based on product use data collected from a survey among the Dutch population. The model is validated by comparing diethyl phthalate (DEP) dose estimates to estimates based on biomonitoring data, and the two were found to compare well. This suggests that the person-oriented PACEM model is a practical tool for assessing realistic aggregate exposures to substances in PCPs. In the future, PACEM will be extended with use pattern data on other product groups, allowing aggregate exposure to substances in consumer products to be assessed across different product groups. PMID:25352161
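
    A toy person-oriented simulation in the spirit of the approach described (all use-pattern numbers below are hypothetical): each simulated person is assigned product-use events across several PCPs, and exposures are summed per person rather than per product.

      import numpy as np

      rng = np.random.default_rng(5)
      n_people = 50_000

      # Hypothetical use patterns for three PCPs: (P(use), amount g/use, uses/day)
      products = [(0.9, 1.5, 2.0),   # e.g. body lotion
                  (0.7, 0.5, 1.0),   # e.g. deodorant
                  (0.3, 0.2, 1.0)]   # e.g. fragrance
      frac_substance = 0.01          # substance weight fraction in each product
      bw = rng.normal(70.0, 12.0, n_people).clip(40)

      daily_dose = np.zeros(n_people)
      for p_use, amount, freq in products:
          users = rng.random(n_people) < p_use           # who uses this product
          grams = rng.lognormal(np.log(amount), 0.4, n_people) * freq
          daily_dose += users * grams * frac_substance * 1000.0 / bw  # mg/kg bw/day

      print(np.percentile(daily_dose, [50, 95]))   # aggregate exposure across products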

  4. Symmetric nonnegative matrix factorization: algorithms and applications to probabilistic clustering.

    PubMed

    He, Zhaoshui; Xie, Shengli; Zdunek, Rafal; Zhou, Guoxu; Cichocki, Andrzej

    2011-12-01

    Nonnegative matrix factorization (NMF) is an unsupervised learning method useful in various applications including image processing and semantic analysis of documents. This paper focuses on symmetric NMF (SNMF), which is a special case of NMF decomposition. Three parallel multiplicative update algorithms that directly use Level 3 Basic Linear Algebra Subprograms (BLAS) are developed for this problem. First, by minimizing the Euclidean distance, a multiplicative update algorithm is proposed, and its convergence under mild conditions is proved. Based on it, we further propose two fast parallel methods: the α-SNMF and β-SNMF algorithms. All of them are easy to implement. These algorithms are applied to probabilistic clustering. We demonstrate their effectiveness for facial image clustering, document categorization, and pattern clustering in gene expression.
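
    For orientation, a commonly used damped multiplicative update for symmetric NMF minimizing ‖A − HHᵀ‖_F² (a standard variant, not necessarily the α-/β-SNMF updates of this paper), applied to a toy clustering problem:

      import numpy as np

      def snmf(A, k, iters=500, seed=0):
          """Symmetric NMF: find H >= 0 with A ~ H @ H.T (damped multiplicative update)."""
          rng = np.random.default_rng(seed)
          n = A.shape[0]
          H = rng.random((n, k))
          for _ in range(iters):
              AH = A @ H
              HHtH = H @ (H.T @ H)
              H *= 0.5 + 0.5 * AH / np.maximum(HHtH, 1e-12)   # multiplicative update
          return H

      # Toy similarity matrix with two blocks; cluster = argmax over columns of H
      A = np.block([[np.full((5, 5), .9), np.full((5, 5), .1)],
                    [np.full((5, 5), .1), np.full((5, 5), .9)]])
      H = snmf(A, k=2)
      print(H.argmax(axis=1))   # probabilistic-clustering style assignment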

  5. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS's Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  6. Probabilistic Reversal Learning in Schizophrenia: Stability of Deficits and Potential Causal Mechanisms.

    PubMed

    Reddy, Lena Felice; Waltz, James A; Green, Michael F; Wynn, Jonathan K; Horan, William P

    2016-07-01

    Although individuals with schizophrenia show impaired feedback-driven learning on probabilistic reversal learning (PRL) tasks, the specific factors that contribute to these deficits remain unknown. Recent work has suggested several potential causes including neurocognitive impairments, clinical symptoms, and specific types of feedback-related errors. To examine this issue, we administered a PRL task to 126 stable schizophrenia outpatients and 72 matched controls, and patients were retested 4 weeks later. The task involved an initial probabilistic discrimination learning phase and subsequent reversal phases in which subjects had to adjust their responses to sudden shifts in the reinforcement contingencies. Patients showed poorer performance than controls for both the initial discrimination and reversal learning phases of the task, and performance overall showed good test-retest reliability among patients. A subgroup analysis of patients (n = 64) and controls (n = 49) with good initial discrimination learning revealed no between-group differences in reversal learning, indicating that the patients who were able to achieve all of the initial probabilistic discriminations were not impaired in reversal learning. Regarding potential contributors to impaired discrimination learning, several factors were associated with poor PRL, including higher levels of neurocognitive impairment, poor learning from both positive and negative feedback, and higher levels of indiscriminate response shifting. The results suggest that poor PRL performance in schizophrenia can be the product of multiple mechanisms. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  7. Probabilistic and structural reliability analysis of laminated composite structures based on the IPACS code

    NASA Technical Reports Server (NTRS)

    Sobel, Larry; Buttitta, Claudio; Suarez, James

    1993-01-01

    Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.

  8. Discounting of Monetary Rewards that are Both Delayed and Probabilistic: Delay and Probability Combine Multiplicatively, not Additively

    PubMed Central

    Vanderveldt, Ariana; Green, Leonard; Myerson, Joel

    2014-01-01

    The value of an outcome is affected both by the delay until its receipt (delay discounting) and by the likelihood of its receipt (probability discounting). Despite being well-described by the same hyperboloid function, delay and probability discounting involve fundamentally different processes, as revealed, for example, by the differential effects of reward amount. Previous research has focused on the discounting of delayed and probabilistic rewards separately, with little research examining more complex situations in which rewards are both delayed and probabilistic. In two experiments, participants made choices between smaller rewards that were both immediate and certain and larger rewards that were both delayed and probabilistic. Analyses revealed significant interactions between delay and probability factors inconsistent with an additive model. In contrast, a hyperboloid discounting model in which delay and probability were combined multiplicatively provided an excellent fit to the data. These results suggest that the hyperboloid is a good descriptor of decision making in complicated monetary choice situations like those people encounter in everyday life. PMID:24933696
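
    A minimal version of the multiplicative hyperboloid model described (parameter values illustrative, and a single exponent s is shared across both factors here for simplicity): subjective value discounts jointly in delay D and odds-against Θ = (1 − p)/p.

      def value(amount, delay, p, k=0.05, h=1.2, s=0.8):
          """Multiplicative hyperboloid: V = A / [(1 + k*D)^s * (1 + h*theta)^s]."""
          theta = (1.0 - p) / p                  # odds against receiving the reward
          return amount / ((1.0 + k * delay) ** s * (1.0 + h * theta) ** s)

      # $100 in 30 days with p = 0.5, versus a sure, immediate $40
      print(value(100.0, delay=30.0, p=0.5), value(40.0, delay=0.0, p=1.0))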

  9. Nonylphenol in pregnant women and their matching fetuses: Placental transfer and potential risks of infants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yu-Fang; Department of Education and Research, Taipei City Hospital, Taipei, Taiwan; Wang, Pei-Wei

    As the predominant environmental biodegradation product of nonylphenol (NP) ethoxylates and with proven estrogenic effects, NP is formed during the alkylation process of phenols. The purposes of this study were (1) to examine maternal and prenatal exposure to NP in Taiwan, (2) to determine the level of placental protection against NP exposure as well as the level of NP in breast milk, and (3) to assess the potential risk for breastfed newborns exposed to NP through the milk. Thirty pairs of maternal and fetal blood samples, placenta, and breast milk during the 1st and the 3rd months of lactation were collected. NP exposures in these specimens were then analyzed using high-performance liquid chromatography coupled with fluorescence detection. Next, socio-demographics, lifestyle, delivery method, dietary and work history were collected using a questionnaire. In addition, the daily intake of NP by newborns from consuming breast milk in the 1st and 3rd months was studied through deterministic and probabilistic risk assessment methods. The geometric means and geometric standard deviations of NP levels in maternal blood, fetal cord blood, placenta, and breast milk in the 1st and 3rd months were 14.6 (1.7) ng/ml, 18.8 (1.8) ng/ml, 19.8 (1.9) ng/g, 23.5 (3.2) ng/ml, and 57.3 (1.4) ng/ml, respectively. The probabilistic percentiles (50th, 75th, and 95th) of daily NP intake via breast milk were 4.33, 7.79, and 18.39 μg/kg-bw/day in the 1st month, respectively, and 8.11, 10.78, and 16.08 μg/kg-bw/day in the 3rd month, respectively. The probabilistic distributions (5th, 25th, and 50th percentiles) of risk for infants aged 1 month were 0.27, 0.64, and 1.15, respectively, and those for infants aged 3 months were 0.31, 0.46, and 0.62, respectively. Through repeated exposure from the dietary intake of expectant mothers, fetuses could encounter a high NP exposure level due to transplacental absorption, partitioning between the maternal and fetal compartments. Daily NP intake via breast milk in three-month-old babies exceeded the tolerable daily intake (TDI) of 5 μg/kg bw/day, indicating a potential risk for Taiwanese infants. - Highlights: • A cohort of pregnant women was established and followed until delivery. • The pregnant and lactating mothers and their infants were exposed to NP. • Fetuses in Taiwan showed high NP milk level. • Daily NP intake via breast milk indicated a potential risk for Taiwan infants.
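
    A hedged sketch of the final risk step (the milk-intake and body-weight distributions are hypothetical; only the milk concentration is anchored to the reported geometric mean and geometric SD): Monte Carlo daily intake via breast milk divided by the 5 μg/kg bw/day TDI gives a hazard-quotient distribution.

      import numpy as np

      rng = np.random.default_rng(9)
      n = 100_000

      # Lognormal milk concentration anchored to the reported GM 57.3 ng/ml
      # with geometric SD 1.4 (3rd month of lactation)
      c_milk = rng.lognormal(np.log(57.3), np.log(1.4), n)   # ng/ml
      v_milk = rng.normal(0.75, 0.15, n).clip(0.3)           # L/day (hypothetical)
      bw     = rng.normal(6.0, 0.8, n).clip(3.5)             # infant weight, kg

      # ng/ml x L/day: the two factors of 1,000 cancel, giving ug/day
      intake = c_milk * v_milk / bw                          # ug/kg bw/day
      hq = intake / 5.0                                      # hazard quotient vs TDI

      print(np.percentile(hq, [5, 50, 95]), np.mean(hq > 1.0))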

  10. Participatory probabilistic assessment of the risk to human health associated with cryptosporidiosis from urban dairying in Dagoretti, Nairobi, Kenya.

    PubMed

    Grace, Delia; Monda, Joseph; Karanja, Nancy; Randolph, Thomas F; Kang'ethe, Erastus K

    2012-09-01

    We carried out a participatory risk assessment to estimate the risk (negative consequences and their likelihood) from zoonotic Cryptosporidium originating in dairy farms in urban Dagoretti, Nairobi, to dairy farm households and their neighbours. We selected 20 households at high risk for Cryptosporidium from a larger sample of 300 dairy households in Dagoretti, based on the risk factors present. We then conducted a participatory mapping of the flow of the hazard from its origin (cattle) to potential human victims. This showed three main exposure pathways (food- and water-borne, occupational, and recreational) and was used to develop a fault tree model, which we parameterised using information from the study and the literature. A stochastic simulation was used to estimate the probability of exposure to zoonotic cryptosporidiosis originating from urban dairying. Around 6% of environmental samples were positive for Cryptosporidium. The probability of exposure to Cryptosporidium from dairy cattle ranged from 0.0055 for people with clinical acquired immunodeficiency syndrome in non-dairy households to 0.0102 for children under 5 years from dairy households. Most of the estimated health burden was borne by children. Although dairy cattle are the source of Cryptosporidium, the model suggests consumption of vegetables is a greater source of risk than consumption of milk. In conclusion, by combining participatory methods with quantitative microbial risk assessment, we were able to rapidly, and with appropriate 'imprecision', investigate health risks to communities from Cryptosporidium and identify the most vulnerable groups and the most risky practices.

  11. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions

    PubMed Central

    2017-01-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support the clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469

  12. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
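
    A schematic version of such an analysis, with a simple ideal Brayton-cycle efficiency standing in for the engine model and hypothetical input distributions: Monte Carlo propagation yields points on the efficiency CDF, and rank correlations serve as crude probabilistic sensitivity factors.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(13)
      n = 20_000

      # Hypothetical uncertain cycle parameters
      r     = rng.normal(18.0, 0.8, n)     # compressor pressure ratio
      gamma = rng.normal(1.40, 0.01, n)    # specific heat ratio

      # Ideal Brayton-cycle thermal efficiency as a stand-in engine model
      eta = 1.0 - r ** (-(gamma - 1.0) / gamma)

      print(np.percentile(eta, [5, 50, 95]))   # points on the efficiency CDF

      # Crude probabilistic sensitivity factors via rank correlation
      for name, x in [("pressure ratio", r), ("gamma", gamma)]:
          print(name, stats.spearmanr(x, eta)[0])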

  13. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.

  14. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.

    PubMed

    Nantha, Yogarabindranath Swarna

    2017-11-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support the clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.

  15. Influences of geological parameters on the probabilistic assessment of slope stability of an embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. The stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation in soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to perform the analysis of the critical slip surface. A sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the factor of safety of the slope.
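
    A compact illustration of this kind of analysis, using the infinite-slope expression FS = (c + γ z cos²β tanφ) / (γ z sinβ cosβ) with hypothetical normally distributed soil parameters (not the embankment of the study):

      import numpy as np

      rng = np.random.default_rng(21)
      n = 200_000

      # Hypothetical normally distributed soil properties
      c     = rng.normal(10.0, 2.0, n).clip(0)        # cohesion, kPa
      phi   = np.radians(rng.normal(30.0, 3.0, n))    # internal friction angle
      gamma = rng.normal(18.0, 1.0, n)                # unit weight, kN/m^3

      z, beta = 5.0, np.radians(25.0)                 # slip depth m, slope angle

      # Infinite-slope factor of safety (dry slope, no pore-water pressure)
      fs = (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (
           gamma * z * np.sin(beta) * np.cos(beta))

      print(fs.mean(), np.mean(fs < 1.0))   # mean FS and probability of failure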

  16. He nui na ala e hiki aku ai: Factors Influencing Phonetic Variation in the Hawaiian Word "keia"

    ERIC Educational Resources Information Center

    Drager, Katie; Comstock, Bethany Kaleialohapau'ole Chun; Kneubuhl, Hina Puamohala

    2017-01-01

    Apart from a handful of studies (e.g., Kinney 1956), linguists know little about what variation exists in Hawaiian and what factors constrain the variation. In this paper, we present an analysis of phonetic variation in the word "keia," meaning "this," examining the social, linguistic, and probabilistic factors that constrain…

  17. Probability versus representativeness in infancy: can infants use naïve physics to adjust population base rates in probabilistic inference?

    PubMed

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-08-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  18. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    PubMed

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
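
    A compact reconstruction of the derivation under the assumptions stated above (the intensity symbols a, m and c are mine, not the paper's): Bernoulli thinning of a Poisson count yields another Poisson count, and the PGFs of independent counts multiply.

      % Lethal lesions N = N_irrep + N_conv.  N_irrep ~ Poisson(aD);
      % repairable lesions ~ Poisson(mD), each lethally converted with
      % probability cD, so N_conv ~ Poisson(mcD^2) by Bernoulli thinning.
      % PGFs of independent Poisson counts multiply:
      G(s) = e^{aD(s-1)} \, e^{mcD^{2}(s-1)}
           = \exp\bigl\{(\alpha D + \beta D^{2})(s-1)\bigr\},
      \qquad \alpha = a, \; \beta = mc,
      % and survival is the zero-lesion probability:
      S(D) = \Pr(N = 0) = G(0) = \exp\bigl\{-(\alpha D + \beta D^{2})\bigr\}.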

  19. Risk assessment of population inhalation exposure to volatile organic compounds and carbonyls in urban China.

    PubMed

    Du, Zhengjian; Mo, Jinhan; Zhang, Yinping

    2014-12-01

    Over the past three decades, China has experienced rapid urbanization. The risks to its urban population posed by inhalation exposure to hazardous air pollutants (HAPs) have not been well characterized. Here, we summarize recent measurements of 16 highly prevalent HAPs in urban China and compile their distribution inputs. Based on the activity patterns of urban Chinese working adults, we derive personal exposures. Using a probabilistic risk assessment method, we determine cancer and non-cancer risks for working females and males. We also assess the uncertainty associated with the risk estimates using Monte Carlo simulation, accounting for variations in HAP concentrations, cancer potency factors (CPFs) and inhalation rates. Average total lifetime cancer risks attributable to HAPs are 2.27×10⁻⁴ (2.27 additional cases per 10,000 people exposed) and 2.93×10⁻⁴ for Chinese urban working females and males, respectively. Formaldehyde, 1,4-dichlorobenzene, benzene and 1,3-butadiene are the major risk contributors, yielding the highest median cancer risk estimates (>1×10⁻⁵). About 70% of the risk is due to exposures occurring in homes. Outdoor sources contribute most to the risk of benzene, ethylbenzene and carbon tetrachloride, while indoor sources dominate for all other compounds. Chronic exposure limits are not exceeded for non-carcinogenic effects, except for formaldehyde. Risks are overestimated if variation is not accounted for. Sensitivity analyses demonstrate that the major contributors to total variance are the range of inhalation rates, the CPFs of formaldehyde, 1,4-dichlorobenzene, benzene and 1,3-butadiene, and indoor home concentrations of formaldehyde and benzene. Despite uncertainty, risks exceeding the acceptable benchmark of 1×10⁻⁶ suggest actions to reduce exposures. Future efforts should be directed toward large-scale measurements of air pollutant concentrations, refinement of CPFs and investigation of population exposure parameters. The present study is a first effort to estimate the carcinogenic and non-carcinogenic risks of inhalation exposure to HAPs for the large working populations of Chinese cities. Copyright © 2014 Elsevier Ltd. All rights reserved.
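
    A schematic Monte Carlo risk calculation of the type described, for a single compound with entirely hypothetical inputs: a time-weighted exposure concentration scaled by inhalation rate and body weight gives a lifetime average daily dose, which a potency factor converts to risk.

      import numpy as np

      rng = np.random.default_rng(17)
      n = 100_000

      # Hypothetical distributions for one HAP (formaldehyde-like, illustrative)
      c_home = rng.lognormal(np.log(40.0), 0.5, n)   # home concentration, ug/m^3
      c_out  = rng.lognormal(np.log(8.0), 0.5, n)    # outdoor concentration, ug/m^3
      f_home, f_out = 0.65, 0.10                     # fraction of time in each setting
      ir = rng.normal(14.0, 2.5, n).clip(8)          # inhalation rate, m^3/day
      bw = rng.normal(62.0, 10.0, n).clip(40)        # body weight, kg
      cpf = rng.lognormal(np.log(0.045), 0.3, n)     # cancer potency, (mg/kg-day)^-1

      exposure = c_home * f_home + c_out * f_out     # time-weighted ug/m^3
      ladd = exposure * ir / bw / 1000.0             # lifetime avg daily dose, mg/kg-day
      risk = ladd * cpf

      print(np.mean(risk), np.mean(risk > 1e-6))     # mean risk, P(risk > benchmark)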

  20. Quantification of Campylobacter jejuni contamination on chicken carcasses in France.

    PubMed

    Duqué, Benjamin; Daviaud, Samuel; Guillou, Sandrine; Haddad, Nabila; Membré, Jeanne-Marie

    2018-04-01

    Highly prevalent in poultry, Campylobacter is a foodborne pathogen which remains the primary cause of enteritis in humans. Several studies have determined the prevalence and contamination level of this pathogen throughout the food chain; however, this is generally done in a deterministic way, without considering the heterogeneity of contamination levels. The purpose of this study was to quantify, using probabilistic tools, the contamination level of Campylobacter spp. on chicken carcasses after the air-chilling step in several slaughterhouses in France. From a dataset (530 observations) containing censored data (concentrations <10 CFU/g), several factors were considered, including the month of sampling, the farming method (standard vs. certified) and the sampling area (neck vs. leg). All probabilistic analyses were performed in R using the fitdistrplus, mc2d and NADA packages. The uncertainty (i.e. error) generated by the presence of censored data was small (ca. 1 log₁₀) in comparison to the variability (i.e. heterogeneity) of the contamination level (3 log₁₀ or more), strengthening the probabilistic analysis and facilitating result interpretation. The sampling period and sampling area (neck/leg) had a significant effect on the Campylobacter contamination level. More precisely, two "seasons" were distinguished: one from January to May, another from June to December. During the June-to-December season, the mean Campylobacter concentration was estimated at 2.6 [2.4; 2.8] log₁₀(CFU/g) and 1.8 [1.5; 2.0] log₁₀(CFU/g) for neck and leg, respectively, and the probability of exceeding 1,000 CFU/g (the upper limit of the European microbial criterion) at 35.3% and 12.6%, respectively. In contrast, during the January-to-May season, the mean contamination level was estimated at 1.0 [0.6; 1.3] log₁₀(CFU/g) and 0.6 [0.3; 0.9] log₁₀(CFU/g) for neck and leg, respectively, and the probability of exceeding 1,000 CFU/g at 13.5% and 2.0%, respectively. An accurate quantification of the contamination level enables processors to better adapt their processing and hygiene practices. These results will also help in refining exposure assessment models. Copyright © 2017 Elsevier Ltd. All rights reserved.
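
    A minimal sketch of handling such left-censored concentration data by maximum likelihood (in Python here, although the study used R packages; the data values are hypothetical): detected observations contribute the normal PDF, censored ones the CDF at the limit of detection.

      import numpy as np
      from scipy import stats, optimize

      # Hypothetical log10 concentrations; four samples fell below 10 CFU/g
      detected = np.array([2.1, 2.6, 3.0, 1.8, 2.4, 2.9, 3.3])
      n_censored, lod = 4, 1.0     # censored count and log10 detection limit

      def neg_loglik(theta):
          mu, sigma = theta
          if sigma <= 0:
              return np.inf
          ll = stats.norm.logpdf(detected, mu, sigma).sum()       # detected values
          ll += n_censored * stats.norm.logcdf(lod, mu, sigma)    # censored values
          return -ll

      res = optimize.minimize(neg_loglik, x0=[2.0, 1.0], method="Nelder-Mead")
      print(res.x)   # MLE of the mean and sd of the log10 contamination level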

  1. Mechanical failure probability of glasses in Earth orbit

    NASA Technical Reports Server (NTRS)

    Kinser, Donald L.; Wiedlocher, David E.

    1992-01-01

    Results of five years of earth-orbital exposure on the mechanical properties of glasses indicate that, for the glasses examined, radiation effects on mechanical properties are less than the probable error of measurement. During the 5-year exposure, seven micrometeorite or space debris impacts occurred on the samples examined. These impacts were located in areas that were not subjected to effective mechanical testing, so only limited information on their influence on mechanical strength was obtained. Combining these results with the micrometeorite and space debris impact frequencies obtained by other experiments permits estimates of the failure probability of glasses exposed to mechanical loading under earth-orbit conditions. This probabilistic failure prediction is described and illustrated with examples.

  2. Probability of growth of small damage sites on the exit surface of fused silica optics.

    PubMed

    Negres, Raluca A; Abdulla, Ghaleb M; Cross, David A; Liao, Zhi M; Carr, Christopher W

    2012-06-04

    Growth of laser damage on fused silica optical components depends on several key parameters including laser fluence, wavelength, pulse duration, and site size. Here we investigate the growth behavior of small damage sites on the exit surface of SiO₂ optics under exposure to tightly controlled laser pulses. Results demonstrate that the onset of damage growth is not governed by a threshold, but is probabilistic in nature and depends both on the current size of a damage site and the laser fluence to which it is exposed. We also develop models for use in growth prediction. In addition, we show that laser exposure history also influences the behavior of individual sites.

  3. [D.lgsl. 625/1994--Protection against carcinogenic agents: the obligation to educate].

    PubMed

    Fucci, P; Anselmi, E; Bracci, C; Comba, P

    1997-01-01

    According to act 626/1994, employers have a duty to inform and train workers and their representatives. The implementation of training activities requires the following: planning the training program according to the needs of the target population, use of methods aimed at promoting learning and the adoption of safe behaviour, and setting up evaluation tools. The disciplines of risk perception and communication and of adult training may provide useful contributions in this frame. In light of preliminary experiences in this field, the importance of the following items for workers, workers' representatives and employers is emphasized: probabilistic causality models; the role of cognitive and emotional factors in the learning process; the definition of carcinogens according to national and international organisations; the meaning of TLVs with respect to carcinogen exposure; interactions between carcinogens in the case of multiple exposures; risk evaluation; preventive measures; and the transfer of carcinogen risk from the workplace to the domestic environment due to lack of compliance with basic hygiene rules, such as the proper use of work clothes.

  4. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Astrophysics Data System (ADS)

    Godines, Cody R.; Manteufel, Randall D.

    2002-12-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA, providing a large number of students with exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of the NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments: the integration of a new sampling algorithm into NESSUS and the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.
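
    A small comparison in the spirit of the enhancement described, using SciPy's Latin hypercube sampler (not the NESSUS implementation) to estimate the mean of a simple toy response with LHS and with plain Monte Carlo:

      import numpy as np
      from scipy import stats
      from scipy.stats import qmc

      rng = np.random.default_rng(4)
      n, reps = 256, 200

      def response(u):
          """Toy response of two standard-normal inputs mapped from the unit square."""
          x = stats.norm.ppf(u)
          return x[:, 0] ** 2 + 0.5 * x[:, 1]

      mc_means, lhs_means = [], []
      for i in range(reps):
          mc_means.append(response(rng.random((n, 2))).mean())
          sampler = qmc.LatinHypercube(d=2, seed=i)
          lhs_means.append(response(sampler.random(n)).mean())

      # LHS typically shows a smaller spread of the mean estimate than plain MC
      print(np.std(mc_means), np.std(lhs_means))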

  5. Probabilistic Analysis and Density Parameter Estimation Within Nessus

    NASA Technical Reports Server (NTRS)

    Godines, Cody R.; Manteufel, Randall D.; Chamis, Christos C. (Technical Monitor)

    2002-01-01

    This NASA educational grant has the goal of promoting probabilistic analysis methods to undergraduate and graduate UTSA engineering students. Two undergraduate-level and one graduate-level course were offered at UTSA, providing a large number of students with exposure to and experience in probabilistic techniques. The grant provided two research engineers from Southwest Research Institute the opportunity to teach these courses at UTSA, thereby exposing a large number of students to practical applications of probabilistic methods and state-of-the-art computational methods. In classroom activities, students were introduced to the NESSUS computer program, which embodies many algorithms in probabilistic simulation and reliability analysis. Because the NESSUS program is used at UTSA in both student research projects and selected courses, a student version of the NESSUS manual has been revised and improved, with additional example problems being added to expand the scope of the example application problems. This report documents two research accomplishments: the integration of a new sampling algorithm into NESSUS and the testing of the new algorithm. The new Latin Hypercube Sampling (LHS) subroutines use the latest NESSUS input file format and specific files for writing output. The LHS subroutines are called early in the program so that no unnecessary calculations are performed. Proper correlation between sets of multidimensional coordinates can be obtained by using NESSUS' LHS capabilities. Finally, two types of correlation are written to the appropriate output file. The program enhancement was tested by repeatedly estimating the mean, standard deviation, and 99th percentile of four different responses using Monte Carlo (MC) and LHS. These test cases, put forth by the Society of Automotive Engineers, are used to compare probabilistic methods. For all test cases, it is shown that LHS has a lower estimation error than MC when used to estimate the mean, standard deviation, and 99th percentile of the four responses at the 50 percent confidence level and using the same number of response evaluations for each method. In addition, LHS requires fewer calculations than MC in order to be 99.7 percent confident that a single mean, standard deviation, or 99th percentile estimate will be within at most 3 percent of the true value of each parameter. Again, this is shown for all of the test cases studied. For that reason it can be said that NESSUS is an important reliability tool that has a variety of sound probabilistic methods a user can employ; furthermore, the newest LHS module is a valuable new enhancement of the program.

  6. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow with Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density function of predicted aerodynamic damping and frequency for flutter and the response amplitudes for forced response.
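
    The MISER and FPI codes are not publicly available, so the sketch below substitutes a plain Monte Carlo loop and a hypothetical linear damping surrogate for the six random parameters named above; all means, coefficients of variation, and surrogate coefficients are invented for illustration. It produces the same kinds of outputs the abstract describes: a distribution of aerodynamic damping, a flutter probability, and crude input sensitivities.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical means and a common coefficient of variation for the six
# random design parameters named in the abstract (values illustrative only).
names = ["SBYC", "GAMA", "ELAXS", "MACH", "MASSR", "WHWB"]
mean = np.array([0.8, 45.0, 0.45, 0.7, 50.0, 0.6])
x = rng.normal(mean, 0.05 * np.abs(mean), size=(n, 6))

def damping(x):
    """Hypothetical surrogate for an aerodynamic damping output; a real
    study would call the aeroelastic code here."""
    s, g, e, m, mu, w = x.T
    return (0.02 + 0.05 * (s - 0.8) - 0.001 * (g - 45.0) + 0.1 * (e - 0.45)
            - 0.15 * (m - 0.7) + 0.0004 * (mu - 50.0) - 0.06 * (w - 0.6))

d = damping(x)
print(f"damping mean {d.mean():.4f}, std {d.std():.4f}")
print(f"P(flutter) = P(damping < 0) ~ {np.mean(d < 0):.4f}")

# crude probabilistic sensitivities: correlation of each input with output
for j, nm in enumerate(names):
    print(f"{nm:6s} sensitivity (corr) {np.corrcoef(x[:, j], d)[0, 1]:+.3f}")
```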

  7. On the Accuracy of Probabilistic Buckling Load Prediction

    NASA Technical Reports Server (NTRS)

    Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.

    2001-01-01

    The buckling strength of thin-walled stiffened or unstiffened, metallic or composite shells is of major concern in aeronautical and space applications. The difficulty of predicting the behavior of axially compressed thin-walled cylindrical shells continues to worry design engineers as we enter the third millennium. Thanks to extensive research programs in the late sixties and early seventies and the contributions of many eminent scientists, it is known that buckling strength calculations are affected by uncertainties in the definition of the parameters of the problem, such as the definition of loads, material properties, geometric variables, and edge support conditions, and by the accuracy of the engineering models and analysis tools used in the design phase. The NASA design criteria monographs from the late sixties account for these design uncertainties by the use of a lump-sum safety factor. This so-called empirical knockdown factor gamma usually results in overly conservative designs. Recently, new reliability-based probabilistic design procedures for buckling-critical imperfect shells have been proposed. They essentially consist of a stochastic approach that introduces an improved 'scientific knockdown factor' lambda(sub a), which is not as conservative as the traditional empirical one. To incorporate probabilistic methods into a high-fidelity analysis approach, one must be able to assess the accuracy of the various steps that must be executed to complete a reliability calculation. In the present paper, the effect of the size of the experimental input sample on the predicted value of the scientific knockdown factor lambda(sub a), calculated by the First-Order Second-Moment Method, is investigated.
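
    A rough way to see the sample-size effect studied here: under a first-order second-moment treatment, a reliability-based knockdown factor can be taken as the sample mean of the normalized experimental buckling loads minus a reliability multiple of the sample standard deviation. The sketch below, with an invented "true" load distribution and a 0.999 reliability target, shows how the scatter of that estimate shrinks as the experimental input sample grows; it is a sketch of the idea, not the paper's procedure.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Hypothetical "true" population of normalized experimental buckling loads
mu_true, sd_true = 0.70, 0.08   # illustrative values, not from the paper
z = norm.ppf(0.999)             # multiplier for a 0.999 reliability target

def lambda_a(sample):
    """First-order second-moment knockdown: mean minus z times the std
    of the normalized buckling loads in the experimental input sample."""
    return sample.mean() - z * sample.std(ddof=1)

for n in (5, 10, 30, 100):
    est = np.array([lambda_a(rng.normal(mu_true, sd_true, n)) for _ in range(2000)])
    print(f"n={n:4d}  lambda_a mean {est.mean():.3f}  spread (std) {est.std():.3f}")
```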

  8. Assessing the empirical validity of the "take-the-best" heuristic as a model of human probabilistic inference.

    PubMed

    Bröder, A

    2000-09-01

    The boundedly rational 'Take-The-Best' heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model have been lacking because of several methodological problems. This question was addressed in 4 experiments with a total of 210 participants. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that information search costs appear to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.
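
    TTB itself is easy to state in code: compare two objects cue by cue in order of cue validity and decide on the first cue that discriminates, guessing only if none does. The sketch below is one common formulation (treatments of unknown cue values differ across the literature); the city-size cues and values are hypothetical.

```python
import random

def take_the_best(a, b, cues, rng=random):
    """Decide which of two objects scores higher on a criterion.
    a, b: dicts mapping cue name -> 1 (positive), 0 (negative), or None.
    cues: cue names ordered by descending validity.
    Returns 'a', 'b', or a random guess if no cue discriminates."""
    for cue in cues:
        va, vb = a.get(cue), b.get(cue)
        if va == 1 and vb != 1:
            return "a"
        if vb == 1 and va != 1:
            return "b"
    return rng.choice(["a", "b"])   # one-reason decision failed: guess

# hypothetical city-size cues, ordered by validity
cues = ["capital", "exposition_site", "soccer_team", "intercity_train"]
a = {"capital": 0, "exposition_site": 1, "soccer_team": 1}
b = {"capital": 0, "exposition_site": 0, "soccer_team": 1}
print(take_the_best(a, b, cues))   # 'a': the first discriminating cue decides
```

    The noncompensatory character is visible in the control flow: once a cue discriminates, later cues are never consulted, no matter how many of them favor the other object.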

  9. Inherent limitations of probabilistic models for protein-DNA binding specificity

    PubMed Central

    Ruan, Shuxiang

    2017-01-01

    The specificities of transcription factors are most commonly represented with probabilistic models. These models provide a probability for each base occurring at each position within the binding site and the positions are assumed to contribute independently. The model is simple and intuitive and is the basis for many motif discovery algorithms. However, the model also has inherent limitations that prevent it from accurately representing true binding probabilities, especially for the highest affinity sites under conditions of high protein concentration. The limitations are not due to the assumption of independence between positions but rather are caused by the non-linear relationship between binding affinity and binding probability and the fact that independent normalization at each position skews the site probabilities. Generally probabilistic models are reasonably good approximations, but new high-throughput methods allow for biophysical models with increased accuracy that should be used whenever possible. PMID:28686588
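
    The saturation argument can be reproduced in a few lines: score a site with a toy position probability matrix, treat the negative log probability as a binding energy, and pass it through the usual two-state occupancy function. At a high chemical potential (roughly, high protein concentration), sites with very different PWM probabilities approach the same occupancy, which is the mis-ranking the abstract describes. The matrix, sites, and mu values below are illustrative.

```python
import numpy as np

bases = "ACGT"
# toy position probability matrix for a 4-bp site (rows: positions)
ppm = np.array([
    [0.7, 0.1, 0.1, 0.1],   # position 1: prefers A
    [0.1, 0.7, 0.1, 0.1],   # position 2: prefers C
    [0.1, 0.1, 0.7, 0.1],   # position 3: prefers G
    [0.1, 0.1, 0.1, 0.7],   # position 4: prefers T
])

def site_prob(site):
    """Probabilistic (PWM) model: independent positions, product of probs."""
    return np.prod([ppm[i, bases.index(b)] for i, b in enumerate(site)])

def occupancy(site, mu):
    """Biophysical model: energy ~ -log PWM prob; occupancy saturates."""
    energy = -np.log(site_prob(site))
    return 1.0 / (1.0 + np.exp(energy - mu))  # mu ~ log protein concentration

for site in ("ACGT", "ACGA", "TCGA"):
    print(site, f"PWM prob {site_prob(site):.4f}",
          f"occupancy(mu=3) {occupancy(site, 3.0):.3f}",
          f"occupancy(mu=8) {occupancy(site, 8.0):.3f}")
```

    At mu = 8 the two strongest sites are both nearly fully occupied even though their PWM probabilities differ several-fold, illustrating why a linear probabilistic model cannot represent true binding probabilities for high-affinity sites at high protein concentration.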

  10. Fast, noise-free memory for photon synchronization at room temperature.

    PubMed

    Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer

    2018-01-01

    Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe a short-time external efficiency of 25%, a memory lifetime (1/e) of 86 ns, and below 10^-4 added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room-temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.

  11. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  12. Quantitative risk assessment of the aggregate dermal exposure to the sensitizing fragrance geraniol in personal care products and household cleaning agents.

    PubMed

    Nijkamp, M M; Bokkers, B G H; Bakker, M I; Ezendam, J; Delmaar, J E

    2015-10-01

    A quantitative risk assessment was performed to establish whether consumers are at risk of being dermally sensitized by the fragrance geraniol. Aggregate dermal exposure to geraniol was estimated using the Probabilistic Aggregate Consumer Exposure Model, containing data on the use of personal care products and household cleaning agents. Consumer exposure to geraniol via personal care products appeared to be higher than via household cleaning agents. The hands were the body parts receiving the highest exposure to geraniol. Dermal sensitization studies were assessed to derive the point of departure needed for the estimation of the Acceptable Exposure Level (AEL). Two concentrations were derived, one based on human studies and the other from dose-response analysis of the available murine local lymph node assay data. The aggregate dermal exposure assessment resulted in body-part-specific median exposures of up to 0.041 μg/cm² (highest exposure 102 μg/cm²) for the hands. Comparing the exposure to the lowest AEL (55 μg/cm²) shows that 0.02-0.86% of the population may have an aggregate exposure which exceeds the AEL. Furthermore, it is demonstrated that personal care products contribute more to the consumer's geraniol exposure than household cleaning agents. Copyright © 2015 Elsevier Inc. All rights reserved.
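
    The exceedance calculation in such an assessment reduces to summing per-product exposure distributions and comparing the aggregate with the AEL. The sketch below uses invented lognormal contributions (not the PACEM inputs or the paper's fitted distributions) together with the 55 μg/cm² AEL quoted above.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical lognormal exposure contributions to the hands (ug/cm^2);
# the parameters are illustrative assumptions only.
pcp = rng.lognormal(mean=np.log(0.03), sigma=1.2, size=n)   # personal care
hcp = rng.lognormal(mean=np.log(0.005), sigma=1.0, size=n)  # household cleaning
aggregate = pcp + hcp

AEL = 55.0  # lowest acceptable exposure level from the abstract, ug/cm^2
print(f"median aggregate exposure {np.median(aggregate):.3f} ug/cm^2")
print(f"fraction exceeding AEL    {np.mean(aggregate > AEL):.5f}")
print(f"share from personal care  {pcp.sum() / aggregate.sum():.2%}")
```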

  13. European consumer exposure to cosmetic products, a framework for conducting population exposure assessments.

    PubMed

    Hall, B; Tozer, S; Safford, B; Coroama, M; Steiling, W; Leneveu-Duchemin, M C; McNamara, C; Gibney, M

    2007-11-01

    Access to reliable exposure data is essential to evaluate the toxicological safety of ingredients in cosmetic products. This study was carried out by European cosmetic manufacturers acting within the trade association Colipa, with the aim of constructing a probabilistic European population model of exposure. The study updates, in distribution form, the current exposure data on daily quantities of six cosmetic products. Data were collected using a combination of market information databases and a controlled product use study. In total, 44,100 households and 18,057 individual consumers in five European countries provided data using their own products. All product use occasions were recorded, including those outside of home. The raw data were analysed using Monte Carlo simulation, and a European Statistical Population Model of exposure was constructed. A significant finding was an inverse correlation between frequency of product use and quantity used per application for body lotion, facial moisturiser, toothpaste and shampoo. Thus, it is not appropriate to calculate daily exposure to these products by multiplying the maximum frequency value by the maximum quantity per event value. The results largely confirm the exposure parameters currently used by the cosmetic industry. The design of this study could serve as a model for future assessments of population exposure to chemicals in products other than cosmetics.

  14. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  15. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons. (1) They fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available for practitioners. As vivid proof of concept, we illustrate our key points on a pumped karstic well catchment, located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area at no increase in delineated area. This is achieved by simply swapping delineated low-risk areas against previously non-delineated high-risk areas. Also, we show that further improvements may often be available at only low additional delineation area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits, if the land is priced, or if land owners need to be compensated for land use restrictions.

  16. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed using the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type, except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate the research and education components of this project, resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.

  17. Probabilistic Model for Laser Damage to the Human Retina

    DTIC Science & Technology

    2012-03-01

    the beam. Power density may be measured in radiant exposure, J/cm2, or by irradiance, W/cm2. In the experimental database used in this study and...to quantify a binary response, either lethal or non-lethal, within a population such as insects or rats. In directed energy research, probit...value of the normalized Arrhenius damage integral. In a one-dimensional simulation, the source term is determined as a spatially averaged irradiance (W

  18. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, both considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
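
    Separating variability from uncertainty is usually implemented as a two-dimensional (nested) Monte Carlo: an outer loop samples uncertain distribution parameters, and an inner loop samples variability given those parameters. A minimal sketch with invented lognormal parameters follows; it reports an uncertainty interval around the variability-driven fraction of cases whose concentration ratio exceeds 1.

```python
import numpy as np

rng = np.random.default_rng(4)
n_unc, n_var = 500, 2000   # outer loop: uncertainty; inner loop: variability

frac_at_risk = np.empty(n_unc)
for i in range(n_unc):
    # UNCERTAINTY: imperfect knowledge of the distribution parameters (illustrative)
    med_exp = rng.lognormal(np.log(1.0), 0.3)    # median exposure conc., ug/L
    sd_log  = rng.uniform(0.6, 1.0)              # spread of exposure variability
    cec     = rng.lognormal(np.log(50.0), 0.4)   # critical effect concentration
    # VARIABILITY: across environments/organisms, given those parameters
    exposure = rng.lognormal(np.log(med_exp), sd_log, n_var)
    frac_at_risk[i] = np.mean(exposure / cec > 1.0)   # concentration ratio > 1

lo, med, hi = np.percentile(frac_at_risk, [5, 50, 95])
print(f"fraction with CR > 1: median {med:.4f}, 90% uncertainty interval [{lo:.4f}, {hi:.4f}]")
```

    The inner loop's spread reflects variability; the spread of the outer-loop results reflects uncertainty, which is exactly the decomposition the abstract describes.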

  19. Estimation of the dietary acrylamide exposure of the Polish population.

    PubMed

    Mojska, Hanna; Gielecińska, Iwona; Szponar, Lucjan; Ołtarzewski, Maciej

    2010-01-01

    The objective of our study was to determine the acrylamide content of Polish foods and to assess the average dietary acrylamide exposure of the Polish population. We analysed the acrylamide content of Polish food using a GCQ-MS/MS method. The daily dietary acrylamide exposure was computed with a probabilistic approach for the total Polish population (1-96 years) and for the age groups 1-6, 7-18 and 19-96 years, using the Monte Carlo simulation technique. To assess the Polish population's exposure to acrylamide present in food, food consumption data were taken from the 'Household Food Consumption and Anthropometric Survey in Poland'. The mean acrylamide content of the 225 samples of foodstuffs, taken randomly all over Poland, ranged widely from 11 to 3647 microg/kg of product. For the total Polish population (1-96 years), the estimated mean acrylamide exposure is 0.43 microg/kg of body weight per day. The main sources of dietary acrylamide in the Polish population were bread (45% of total dietary acrylamide intake), French fries and potato crisps (23%), and roasted coffee (19%). Copyright (c) 2010 Elsevier Ltd. All rights reserved.
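
    The probabilistic intake calculation has a simple core: for each simulated person, sample food consumption and contaminant concentration per food, multiply, sum across foods, and divide by body weight. The sketch below uses invented distributions for three food groups (the study used measured Polish consumption and concentration data), reporting the mean and 95th percentile exposure.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Illustrative inputs only: (mean intake g/day, lognormal params for ug/kg food)
foods = {
    "bread":        (180.0, np.log(120.0), 0.8),
    "fries_crisps": ( 25.0, np.log(500.0), 0.9),
    "coffee":       (  9.0, np.log(250.0), 0.5),
}
bw = rng.normal(70.0, 12.0, n).clip(30.0)        # body weight, kg

exposure = np.zeros(n)                           # ug/kg bw/day
for name, (g_day, mu_c, sd_c) in foods.items():
    intake = rng.gamma(shape=2.0, scale=g_day / 2.0, size=n)   # g/day
    conc = rng.lognormal(mu_c, sd_c, n)                        # ug/kg food
    exposure += intake / 1000.0 * conc / bw

print(f"mean exposure {exposure.mean():.3f} ug/kg bw/day")
print(f"95th percentile {np.percentile(exposure, 95):.3f} ug/kg bw/day")
```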

  20. Risk assessment considerations with regard to the potential impacts of pesticides on endangered species.

    PubMed

    Brain, Richard A; Teed, R Scott; Bang, JiSu; Thorbek, Pernille; Perine, Jeff; Peranginangin, Natalia; Kim, Myoungwoo; Valenti, Ted; Chen, Wenlin; Breton, Roger L; Rodney, Sara I; Moore, Dwayne R J

    2015-01-01

    Simple, deterministic screening-level assessments that are highly conservative by design facilitate a rapid initial screening to determine whether a pesticide active ingredient has the potential to adversely affect threatened or endangered species. If a worst-case estimate of pesticide exposure is below a very conservative effects metric (e.g., the no observed effects concentration of the most sensitive tested surrogate species), then the potential risks are considered de minimis and unlikely to jeopardize the existence of a threatened or endangered species. Thus, by design, such compounded layers of conservatism are intended to minimize potential Type II errors (failing to reject a false null hypothesis of de minimis risk), but they correspondingly increase Type I errors (falsely rejecting a true null hypothesis of de minimis risk). Because of the conservatism inherent in screening-level risk assessments, higher-tier scientific information and analyses that provide additional environmental realism can be applied in cases where a potential risk has been identified. This information includes community-level effects data, environmental fate and exposure data, monitoring data, geospatial location and proximity data, species biology data, and probabilistic exposure and population models. Given that the definition of "risk" includes likelihood and magnitude of effect, higher-tier risk assessments should use probabilistic techniques that more accurately and realistically characterize risk. Moreover, where possible and appropriate, risk assessments should focus on effects at the population and community levels of organization rather than the more traditional focus on the organism level. This document provides a review of some types of higher-tier data and assessment refinements available to more accurately and realistically evaluate potential risks of pesticide use to threatened and endangered species. © 2014 SETAC.

  1. Probabilistic risk assessment of Chinese residents' exposure to fluoride in improved drinking water in endemic fluorosis areas.

    PubMed

    Zhang, Li E; Huang, Daizheng; Yang, Jie; Wei, Xiao; Qin, Jian; Ou, Songfeng; Zhang, Zhiyong; Zou, Yunfeng

    2017-03-01

    Studies have yet to evaluate the effects of water improvement on fluoride concentrations in drinking water and the corresponding health risks to Chinese residents in endemic fluorosis areas (EFAs) at a national level. This paper summarized available data in the published literature (2008-2016) on water fluoride from the EFAs in China before and after water quality was improved. Based on these data, a health risk assessment of Chinese residents' exposure to fluoride in improved drinking water was performed by means of a probabilistic approach. The uncertainties in the risk estimates were quantified using Monte Carlo simulation and sensitivity analysis. Our results showed that, in general, the average fluoride levels (0.10-2.24 mg/L) in the improved drinking water in the EFAs of China were lower than the pre-intervention levels (0.30-15.24 mg/L). The highest fluoride levels were detected in North and Southwest China. The mean non-carcinogenic risks associated with consumption of the improved drinking water by Chinese residents were mostly acceptable (hazard quotient < 1), but the non-carcinogenic risk for children in most of the EFAs at the 95th percentile exceeded the safe level of 1, indicating potential non-cancer health effects in this fluoride-exposed population. Sensitivity analyses indicated that the fluoride concentration in drinking water, the ingestion rate of water, and the exposure time in the shower were the most relevant variables in the model; efforts should therefore focus mainly on the definition of their probability distributions for a more accurate risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.
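
    A compact version of such an assessment: run a Monte Carlo on the hazard quotient HQ = C x IR / (BW x RfD) and rank inputs by rank correlation with the output, a common sensitivity measure in this setting. All distributions and the reference dose below are illustrative assumptions, not the reviewed Chinese data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(6)
n = 100_000

# Illustrative distributions for a child receptor
c_f = rng.lognormal(np.log(0.8), 0.6, n)        # fluoride in water, mg/L
ir  = rng.lognormal(np.log(1.5), 0.4, n)        # water ingestion, L/day
bw  = rng.normal(25.0, 5.0, n).clip(10.0)       # body weight, kg
rfd = 0.06                                      # assumed reference dose, mg/kg/day

hq = c_f * ir / (bw * rfd)                      # hazard quotient
print(f"mean HQ {hq.mean():.2f}, 95th percentile {np.percentile(hq, 95):.2f}")
print(f"P(HQ > 1) = {np.mean(hq > 1):.3f}")

# sensitivity: which input drives the risk estimate most?
for name, v in [("concentration", c_f), ("ingestion rate", ir), ("body weight", bw)]:
    print(f"{name:14s} Spearman rho {spearmanr(v, hq)[0]:+.2f}")
```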

  2. Global scanning assessment of calcium channel blockers in the environment: Review and analysis of occurrence, ecotoxicology and hazards in aquatic systems.

    PubMed

    Saari, Gavin N; Scott, W Casan; Brooks, Bryan W

    2017-12-01

    As an urban water cycle is increasingly realized, aquatic systems are influenced by sewage and wastewater effluent discharges of variable quality. Such urbanization results in exposures of non-target aquatic organisms to medicines and other contaminants. In the present study, we performed a unique global hazard assessment of calcium channel blockers (CCBs) in multiple environmental matrices. Effluent and freshwater observations were primarily from North America (62% and 76%, respectively) and Europe (21% and 10%, respectively), with limited-to-no information from rapidly urbanizing regions of developing countries in Asia-Pacific, South America, and Africa. Only 9% and 18% of occurrence data were from influent sewage and marine systems, though developing countries routinely discharge poorly treated wastewater to heavily populated coastal regions. Probabilistic environmental exposure distribution (EED) 5th and 95th percentiles for all CCBs were 1.5 and 309.1 ng/L in influent, 5.0 and 448.7 ng/L in effluent, 1.3 and 202.3 ng/L in freshwater, and 0.17 and 12.9 ng/L in saltwater, respectively. Unfortunately, global hazards and risks of CCBs to non-target organisms remain poorly understood, particularly for sublethal exposures. Thus, therapeutic hazard values (THVs) were calculated and employed in probabilistic hazard assessments with EEDs when sufficient data were available. Amlodipine and verapamil in effluents and freshwater systems exceeded THVs 28% of the time, highlighting the need to understand the ecological consequences of these CCBs. This global scanning approach demonstrated the utility of global assessments to identify specific CCBs, chemical mixtures with common mechanisms of action, and geographic locations for which environmental assessment efforts appear warranted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Toxicokinetics/toxicodynamics of arsenic for farmed juvenile milkfish Chanos chanos and human consumption risk in BFD-endemic area of Taiwan.

    PubMed

    Chou, Berry Yun-Hua; Liao, Chung-Min; Lin, Ming-Chao; Cheng, Hsu-Hui

    2006-05-01

    This paper presents a toxicokinetic/toxicodynamic analysis to appraise arsenic (As) bioaccumulation in farmed juvenile milkfish Chanos chanos in the blackfoot disease (BFD)-endemic area of Taiwan, and probabilistic incremental lifetime cancer risk (ILCR) and hazard quotient (HQ) models are employed to assess the range of exposures for the fishers and non-fishers who eat the contaminated fish. We conducted a 7-day exposure experiment to obtain toxicokinetic parameters, and a simple critical body burden toxicity model was verified with LC50(t) data obtained from a 7-day acute toxicity bioassay. The acute toxicity bioassay indicates that the 96-h LC50 for juvenile milkfish exposed to As is 7.29 (95% CI: 3.10-10.47) mg/l. Our risk analysis for milkfish reared in the BFD-endemic area indicates a low likelihood that survival is being affected by waterborne As. Human risk analysis demonstrates that 90th-percentile exposure ILCRs for fishers in the BFD-endemic area are on the order of 10^-3, indicating a high potential carcinogenic risk, whereas there is no significant cancer risk for non-fishers (ILCRs around 10^-5). All predicted 90th percentiles of HQ are less than 1 for non-fishers, yet larger than 10 for fishers, which indicates a large contribution from farmed milkfish consumption. Sensitivity analysis indicates that to increase the accuracy of the results, efforts should focus on a better definition of the probability distributions for the milkfish daily consumption rate and the As level in milkfish. We show that theoretical human health risks from consuming As-contaminated milkfish in the BFD-endemic area are alarming under a conservative condition based on a probabilistic risk assessment model.

  4. A long journey from minimum inhibitory concentration testing to clinically predictive breakpoints: deterministic and probabilistic approaches in deriving breakpoints.

    PubMed

    Dalhoff, A; Ambrose, P G; Mouton, J W

    2009-08-01

    Since the origin of an 'International Collaborative Study on Antibiotic Sensitivity Testing' in 1971, considerable advancement has been made in standardizing clinical susceptibility testing procedures for antimicrobial agents. However, a consensus on the methods to be used and on interpretive criteria was not reached, so the results of susceptibility testing were discrepant. Recently, the European Committee on Antimicrobial Susceptibility Testing achieved a harmonization of existing methods for susceptibility testing and now co-ordinates the process for setting breakpoints. Previously, breakpoints were set by adjusting the mean pharmacokinetic parameters derived from healthy volunteers to the susceptibilities of a population of potential pathogens, expressed as the mean minimum inhibitory concentration (MIC) or the MIC90. Breakpoints derived by the deterministic approach tend to be too high, since this procedure does not take the variability of drug exposure and of susceptibility patterns into account. Therefore, first-step mutants or borderline-susceptible bacteria may be considered fully susceptible. As the drug exposure of such sub-populations is inadequate, resistance development will increase and eradication rates will decrease, resulting in clinical failure. The science of pharmacokinetics/pharmacodynamics integrates all possible drug exposures for standard dosage regimens and all MIC values likely to be found for clinical isolates into the breakpoint definitions. Ideally, the data sets used originate from patients suffering from the disease to be treated. Probability density functions for both the pharmacokinetic and microbiological variables are determined, and a large number of MIC/drug exposure scenarios are calculated. Therefore, this method is defined as the probabilistic approach. The breakpoints thus derived are lower than the ones defined deterministically, as the entire range of probable drug exposures, from low to high, is modeled. Therefore, the amplification of drug-resistant sub-populations will be reduced. It has been a long journey since the first attempts in 1971 to define breakpoints. Clearly, none of the various approaches is simply right or wrong; the different approaches reflect different philosophies and mirror the tremendous progress made in the understanding of the pharmacodynamic properties of different classes of antimicrobials.
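
    The probabilistic approach described here is commonly operationalized as a probability-of-target-attainment (PTA) calculation: sample patient drug exposures, then find the highest MIC at which a pharmacodynamic target (such as an AUC/MIC ratio) is attained with acceptable probability. The sketch below uses an invented AUC distribution, an assumed target, and a 90% attainment criterion; none of these values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Illustrative pharmacokinetic variability: daily AUC (mg*h/L) across patients
auc = rng.lognormal(np.log(40.0), 0.35, n)
target = 100.0                        # assumed AUC/MIC efficacy target

mics = [0.125, 0.25, 0.5, 1.0, 2.0, 4.0]
print("MIC (mg/L)  P(AUC/MIC >= target)")
for mic in mics:
    print(f"{mic:8.3f}    {np.mean(auc / mic >= target):.3f}")

# a probabilistic breakpoint: the highest MIC with PTA at or above 0.9
bp = max((m for m in mics if np.mean(auc / m >= target) >= 0.9), default=None)
print("probabilistic breakpoint:", bp, "mg/L")
```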

  5. Incorporating High-Throughput Exposure Predictions with ...

    EPA Pesticide Factsheets

    We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compared against regulatory exposure estimates, providing an activity-to-exposure ratio (AER) useful for a risk-based ranking strategy. As ToxCast™ efforts expand (i.e., Phase II) beyond food-use pesticides towards a wider chemical domain that lacks exposure and toxicity information, prediction tools become increasingly important. In this study, in vitro hepatic clearance and plasma protein binding were measured to estimate OEDs for a subset of Phase II chemicals. OEDs were compared against high-throughput (HT) exposure predictions generated using probabilistic modeling and Bayesian approaches developed by the U.S. EPA ExpoCast™ program. This approach incorporated chemical-specific use and national production volume data with biomonitoring data to inform the exposure predictions. The HT exposure modeling approach provided predictions for all Phase II chemicals assessed in this study, whereas estimates from regulatory sources were available for only 7% of chemicals. Of the 163 chemicals assessed in this study, three and 13 chemicals possessed AERs <1 and <100, respectively. Diverse bioactivity across a range of assays and concentrations was also noted across the wider chemical space...

  6. Improvements in Modelling Bystander and Resident Exposure to Pesticide Spray Drift: Investigations into New Approaches for Characterizing the 'Collection Efficiency' of the Human Body.

    PubMed

    Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R

    2018-05-28

    The Bystander and Resident Exposure Assessment Model (BREAM; Kennedy et al., 'BREAM: A probabilistic bystander and resident exposure assessment model of spray drift from an agricultural boom sprayer', Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary human exposure to plant protection products. The component of BREAM that relates airborne spray concentrations to bystander and resident dermal exposure has been reviewed to identify whether it is possible to improve it and its description of the variability captured in the model. Two approaches have been explored: a more rigorous statistical analysis of the empirical data, and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of the experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.

  7. [Exposure to whole body vibrations in workers moving heavy items by mechanical vehicles in the warehouse of a large retail outlet].

    PubMed

    Siciliano, E; Rossi, A; Nori, L

    2007-01-01

    Efficient warehouse management and item transportation are of fundamental importance in the commercial outlet examined. Whole-body vibrations have been measured in various types of machines, some of which have not yet been widely studied, such as the electrical pallet truck. In some tasks (fork-lift drivers) vibrations propagate through the driving seat, whereas in other tasks (electrical pallet trucks, stackers), operated in a standing posture, vibrations propagate through the lower limbs. Results are provided for homogeneous job tasks. Under particular conditions, the action level of the Italian national (and European) regulations on occupational exposure to WBV may be exceeded. The authors propose a simple probabilistic classification system for the risk of exposure to whole-body vibrations, based on the respective areas of the distribution that lie within the three risk classes.

  8. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first-order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first-ply failure in one region of the CCM at the high-altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.

  9. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
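
    The surrogate-plus-sampling pattern used here can be illustrated without the Bayesian least-squares SVM: fit a cheap response surface to a few evaluations of an expensive limit-state function, then run a large Monte Carlo on the surface. In the sketch below, a quadratic polynomial stands in for the LS-SVM, and a two-variable analytic limit state (scour depth and velocity, with invented distributions) stands in for the HEC-RAS-fed bridge model.

```python
import numpy as np

rng = np.random.default_rng(8)

def limit_state(x):
    """Hypothetical stand-in for an expensive hydraulic/structural evaluation:
    positive = safe, negative = failure."""
    scour, vel = x[:, 0], x[:, 1]
    return 6.0 - scour - 0.8 * vel - 0.05 * scour * vel

def features(x):
    s, v = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(s), s, v, s * v, s**2, v**2])

# small training design where the "expensive" model is actually evaluated
x_train = rng.uniform([0.0, 0.0], [6.0, 5.0], size=(30, 2))
beta, *_ = np.linalg.lstsq(features(x_train), limit_state(x_train), rcond=None)

# cheap Monte Carlo on the fitted response surface
x_mc = np.column_stack([rng.lognormal(np.log(2.0), 0.5, 200_000),   # scour, m
                        rng.lognormal(np.log(1.5), 0.4, 200_000)])  # velocity, m/s
pf = np.mean(features(x_mc) @ beta < 0.0)
print(f"estimated probability of failure {pf:.5f}")
```

    The design choice is the same one the abstract motivates: the surrogate absorbs the cost of the many samples an accurate failure-probability estimate requires.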

  10. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2011-01-01

    A methodology to compute the probabilistically combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a [0/±45/90]s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, it is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
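
    The multifactor-interaction relationship referred to here is commonly written in the following product form, where P is the current value of a material property, P_0 its reference value, A_i the current value of the i-th degrading factor (temperature, stress, cycles, and so on), A_iF its final (ultimate) value, A_i0 its reference value, and a_i an empirical exponent; treat this as a sketch of the usual form rather than the paper's own statement:

```latex
% Multifactor-interaction relationship (one common form; exponents a_i empirical)
\[
  \frac{P}{P_0} \;=\; \prod_{i=1}^{n}
    \left( \frac{A_{iF} - A_i}{A_{iF} - A_{i0}} \right)^{a_i}
\]
```

    Each factor contributes a term that drives the property ratio toward zero as its current value approaches the ultimate value, which is how the combined time, temperature, and stress degradation enters the fatigue-life simulation.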

  11. A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.

    PubMed

    Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C

    2018-05-03

    The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected within a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve, with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
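
    The contrast between classical and robust HCp estimation fits in a few lines: under a log-normal SSD, the classical HC5 uses the sample mean and standard deviation of the log endpoints, while a robust variant swaps in the median and the scaled MAD. The toy endpoint set below (with one deliberate outlier) is invented and is not the paper's data or its specific robust estimator; note how much less the robust estimate is distorted by the inflated variance.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical species-level toxicity endpoints (ug/L) with one outlier
tox = np.array([12.0, 18.0, 25.0, 31.0, 40.0, 55.0, 70.0, 90.0, 120.0, 2000.0])
logx = np.log10(tox)
p = 0.05                                  # estimate the HC5

# classical log-normal SSD: sample mean and std of the log10 concentrations
hc5_classic = 10 ** (logx.mean() + norm.ppf(p) * logx.std(ddof=1))

# robust counterpart: median and scaled MAD replace mean and std
mad = np.median(np.abs(logx - np.median(logx))) * 1.4826
hc5_robust = 10 ** (np.median(logx) + norm.ppf(p) * mad)

print(f"HC5 classical {hc5_classic:.1f} ug/L, robust {hc5_robust:.1f} ug/L")
```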

  12. Probabilistic Simulation for Combined Cycle Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) computed and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  13. Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A methodology to compute probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) computed and their significance in the reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.

  14. Probabilistic modeling approach for evaluating the compliance of ready-to-eat foods with new European Union safety criteria for Listeria monocytogenes.

    PubMed

    Koutsoumanis, Konstantinos; Angelidis, Apostolos S

    2007-08-01

    Among the new microbiological criteria that have been incorporated in EU Regulation 2073/2005, of particular interest are those concerning Listeria monocytogenes in ready-to-eat (RTE) foods, because for certain food categories they no longer require zero tolerance but rather specify a maximum allowable concentration of 100 CFU/g or ml. This study presents a probabilistic modeling approach for evaluating the compliance of RTE sliced meat products with the new safety criteria for L. monocytogenes. The approach was based on the combined use of (i) growth/no-growth boundary models, (ii) kinetic growth models, (iii) product characteristics data (pH, a(w), shelf life) collected from 160 meat products on the Hellenic retail market, and (iv) storage temperature data recorded in 50 retail stores in Greece. This study shows that probabilistic analysis of the above components using Monte Carlo simulation, which takes into account the variability of factors affecting microbial growth, can lead to a realistic estimation of the behavior of L. monocytogenes throughout the food supply chain, and the quantitative output generated can be used by food managers as a decision-making tool for the design or modification of a product's formulation or its 'use-by' date in order to ensure its compliance with the new safety criteria. The study also argues that compliance of RTE foods with the new safety criteria should not be considered a parameter with a discrete and binary outcome, because it depends on factors such as product characteristics, storage temperature, and initial contamination level, which display considerable variability even among different packages of the same RTE product. Rather, compliance should be expressed, and therefore regulated, in a more probabilistic fashion.
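
    A stripped-down version of the combined boundary-plus-kinetics approach: sample product characteristics, storage temperature, and shelf life; apply a crude growth/no-growth rule; grow the surviving cases with a Ratkowsky-type square-root model; and report the fraction exceeding 100 CFU/g (2 log CFU/g) at the end of shelf life. Every distribution and parameter below is an illustrative assumption, not the paper's survey data or its fitted models.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 100_000

# Illustrative product/storage inputs for one RTE product
n0 = rng.normal(-1.0, 0.5, n)                       # initial level, log10 CFU/g
ph = rng.normal(6.0, 0.3, n)                        # product pH
aw = rng.normal(0.96, 0.01, n)                      # water activity
temp = np.clip(rng.normal(5.5, 2.5, n), 0.0, 15.0)  # storage temperature, deg C
shelf = rng.uniform(10.0, 30.0, n)                  # shelf life, days

can_grow = (ph > 5.0) & (aw > 0.92)                 # crude growth/no-growth boundary
b, t_min = 0.015, -1.2                              # assumed square-root-model parameters
mu = (b * np.maximum(temp - t_min, 0.0)) ** 2       # growth rate, ln CFU/g per hour
growth = np.where(can_grow, mu * shelf * 24.0 / np.log(10.0), 0.0)

n_end = np.minimum(n0 + growth, 8.0)                # cap at a stationary-phase density
print(f"P(level at end of shelf life > 100 CFU/g) = {np.mean(n_end > 2.0):.3f}")
```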

  15. Design, development and validation of software for modelling dietary exposure to food chemicals and nutrients.

    PubMed

    McNamara, C; Naddy, B; Rohan, D; Sexton, J

    2003-10-01

    The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project. It is accessible as a Web-based application service. The system allows and supports very significant complexity in the data sets used as the model input, but provides a simple, general purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions at each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels to apply brand information such as market share and loyalty to the calculation of food and chemical intake.

  16. Distribution and health risk assessment of trace metals in freshwater tilapia from three different aquaculture sites in Jelebu Region (Malaysia).

    PubMed

    Low, Kah Hin; Zain, Sharifuddin Md; Abas, Mhd Radzi; Md Salleh, Kaharudin; Teo, Yin Yin

    2015-06-15

    The trace metal concentrations in the edible muscle of red tilapia (Oreochromis spp.) sampled from a former tin-mining pool, a concrete tank, and an earthen pond in Jelebu were analysed with microwave-assisted digestion and inductively coupled plasma-mass spectrometry. Results were compared with established legal limits, and daily ingestion exposures were simulated using the Monte Carlo algorithm to assess potential health risks. Among the metals investigated, arsenic was found to be the key contaminant, which may have arisen from the use of formulated feeding pellets. Although the risks of toxicity associated with consumption of red tilapia from the sites investigated were found to be within the tolerable range, the preliminary probabilistic estimation of As cancer risk shows that the 95th percentile risk level surpassed the benchmark level of 10^-5. In general, the probabilistic health risks associated with ingestion of red tilapia can be ranked as follows: former tin-mining pool > concrete tank > earthen pond. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels

    NASA Astrophysics Data System (ADS)

    Kurniasih, R.; Sujadi, I.

    2017-09-01

    This paper investigates students' levels of probabilistic thinking, that is, thinking about uncertain matters in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that before instruction, students' probabilistic thinking was at the subjective and transitional levels; after instruction, students' levels of probabilistic thinking changed. Based on the results, some 8th-grade students reached the numerical level, the highest level of probabilistic thinking. Students' levels of probabilistic thinking can be used as a reference for designing learning materials and strategies.

  18. Violence and child mental health in Brazil: The Itaboraí Youth Study methods and findings.

    PubMed

    Bordin, I A; Duarte, C S; Ribeiro, W S; Paula, C S; Coutinho, E S F; Sourander, A; Rønning, J A

    2018-06-01

    To demonstrate a study design that could be useful in low-resource and violent urban settings and to estimate the prevalence of child violence exposure (at home, community, and school) and child mental health problems in a low-income medium-size city. The Itaboraí Youth Study is a Norway-Brazil collaborative longitudinal study conducted in Itaboraí city (n = 1409, 6-15 year olds). A 3-stage probabilistic sampling plan (random selection of census units, eligible households, and target child) generated sampling weights that were used to obtain estimates of population prevalence rates. Study strengths include previous pilot study and focus groups (testing procedures and comprehension of questionnaire items), longitudinal design (2 assessment periods with a mean interval of 12.9 months), high response rate (>80%), use of standardized instruments, different informants (mother and adolescent), face-to-face interviews to avoid errors due to the high frequency of low-educated respondents, and information gathered on a variety of potential predictors and protective factors. Children and adolescents presented relevant levels of violence exposure and clinical mental health problems. Prevalence estimates are probably valid to other Brazilian low-income medium-size cities due to similarities in terms of precarious living conditions. Described study methods could be useful in other poor and violent world regions. Copyright © 2018 John Wiley & Sons, Ltd.

  19. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a test of mathematical ability for high ability. The subjects were given probability tasks covering sample space, probability of an event, and probability comparison. The data analysis consisted of categorization, reduction, interpretation, and conclusion. Credibility of the data was established by time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; thus the boy's level of probabilistic thinking was higher than the girl's. The results could help curriculum developers in developing probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  20. The cumulative MeHg and PCBs exposure and risk of tribal ...

    EPA Pesticide Factsheets

Studies have shown that the U.S. population continues to be exposed to methyl mercury (MeHg) and polychlorinated biphenyls (PCBs) due to the long half-lives of those environmental contaminants. Fish intake of Tribal populations is much higher than that of the U.S. general population due to dietary habits and unique cultural practices. Large fish tissue concentration data sets from the Environmental Protection Agency's (EPA's) Office of Water, USGS's EMMMA program, and other data sources were integrated, analyzed, and combined with recent tribal fish intake data for exposure analyses using the dietary module within EPA's SHEDS-Multimedia model. SHEDS-Multimedia is a physically-based, probabilistic model, which can simulate cumulative (multiple chemicals) or aggregate (single chemical) exposures over time for a population via various pathways of exposure for a variety of multimedia, multipathway environmental chemicals. Our results show that MeHg and total PCBs exposures of tribal populations from fish are about 3 to 10 and 5 to 15 times higher than those of the US general population, respectively, and that the estimated exposures pose potential health risks. The cumulative exposures of MeHg and total PCBs will be assessed to generate joint exposure profiles for Tribal and US general populations. Model sensitivity analyses will identify the important contributors to the cumulative exposures of MeHg and total PCBs, such as fish types, locations, and size, and key expos...
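
    SHEDS-Multimedia is EPA's own model; the sketch below only illustrates the core dietary dose calculation such a simulation performs (intake rate × tissue concentration / body weight), using invented lognormal inputs rather than the actual tribal survey or fish tissue data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # simulated person-days

# Illustrative lognormal inputs (not the actual survey/tissue data)
fish_intake_g = rng.lognormal(mean=np.log(150), sigma=0.6, size=n)  # g/day
mehg_conc = rng.lognormal(mean=np.log(0.2), sigma=0.8, size=n)      # µg MeHg per g fish
body_wt_kg = rng.normal(70, 12, size=n).clip(40, 120)               # kg

# Daily dose in µg MeHg per kg body weight per day
dose = fish_intake_g * mehg_conc / body_wt_kg

for q in (50, 95, 99):
    print(f"{q}th percentile dose: {np.percentile(dose, q):.3f} µg/kg/day")
```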

  1. Virulo

    EPA Science Inventory

    Virulo is a probabilistic model for predicting virus attenuation. Monte Carlo methods are used to generate ensemble simulations of virus attenuation due to physical, biological, and chemical factors. The model generates a probability of failure to achieve a chosen degree o...
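
    The general pattern described here, simulating attenuation under uncertain physical, biological, and chemical factors and reporting a probability of failing a chosen target, can be sketched as follows; the distributions are invented and do not reproduce Virulo's actual equations:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
target_log10_removal = 4.0  # chosen degree of attenuation (hypothetical)

# Invented contributions to log10 virus removal
physical = rng.normal(2.0, 0.5, n)    # e.g., filtration/straining
biological = rng.normal(1.5, 0.7, n)  # e.g., inactivation
chemical = rng.normal(1.0, 0.4, n)    # e.g., attachment

total = physical + biological + chemical
p_fail = np.mean(total < target_log10_removal)
print(f"P(failure to achieve {target_log10_removal} log10 removal) = {p_fail:.2%}")
```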

  2. Skin sensitisation quantitative risk assessment (QRA) based on aggregate dermal exposure to methylisothiazolinone in personal care and household cleaning products.

    PubMed

    Ezendam, J; Bokkers, B G H; Bil, W; Delmaar, J E

    2018-02-01

Contact allergy to preservatives is an important public health problem. Ideally, new substances should be evaluated for the risk of skin sensitisation before market entry, for example by using a quantitative risk assessment (QRA) as developed for fragrances. As a proof of concept, this QRA was applied to the preservative methylisothiazolinone (MI), a common cause of contact allergy. MI is used in different consumer products, including personal care products (PCPs) and household cleaning products (HCPs). Aggregate exposure to MI from PCPs and HCPs was therefore assessed with the Probabilistic Aggregated Consumer Exposure Model (PACEM). Two exposure scenarios were evaluated: scenario 1 calculated aggregate exposure based on actual MI product concentrations before the restricted use in PCPs, and scenario 2 calculated aggregate exposure using the restrictions for MI in PCPs. The QRA for MI showed that in scenarios 1 and 2, the proportion of the population at risk of skin sensitisation is 0.7% and 0.5%, respectively; the restricted use of MI in PCPs thus does not seem very effective in lowering the risk of skin sensitisation. In conclusion, it is important to take aggregate exposure from the most relevant consumer products into account in the risk assessment. Copyright © 2018 Elsevier Ltd. All rights reserved.
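
    The aggregation step PACEM performs, summing dermal loads over the products each simulated individual actually uses on a given day, can be sketched as follows; use probabilities, amounts, and MI concentrations here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000  # simulated individuals

# (use probability, mean amount applied g/day, MI concentration µg/g) - hypothetical
products = {
    "shampoo":     (0.9, 10.0, 50.0),
    "body lotion": (0.5,  8.0, 80.0),
    "dish soap":   (0.7,  2.0, 90.0),
}

aggregate_ug = np.zeros(n)  # daily dermal load of MI, µg/person
for name, (p_use, amount_g, conc_ug_per_g) in products.items():
    uses = rng.random(n) < p_use                      # who uses this product today
    amount = rng.lognormal(np.log(amount_g), 0.5, n)  # variability in amount applied
    aggregate_ug += uses * amount * conc_ug_per_g     # g/day * µg/g = µg/day

print(f"Median aggregate dose: {np.median(aggregate_ug):.0f} µg/day")
print(f"95th percentile:       {np.percentile(aggregate_ug, 95):.0f} µg/day")
```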

  3. A European model and case studies for aggregate exposure assessment of pesticides.

    PubMed

    Kennedy, Marc C; Glass, C Richard; Bokkers, Bas; Hart, Andy D M; Hamey, Paul Y; Kruisselbrink, Johannes W; de Boer, Waldo J; van der Voet, Hilko; Garthwaite, David G; van Klaveren, Jacob D

    2015-05-01

    Exposures to plant protection products (PPPs) are assessed using risk analysis methods to protect public health. Traditionally, single sources, such as food or individual occupational sources, have been addressed. In reality, individuals can be exposed simultaneously to multiple sources. Improved regulation therefore requires the development of new tools for estimating the population distribution of exposures aggregated within an individual. A new aggregate model is described, which allows individual users to include as much, or as little, information as is available or relevant for their particular scenario. Depending on the inputs provided by the user, the outputs can range from simple deterministic values through to probabilistic analyses including characterisations of variability and uncertainty. Exposures can be calculated for multiple compounds, routes and sources of exposure. The aggregate model links to the cumulative dietary exposure model developed in parallel and is implemented in the web-based software tool MCRA. Case studies are presented to illustrate the potential of this model, with inputs drawn from existing European data sources and models. These cover exposures to UK arable spray operators, Italian vineyard spray operators, Netherlands users of a consumer spray and UK bystanders/residents. The model could also be adapted to handle non-PPP compounds. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  4. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    NASA Technical Reports Server (NTRS)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  5. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models

    PubMed Central

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-01-01

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors. PMID:27775570

  6. Quantifying Uncertainties in the Thermo-Mechanical Properties of Particulate Reinforced Composites

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Murthy, Pappu L. N.

    1999-01-01

The present paper reports results from a computational simulation of probabilistic particulate-reinforced composite behavior. The approach consists of simplified micromechanics of particulate-reinforced composites coupled with a fast probability integration (FPI) technique. Sample results are presented for an Al/SiCp (silicon carbide particles in an aluminum matrix) composite. The probability density functions for composite moduli, thermal expansion coefficient and thermal conductivities, along with their sensitivity factors, are computed. The effects of different assumed distributions, and of reducing the scatter in constituent properties, on the thermal expansion coefficient are also evaluated. The variations in the constituent properties that directly affect these composite properties are accounted for by assumed probabilistic distributions. The results show that the present technique provides valuable information about the scatter in composite properties and the sensitivity factors, which is useful to test or design engineers.
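
    A plain Monte Carlo over a rule-of-mixtures relation illustrates the same idea of propagating constituent scatter into composite properties, though the paper itself uses fast probability integration rather than sampling; the constituent distributions below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Illustrative constituent properties for an Al/SiCp composite (GPa)
E_matrix = rng.normal(70.0, 3.5, n)             # aluminum matrix modulus
E_particle = rng.normal(410.0, 20.0, n)         # SiC particle modulus
vf = rng.normal(0.20, 0.02, n).clip(0.1, 0.3)   # particle volume fraction

# Upper-bound (Voigt) rule-of-mixtures estimate of the composite modulus
E_comp = vf * E_particle + (1.0 - vf) * E_matrix

print(f"Mean composite modulus: {E_comp.mean():.1f} GPa")
print(f"5th-95th percentiles (GPa): {np.percentile(E_comp, [5, 95])}")
```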

  7. Probabilistic durability assessment of concrete structures in marine environments: Reliability and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Yu, Bo; Ning, Chao-lie; Li, Bing

    2017-03-01

    A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
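
    A common form of the chloride-ingress limit state compares the chloride content at the rebar depth, from the error-function solution of Fick's second law, against a critical threshold. The sketch below checks that limit state by brute-force Monte Carlo rather than the paper's FORM/Nataf approach, with placeholder parameter values:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(5)
n = 200_000
t_years = 30.0

# Placeholder random inputs
Cs = rng.lognormal(np.log(3.0), 0.3, n)              # surface chloride, % binder wt
D28 = rng.lognormal(np.log(6e-12), 0.4, n)           # diffusion coeff at 28 d, m^2/s
age_m = rng.normal(0.4, 0.08, n).clip(0.2, 0.6)      # age factor
cover = rng.normal(0.05, 0.008, n).clip(0.02, None)  # cover depth, m
Ccrit = rng.normal(0.6, 0.1, n)                      # critical content, % binder wt

t = t_years * 365.25 * 24 * 3600   # exposure time, seconds
t28 = 28 * 24 * 3600
D = D28 * (t28 / t) ** age_m       # time-dependent diffusion coefficient

# Fick's 2nd law solution: C(x,t) = Cs * erfc(x / (2 sqrt(D t)))
C_at_rebar = Cs * erfc(cover / (2.0 * np.sqrt(D * t)))

pf = np.mean(C_at_rebar > Ccrit)   # probability of corrosion initiation
print(f"P(initiation within {t_years:.0f} years) = {pf:.3f}")
```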

  8. Dopamine neurons learn relative chosen value from probabilistic rewards

    PubMed Central

    Lak, Armin; Stauffer, William R; Schultz, Wolfram

    2016-01-01

Economic theories posit reward probability as one of the factors defining reward value. Individuals learn the value of cues that predict probabilistic rewards from experienced reward frequencies. Building on the notion that responses of dopamine neurons increase with reward probability and expected value, we asked how dopamine neurons in monkeys acquire this value signal that may represent an economic decision variable. We found in a Pavlovian learning task that reward probability-dependent value signals arose from experienced reward frequencies. We then assessed neuronal response acquisition during choices among probabilistic rewards. Here, dopamine responses became sensitive to the value of both chosen and unchosen options. Both experiments also showed novelty responses of dopamine neurons that decreased as learning advanced. These results show that dopamine neurons acquire predictive value signals from the frequency of experienced rewards. This flexible and fast signal reflects a specific decision variable and could update neuronal decision mechanisms. DOI: http://dx.doi.org/10.7554/eLife.18044.001 PMID:27787196

  9. Compression of Probabilistic XML Documents

    NASA Astrophysics Data System (ADS)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such uncertain database management systems (UDBMSs) can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.

  10. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    PubMed Central

    Moreno-Bote, Rubén

    2014-01-01

Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over a broad range of firing rates, from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705

  11. Optimization of Adaptive Intraply Hybrid Fiber Composites with Reliability Considerations

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1994-01-01

The reliability with bounded distribution parameters (mean, standard deviation) was maximized and the reliability-based cost was minimized for adaptive intra-ply hybrid fiber composites by using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and control-related parameters. Probabilistic sensitivity factors were computed and used in the optimization procedures. For actuated change in the angle of attack of an airfoil-like composite shell structure with an adaptive torque plate, the reliability was maximized to 0.9999 probability, with constraints on the mean and standard deviation of the actuation material volume ratio (percentage of actuation composite material in a ply) and the actuation strain coefficient. The reliability-based cost was minimized for an airfoil-like composite shell structure with an adaptive skin and a mean actuation material volume ratio as the design parameter. At a 0.9-mean actuation material volume ratio, the minimum cost was obtained.

  12. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  13. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated, ranging from the 5th to the 95th percentile, to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energy levels, for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of the GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment, represented by the deceleration potential (φ). Probabilistic assessment of fatal cancer risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
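
    A non-homogeneous Poisson process can be sampled by thinning: generate candidate events at the peak rate and accept each with probability λ(t)/λmax. A sketch with a purely illustrative solar-cycle intensity function, not the fitted historical model:

```python
import numpy as np

rng = np.random.default_rng(42)

def intensity(t_years):
    """Illustrative SPE rate (events/year) peaking mid solar cycle (~11 y)."""
    return 2.0 + 8.0 * np.sin(np.pi * (t_years % 11.0) / 11.0) ** 2

def sample_spe_times(mission_years, lam_max=10.0):
    """Thinning (Lewis-Shedler) sampler for a non-homogeneous Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)        # candidate from homogeneous PP
        if t > mission_years:
            return np.array(times)
        if rng.random() < intensity(t) / lam_max:  # accept with prob lambda(t)/lam_max
            times.append(t)

counts = [len(sample_spe_times(3.0)) for _ in range(10_000)]
print(f"Mean SPEs in a 3-year mission: {np.mean(counts):.1f}")
print(f"95th percentile: {np.percentile(counts, 95):.0f}")
```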

  14. Modelling of human exposure to air pollution in the urban environment: a GPS-based approach.

    PubMed

    Dias, Daniela; Tchepel, Oxana

    2014-03-01

The main objective of this work was the development of a new modelling tool for quantification of human exposure to traffic-related air pollution within distinct microenvironments, using a novel approach to trajectory analysis of individuals. For this purpose, mobile phones with Global Positioning System technology were used to collect daily trajectories of individuals with high temporal resolution, and a trajectory data mining and geo-spatial analysis algorithm was developed and implemented within a Geographical Information System to obtain time-activity patterns. These data were combined with air pollutant concentrations estimated for several microenvironments. In addition to outdoor concentrations, pollutant concentrations in distinct indoor microenvironments are characterised using a probabilistic approach. An example of the application for PM2.5 is presented and discussed. The results obtained for daily average individual exposure correspond to a mean value of 10.6 μg m(-3), with 5th-95th percentiles of 6.0-16.4 μg m(-3). Analysis of the results shows that the use of point air quality measurements for exposure assessment will not explain the intra- and inter-individual variability of exposure levels. The methodology developed and implemented in this work provides the time sequence of exposure events, thus making it possible to associate exposure with individual activities, and delivers the main statistics on an individual's air pollution exposure with high spatio-temporal resolution.
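
    Once a GPS trajectory has been segmented into microenvironments, individual exposure reduces to a time-weighted average of the concentrations encountered. A minimal version of that calculation with an invented one-day time-activity pattern:

```python
# Hypothetical one-day time-activity pattern derived from a GPS trajectory:
# (microenvironment, hours spent, PM2.5 concentration in µg/m3)
day = [
    ("home",     14.0,  8.0),
    ("commute",   1.5, 25.0),
    ("office",    8.0,  6.0),
    ("outdoors",  0.5, 18.0),
]

total_h = sum(hours for _, hours, _ in day)
exposure = sum(hours * conc for _, hours, conc in day) / total_h
print(f"Daily time-weighted PM2.5 exposure: {exposure:.1f} µg/m3 over {total_h:.0f} h")
```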

  15. The physiological kinetics of nitrogen and the prevention of decompression sickness.

    PubMed

    Doolette, D J; Mitchell, S J

    2001-01-01

Decompression sickness (DCS) is a potentially crippling disease caused by intracorporeal bubble formation during or after decompression from a compressed gas underwater dive. Bubbles most commonly evolve from dissolved inert gas accumulated during the exposure to increased ambient pressure. Most diving is performed breathing air, and the inert gas of interest is nitrogen. Divers use algorithms based on nitrogen kinetic models to plan the duration and degree of exposure to increased ambient pressure and to control their ascent rate. However, even correct execution of dives planned using such algorithms often results in bubble formation and may result in DCS. This reflects the importance of idiosyncratic host factors that are difficult to model, and deficiencies in current nitrogen kinetic models. Models describing the exchange of nitrogen between tissues and blood may be based on distributed capillary units or lumped compartments, either of which may be perfusion- or diffusion-limited. However, such simplistic models are usually poor predictors of experimental nitrogen kinetics at the organ or tissue level, probably because they fail to account for factors such as heterogeneity in both tissue composition and blood perfusion and non-capillary exchange mechanisms. The modelling of safe decompression procedures is further complicated by incomplete understanding of the processes that determine bubble formation. Moreover, any formation of bubbles during decompression alters subsequent nitrogen kinetics. Although these factors mandate complex resolutions to account for the interaction between dissolved nitrogen kinetics and bubble formation and growth, most decompression schedules are based on relatively simple perfusion-limited lumped compartment models of blood:tissue nitrogen exchange. Not surprisingly, all models inevitably require empirical adjustment based on outcomes in the field. Improvements in the predictive power of decompression calculations are being achieved using probabilistic bubble models, but divers will always be subject to the possibility of developing DCS despite adherence to prescribed limits.

  16. Standard penetration test-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    USGS Publications Warehouse

    Cetin, K.O.; Seed, R.B.; Der Kiureghian, A.; Tokimatsu, K.; Harder, L.F.; Kayen, R.E.; Moss, R.E.S.

    2004-01-01

This paper presents new correlations for assessment of the likelihood of initiation (or triggering) of soil liquefaction. These new correlations eliminate several sources of bias intrinsic to previous, similar correlations, and provide greatly reduced overall uncertainty and variance. Key elements in the development of these new correlations are (1) accumulation of a significantly expanded database of field performance case histories; (2) use of improved knowledge and understanding of factors affecting interpretation of standard penetration test data; (3) incorporation of improved understanding of factors affecting site-specific earthquake ground motions (including directivity effects, site-specific response, etc.); (4) use of improved methods for assessment of in situ cyclic shear stress ratio; (5) screening of field data case histories on a quality/uncertainty basis; and (6) use of high-order probabilistic tools (Bayesian updating). The resulting relationships not only provide greatly reduced uncertainty, they also help to resolve a number of corollary issues that have long been difficult and controversial, including (1) magnitude-correlated duration weighting factors, (2) adjustments for fines content, and (3) corrections for overburden stress. © ASCE.

  17. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  18. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  19. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Probabilistic health risk assessment for arsenic intake through drinking groundwater in Taiwan's Pingtung Plain

    NASA Astrophysics Data System (ADS)

    Liang, C. P.; Chen, J. S.

    2017-12-01

An abundant and inexpensive supply of groundwater is used to meet the drinking, agriculture and aquaculture requirements of residents of the Pingtung Plain. Long-term groundwater quality monitoring data indicate that the As content of groundwater in the Pingtung Plain exceeds the maximum level of 10 μg/L recommended by the World Health Organization (WHO). The situation is further complicated by the fact that only 46.89% of the population in the Pingtung Plain is served with tap water, far below the national average of 92.93%. Considering the considerable variation in measured concentrations, from below the detection limit (<0.1 μg/L) to a maximum of 544 μg/L, as well as individual variation in consumption rate and body weight, the conventional approach to conducting a human health risk assessment may be insufficient for health risk management. This study presents a probabilistic risk assessment for inorganic As intake through consumption of drinking groundwater by local residents of the Pingtung Plain. The probabilistic risk assessment is achieved using a Monte Carlo simulation technique based on the hazard quotient (HQ) and target cancer risk (TR) established by the U.S. Environmental Protection Agency. This study demonstrates the importance of individual variability in inorganic As intake through drinking groundwater consumption when evaluating a highly exposed sub-group of the population who drink high-As groundwater.
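
    The hazard quotient is the ratio of chronic daily intake to the reference dose, HQ = (C × IR) / (BW × RfD). A Monte Carlo sketch of that calculation follows; the oral RfD for inorganic As (0.0003 mg/kg/day) is the published U.S. EPA value, while all input distributions are placeholders rather than the Pingtung monitoring data:

```python
import numpy as np

rng = np.random.default_rng(2024)
n = 100_000
RFD = 0.0003  # U.S. EPA oral reference dose for inorganic As, mg/kg/day

# Placeholder input distributions
conc = rng.lognormal(np.log(0.010), 1.0, n)   # As in groundwater, mg/L
intake = rng.lognormal(np.log(2.0), 0.3, n)   # drinking rate, L/day
bw = rng.normal(60.0, 10.0, n).clip(30, 120)  # body weight, kg

hq = conc * intake / (bw * RFD)  # hazard quotient per simulated individual
print(f"Median HQ: {np.median(hq):.2f}")
print(f"P(HQ > 1): {np.mean(hq > 1):.1%}")
```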

  1. Development of a probabilistic PCB-bioaccumulation model for six fish species in the Hudson River

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stackelberg, K. von; Menzie, C.

    1995-12-31

In 1984 the US Environmental Protection Agency (USEPA) completed a Feasibility Study on the Hudson River that investigated remedial alternatives and issued a Record of Decision (ROD) later that year. In December 1989 USEPA decided to reassess the No Action decision for Hudson River sediments. This reassessment consists of three phases: Interim Characterization and Evaluation (Phase 1); Further Site Characterization and Analysis (Phase 2); and Feasibility Study (Phase 3). A Phase 1 report was completed in August 1991. The team then completed a Final Work Plan for Phase 2 in September 1992. This work plan identified various PCB fate and transport modeling activities to support the Hudson River PCB Reassessment Remedial Investigation and Feasibility Study (RI/FS). This talk describes the development of probabilistic bioaccumulation models for the uptake of PCBs on a congener-specific basis in six fish species. The authors have developed a framework for relating body burdens of PCBs in fish to exposure concentrations in Hudson River water and sediments. This framework is used to understand historical and current relationships as well as to predict fish body burdens for future conditions under specific remediation and no-action scenarios. The framework incorporates a probabilistic approach to predict distributions in PCB body burdens for selected fish species. These models can predict single population statistics, such as the average expected values of PCBs under specific scenarios, as well as the distribution of expected concentrations.

2. Probabilistic Model for Listeria monocytogenes Growth during Distribution, Retail Storage, and Domestic Storage of Pasteurized Milk

    PubMed Central

    Koutsoumanis, Konstantinos; Pavlis, Athanasios; Nychas, George-John E.; Xanthiakos, Konstantinos

    2010-01-01

    A survey on the time-temperature conditions of pasteurized milk in Greece during transportation to retail, retail storage, and domestic storage and handling was performed. The data derived from the survey were described with appropriate probability distributions and introduced into a growth model of Listeria monocytogenes in pasteurized milk which was appropriately modified for taking into account strain variability. Based on the above components, a probabilistic model was applied to evaluate the growth of L. monocytogenes during the chill chain of pasteurized milk using a Monte Carlo simulation. The model predicted that, in 44.8% of the milk cartons released in the market, the pathogen will grow until the time of consumption. For these products the estimated mean total growth of L. monocytogenes during transportation, retail storage, and domestic storage was 0.93 log CFU, with 95th and 99th percentiles of 2.68 and 4.01 log CFU, respectively. Although based on EU regulation 2073/2005 pasteurized milk produced in Greece belongs to the category of products that do not allow the growth of L. monocytogenes due to a shelf life (defined by law) of 5 days, the above results show that this shelf life limit cannot prevent L. monocytogenes from growing under the current chill chain conditions. The predicted percentage of milk cartons—initially contaminated with 1 cell/1-liter carton—in which the pathogen exceeds the safety criterion of 100 cells/ml at the time of consumption was 0.14%. The probabilistic model was used for an importance analysis of the chill chain factors, using rank order correlation, while selected intervention and shelf life increase scenarios were evaluated. The results showed that simple interventions, such as excluding the door shelf from the domestic storage of pasteurized milk, can effectively reduce the growth of the pathogen. The door shelf was found to be the warmest position in domestic refrigerators, and it was most frequently used by the consumers for domestic storage of pasteurized milk. Furthermore, the model predicted that a combination of this intervention with a decrease of the mean temperature of domestic refrigerators by 2°C may allow an extension of pasteurized milk shelf life from 5 to 7 days without affecting the current consumer exposure to L. monocytogenes. PMID:20139308
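
    Structurally, such a chill-chain model is a Monte Carlo sum of stage-wise growth, with each stage's growth rate driven by a sampled storage temperature and duration. A toy version using a square-root-type growth model and invented stage distributions, not the Greek survey data:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

def growth_log10(temp_c, hours, b=0.02, t_min=-1.2):
    """Square-root-type model: sqrt(mu) = b*(T - Tmin); returns growth in log10 CFU."""
    mu = np.maximum(b * (temp_c - t_min), 0.0) ** 2  # specific growth rate, 1/h
    return mu * hours / np.log(10)

total = np.zeros(n)
# (mean temp C, sd, mean duration h, sd) per stage - illustrative values
for t_mu, t_sd, d_mu, d_sd in [(4, 1, 12, 3),    # transport
                               (5, 2, 48, 24),   # retail storage
                               (7, 3, 60, 30)]:  # domestic storage
    temp = rng.normal(t_mu, t_sd, n)
    dur = rng.normal(d_mu, d_sd, n).clip(0, None)
    total += growth_log10(temp, dur)

print(f"Mean total growth: {total.mean():.2f} log10 CFU")
print(f"95th / 99th percentiles: {np.percentile(total, [95, 99])}")
```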

  3. Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2003-01-01

    The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.

  4. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  5. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.

  6. Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.

    1997-01-01

The properties of ceramic matrix composites (CMCs) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMCs. In this process two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique, are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte Carlo simulation technique, and the agreement between the two solutions is excellent, as shown via select examples.

  7. Speech Enhancement Using Gaussian Scale Mixture Models

    PubMed Central

    Hao, Jiucang; Lee, Te-Won; Sejnowski, Terrence J.

    2011-01-01

This paper presents a novel probabilistic approach to speech enhancement. Instead of a deterministic logarithmic relationship, we assume a probabilistic relationship between the frequency coefficients and the log-spectra. The speech model in the log-spectral domain is a Gaussian mixture model (GMM). The frequency coefficients obey a zero-mean Gaussian whose covariance equals the exponential of the log-spectra. This results in a Gaussian scale mixture model (GSMM) for the speech signal in the frequency domain, since the log-spectra can be regarded as scaling factors. The probabilistic relation between frequency coefficients and log-spectra allows these to be treated as two random variables, both to be estimated from the noisy signals. Expectation-maximization (EM) was used to train the GSMM, and Bayesian inference was used to compute the posterior signal distribution. Because exact inference of this full probabilistic model is computationally intractable, we developed two approaches to enhance the efficiency: the Laplace method and a variational approximation. The proposed methods were applied to enhance speech corrupted by Gaussian noise and speech-shaped noise (SSN). For both approximations, signals reconstructed from the estimated frequency coefficients provided a higher signal-to-noise ratio (SNR), and those reconstructed from the estimated log-spectra produced a lower word recognition error rate because the log-spectra fit the inputs to the recognizer better. Our algorithms effectively reduced the SSN, which algorithms based on spectral analysis were not able to suppress. PMID:21359139

  8. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  9. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
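
    The Noisy-OR combination itself is compact: the consensus prior for an edge is one minus the product of the per-source failure terms. A sketch with hypothetical per-source support and reliability values:

```python
import numpy as np

def noisy_or_prior(support, reliability):
    """P(edge) = 1 - prod_i (1 - r_i * s_i) over information sources i.

    support: per-source evidence for the interaction, in [0, 1]
    reliability: per-source trust in that evidence, in [0, 1]
    """
    support = np.asarray(support, dtype=float)
    reliability = np.asarray(reliability, dtype=float)
    return 1.0 - np.prod(1.0 - reliability * support)

# Hypothetical support for one candidate edge from three sources
# (pathway database, GO-term similarity, protein-domain data)
print(noisy_or_prior([0.9, 0.4, 0.0], [0.8, 0.6, 0.7]))  # -> 0.7872
```

    The strongest single source dominates, which matches the abstract's description of Noisy-OR picking up the strongest support among sources.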

  10. MONKEY: Identifying conserved transcription-factor binding sitesin multiple alignments using a binding site-specific evolutionarymodel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moses, Alan M.; Chiang, Derek Y.; Pollard, Daniel A.

    2004-10-28

    We introduce a method (MONKEY) to identify conserved transcription-factor binding sites in multispecies alignments. MONKEY employs probabilistic models of factor specificity and binding site evolution, on which basis we compute the likelihood that putative sites are conserved and assign statistical significance to each hit. Using genomes from the genus Saccharomyces, we illustrate how the significance of real sites increases with evolutionary distance and explore the relationship between conservation and function.

  11. Cumulative asbestos exposure for US automobile mechanics involved in brake repair (circa 1950s-2000).

    PubMed

    Finley, Brent L; Richter, Richard O; Mowat, Fionna S; Mlynarek, Steve; Paustenbach, Dennis J; Warmerdam, John M; Sheehan, Patrick J

    2007-11-01

We analyzed cumulative lifetime exposure to chrysotile asbestos experienced by brake mechanics in the US during the period 1950-2000. Using Monte Carlo methods, cumulative exposures were calculated using the distribution of 8-h time-weighted average exposure concentrations for brake mechanics and the distribution of job tenure data for automobile mechanics. The median estimated cumulative exposures for these mechanics, as predicted by three probabilistic models, ranged from 0.16 to 0.41 fiber-years per cubic centimeter (f/cm(3)-year) for facilities with no dust-control procedures (1970s), and from 0.010 to 0.012 f/cm(3)-year for those employing engineering controls (1980s). Upper-bound (95%) estimates for the 1970s and 1980s were 1.96-2.79 and 0.07-0.10 f/cm(3)-year, respectively. These estimates for US brake mechanics are consistent with, but generally slightly lower than, those reported for European mechanics. The values are all substantially lower than the cumulative exposure of 4.5 f/cm(3)-year associated with occupational exposure to 0.1 f/cm(3) of asbestos for 45 years that is permitted under current occupational exposure limits in the US. Cumulative exposures were usually about 100- to 1,000-fold less than those of other occupational groups with asbestos exposure over similar time periods. The cumulative lifetime exposure estimates presented here, combined with the negative epidemiology data for brake mechanics, could be used to refine risk assessments for chrysotile-exposed populations.
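
    The underlying calculation is the product of a sampled 8-h TWA concentration and a sampled job tenure. A minimal Monte Carlo of that product with invented lognormal inputs, not the published brake-mechanic data:

```python
import numpy as np

rng = np.random.default_rng(19)
n = 100_000

# Illustrative distributions (not the published exposure/tenure data)
twa = rng.lognormal(np.log(0.04), 1.0, n)    # 8-h TWA concentration, f/cm3
tenure = rng.lognormal(np.log(5.0), 0.9, n)  # job tenure, years

cumulative = twa * tenure  # cumulative exposure, f/cm3-year

print(f"Median cumulative exposure: {np.median(cumulative):.3f} f/cm3-year")
print(f"95th percentile: {np.percentile(cumulative, 95):.2f} f/cm3-year")
```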

  12. Epigenetic priors for identifying active transcription factor binding sites.

    PubMed

    Cuellar-Partida, Gabriel; Buske, Fabian A; McLeay, Robert C; Whitington, Tom; Noble, William Stafford; Bailey, Timothy L

    2012-01-01

Accurate knowledge of the genome-wide binding of transcription factors in a particular cell type or under a particular condition is necessary for understanding transcriptional regulation. Using epigenetic data, such as histone modification and DNase I accessibility data, has been shown to improve motif-based in silico methods for predicting such binding, but this approach has not yet been fully explored. We describe a probabilistic method for combining one or more tracks of epigenetic data with a standard DNA sequence motif model to improve our ability to identify active transcription factor binding sites (TFBSs). We convert each data type into a position-specific probabilistic prior and combine these priors with a traditional probabilistic motif model to compute a log-posterior odds score. Our experiments, using histone modifications H3K4me1, H3K4me3, H3K9ac and H3K27ac, as well as DNase I sensitivity, show conclusively that the log-posterior odds score consistently outperforms a simple binary filter based on the same data. We also show that our approach performs competitively with a more complex method, CENTIPEDE, and suggest that the relative simplicity of the log-posterior odds scoring method makes it an appealing and very general method for identifying functional TFBSs on the basis of DNA and epigenetic evidence. FIMO, part of the MEME Suite software toolkit, now supports log-posterior odds scoring using position-specific priors for motif search. A web server and source code are available at http://meme.nbcr.net. Utilities for creating priors are at http://research.imb.uq.edu.au/t.bailey/SD/Cuellar2011. Contact: t.bailey@uq.edu.au. Supplementary data are available at Bioinformatics online.

  13. Use of a probabilistic PBPK/PD model to calculate Data Derived Extrapolation Factors for chlorpyrifos.

    PubMed

    Poet, Torka S; Timchalk, Charles; Bartels, Michael J; Smith, Jordan N; McDougal, Robin; Juberg, Daland R; Price, Paul S

    2017-06-01

A physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model combined with Monte Carlo analysis of inter-individual variation was used to assess the effects of the insecticide chlorpyrifos and its active metabolite, chlorpyrifos oxon, in humans. The PBPK/PD model has previously been validated and used to describe physiological changes in typical individuals as they grow from birth to adulthood. This model was updated to include physiological and metabolic changes that occur with pregnancy. The model was then used to assess the impact of inter-individual variability in physiology and biochemistry on predictions of internal dose metrics, and to quantitatively assess the impact of major sources of parameter uncertainty and biological diversity on the pharmacodynamics of red blood cell acetylcholinesterase inhibition. These metrics were determined in potentially sensitive populations of infants, adult women, pregnant women, and a combined population of adult men and women. The parameters primarily responsible for inter-individual variation in RBC acetylcholinesterase inhibition were related to metabolic clearance of CPF and CPF-oxon. Data Derived Extrapolation Factors (DDEFs), which use quantitative differences in these metrics to replace default intra-species uncertainty factors, were developed for these same populations. The DDEFs were less than 4 for all populations. These data and this modeling approach will be useful in ongoing and future human health risk assessments for CPF and could be used for other chemicals with potential human exposure. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Flood model for Brazil

    NASA Astrophysics Data System (ADS)

    Palán, Ladislav; Punčochář, Petr

    2017-04-01

Looking at the impact of flooding from a worldwide perspective, in the last 50 years flooding has caused over 460,000 fatalities and serious material damage. The combined economic loss from the ten costliest flood events of the same period exceeds 300bn USD in present value. Locally, in Brazil, flood is the most damaging natural peril, with an alarming increase in event frequency: 5 of the 10 biggest flood losses ever recorded there occurred after 2009. The amount of economic and insured losses caused by various flood types was the key driver of the local probabilistic flood model development. Considering the area of Brazil (the fifth biggest country in the world) and the scattered distribution of insured exposure, the domain covered by the model was limited to the entire state of São Paulo and 53 additional regions. The model quantifies losses on approx. 90% of the exposure (for regular property lines) of key insurers. Based on detailed exposure analysis, Impact Forecasting developed this tool using long-term local hydrological data series (Agencia Nacional de Aguas) from riverine gauge stations and a digital elevation model (Instituto Brasileiro de Geografia e Estatística). To represent local hydrological behaviour as accurately as probabilistic simulation requires, hydrological data processing focused on frequency analyses of seasonal peak flows, done by fitting appropriate extreme-value statistical distributions, and on stochastic event set generation, consisting of synthetically derived flood events that respect the realistic spatial and frequency patterns visible in the entire period of hydrological observation. Data were tested for homogeneity, consistency and significant breakpoints in the time series, so that either the entire observation record or only subperiods were used for further analysis. The realistic spatial patterns of stochastic events are reproduced through the innovative use of a d-vine copula scheme to generate the probabilistic flood event set. The derived design flows for selected rivers inside the model domain were used as input for 2-dimensional hydrodynamic inundation modelling (using the tool TUFLOW by BMT WBM) on a mesh of 30 x 30 metres. Outputs from the inundation modelling and the stochastic event set were implemented in Aon Benfield's platform ELEMENTS, developed and managed internally by Impact Forecasting, Aon Benfield's internal catastrophe model development center. The model was designed to evaluate the potential financial impact of fluvial flooding on portfolios of insurance and/or reinsurance companies. The structure of the presented model follows the typical scheme of a financial loss catastrophe model and combines hazard with exposure and vulnerability to produce potential financial loss, expressed as a loss exceedance probability curve and many other insured perspectives, such as average annual loss and event or quantile loss tables. The model can take financial inputs as well as provide a split of results for an exact specified location or related higher administrative units: municipalities and 5-digit postal codes.
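
    The end product of such a model is the loss exceedance probability curve: simulate many years of portfolio losses, sort them, and read off losses at chosen return periods. A minimal sketch on synthetic annual losses:

```python
import numpy as np

rng = np.random.default_rng(99)
years = 10_000  # simulated years of portfolio experience

# Synthetic heavy-tailed annual losses (purely illustrative), in MUSD
annual_loss = rng.pareto(2.5, years) * 20.0

losses = np.sort(annual_loss)[::-1]  # largest first
print(f"Average annual loss: {annual_loss.mean():.1f} MUSD")
for rp in (10, 100, 250):            # return periods in years
    k = years // rp                  # k-th largest loss has P(exceed) ~ 1/rp
    print(f"1-in-{rp:>3} year loss: {losses[k - 1]:.0f} MUSD")
```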

  15. Probabilistic biological network alignment.

    PubMed

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant, at the expense of explosive growth in the number of alternative topologies that may arise from the different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method by showing that, without sacrificing running-time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature, as well as under the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
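
    The expectation-based similarity idea can be made concrete with a toy computation: when one network's interactions carry probabilities, the expected number of conserved edges under a fixed alignment is, by linearity of expectation, just a sum of edge probabilities, with no enumeration of the exponentially many deterministic topologies. The data and alignment below are invented, and this is a simplification of the paper's method.

```python
# Deterministic network: set of undirected (protein, protein) edges.
net1_edges = {("a", "b"), ("b", "c"), ("a", "c")}

# Probabilistic network: edge -> probability the interaction occurs.
net2_probs = {("x", "y"): 0.9, ("y", "z"): 0.4, ("x", "z"): 0.7}

# A candidate alignment mapping net1 proteins onto net2 proteins.
alignment = {"a": "x", "b": "y", "c": "z"}

# Expected conserved edges: for each net1 edge, add the probability
# of its image edge in net2 (zero if the image pair never interacts).
expected_conserved = 0.0
for u, v in net1_edges:
    image = tuple(sorted((alignment[u], alignment[v])))
    expected_conserved += net2_probs.get(image, 0.0)

print(f"Expected conserved edges: {expected_conserved:.2f}")
```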

  16. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple pieces of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method using data collected in an audio-visual localization task with human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
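
    A minimal sketch of the mutual-information screening that guides the elicitation step: a plug-in estimate of I(X;Y) from binned paired samples, distinguishing a dependent pair from an independent one. The data are random placeholders, not the audio-visual localization measurements.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
audio = rng.normal(size=5000)
visual = audio + rng.normal(scale=0.5, size=5000)  # dependent pair
noise = rng.normal(size=5000)                      # independent pair

print(f"I(audio; visual) = {mutual_information(audio, visual):.2f} bits")
print(f"I(audio; noise)  = {mutual_information(audio, noise):.2f} bits")
```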

  17. Recommendation based on trust diffusion model.

    PubMed

    Yuan, Jinfeng; Li, Li

    2014-01-01

    Recommender systems are emerging as a powerful and popular tool for delivering online information relevant to a given user. Traditional recommendation systems suffer from the cold-start problem and the data-sparsity problem. Many methods have been proposed to solve these problems, but few achieve satisfactory efficiency. In this paper, we present a method which combines the trust diffusion (DiffTrust) algorithm and probabilistic matrix factorization (PMF). DiffTrust is first used to study the possible diffusion of trust between various users. It is able to make use of the implicit relationships of the trust network, thus alleviating the data-sparsity problem. Probabilistic matrix factorization (PMF) is then employed to combine the users' tastes with their trusted friends' interests. We evaluate the algorithm on the Flixster, Moviedata, and Epinions datasets, respectively. The experimental results show that recommendation based on our proposed DiffTrust + PMF model achieves high performance in terms of root mean square error (RMSE), Recall, and F-measure.
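
    A bare-bones sketch of the PMF half of the model: learning latent user and item factors by stochastic gradient descent on observed ratings, which is the MAP estimate under a Gaussian likelihood and Gaussian priors on the factors. The trust-diffusion half is omitted; in the combined model, trusted friends' factors would additionally constrain each user's factors. All data here are invented.

```python
import numpy as np

# (user, item, rating) triples; tiny invented example.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0), (2, 2, 4.0)]
n_users, n_items, k = 3, 3, 4
lr, reg, epochs = 0.05, 0.02, 200

rng = np.random.default_rng(7)
U = 0.1 * rng.standard_normal((n_users, k))  # user latent factors
V = 0.1 * rng.standard_normal((n_items, k))  # item latent factors

# SGD on sum (r - u.v)^2 + reg * (|u|^2 + |v|^2), the MAP objective
# of probabilistic matrix factorization.
for _ in range(epochs):
    for u, i, r in ratings:
        err = r - U[u] @ V[i]
        u_old = U[u].copy()
        U[u] += lr * (err * V[i] - reg * U[u])
        V[i] += lr * (err * u_old - reg * V[i])

rmse = np.sqrt(np.mean([(r - U[u] @ V[i]) ** 2 for u, i, r in ratings]))
print(f"training RMSE: {rmse:.3f}")
```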

  19. Transport and fate of radionuclides in aquatic environments--the use of ecosystem modelling for exposure assessments of nuclear facilities.

    PubMed

    Kumblad, L; Kautsky, U; Naeslund, B

    2006-01-01

    In safety assessments of nuclear facilities, a wide range of radioactive isotopes and their potential hazard to a large assortment of organisms and ecosystem types over long time scales need to be considered. Models used for these purposes have typically employed approaches based on generic reference organisms, stylised environments and transfer functions for biological uptake based exclusively on bioconcentration factors (BCFs). These models are non-mechanistic and embody no understanding of uptake and transport processes in the environment, which is a severe limitation when assessing real ecosystems. In this paper, ecosystem models are suggested as a method to include site-specific data and to facilitate the modelling of dynamic systems. An aquatic ecosystem model for the environmental transport of radionuclides is presented and discussed. With this model, driven and constrained by site-specific carbon dynamics and three radionuclide-specific mechanisms ((i) radionuclide uptake by plants, (ii) excretion by animals, and (iii) adsorption to organic surfaces), it was possible to estimate the radionuclide concentrations in all components of the modelled ecosystem with only two radionuclide-specific input parameters (the BCF for plants and Kd). The importance of radionuclide-specific mechanisms for the exposure of organisms was examined, and probabilistic and sensitivity analyses were performed to assess the uncertainties related to ecosystem input parameters. Verification of the model suggests that it produces results analogous to empirically derived data for more than 20 different radionuclides.
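
    To illustrate the style of mass-balance bookkeeping such an ecosystem model performs, the sketch below integrates a toy water-plant-animal chain in which the plant concentration follows the water via a BCF and the animal compartment has first-order uptake and excretion. All rate constants are invented; the published model is carbon-flow driven and far more detailed.

```python
import numpy as np

# Invented parameters for a toy water -> plant -> animal chain.
bcf_plant = 50.0   # plant/water bioconcentration factor (L/kg)
k_uptake = 0.10    # animal uptake rate from diet (1/day)
k_excrete = 0.05   # animal excretion rate (1/day)
c_water = 1.0      # radionuclide in water (Bq/L), held constant

dt, days = 0.1, 365.0
steps = int(days / dt)

c_plant = bcf_plant * c_water        # fast equilibrium via BCF (Bq/kg)
c_animal = np.zeros(steps)           # dynamic compartment (Bq/kg)

# Forward Euler on dC_animal/dt = k_uptake*C_plant - k_excrete*C_animal.
for i in range(1, steps):
    dC = k_uptake * c_plant - k_excrete * c_animal[i - 1]
    c_animal[i] = c_animal[i - 1] + dC * dt

print(f"animal conc. at 1 year: {c_animal[-1]:.1f} Bq/kg "
      f"(steady state {k_uptake * c_plant / k_excrete:.1f})")
```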

  20. Challenges in assessing the health risks of consuming vegetables in metal-contaminated environments.

    PubMed

    Augustsson, Anna; Uddh-Söderberg, Terese; Filipsson, Monika; Helmfrid, Ingela; Berglund, Marika; Karlsson, Helen; Hogmalm, Johan; Karlsson, Andreas; Alriksson, Stina

    2018-04-01

    A great deal of research has been devoted to the characterization of metal exposure due to the consumption of vegetables from urban or industrialized areas. It may seem comforting that concentrations in crops, as well as estimated exposure levels, are often found to be below permissible limits. However, we show that even a moderate increase in metal accumulation in crops may result in a significant increase in exposure. We also highlight the importance of assessing exposure levels in relation to a regional baseline. We have analyzed metal (Pb, Cd, As) concentrations in nearly 700 samples from 23 different vegetables, fruits, berries and mushrooms, collected near 21 highly contaminated industrial sites and from reference sites. Metal concentrations generally complied with permissible levels in commercial food, and only Pb showed overall higher concentrations around the contaminated sites. Nevertheless, probabilistic exposure assessments revealed that exposure to all three metals was significantly higher in the population residing around the contaminated sites, for low, median and high consumers alike. The exposure was about twice as high for Pb and Cd, and four to six times as high for As. Since vegetable consumption alone did not result in exposure above tolerable intakes, it would have been easy to conclude that there is no risk associated with consuming vegetables grown near the contaminated sites. However, when the increase in exposure is quantified, its potential significance is harder to dismiss, especially considering that exposure via other routes may be elevated in a similar way. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system, which also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads using load information supplied by the CLS knowledge base.

  2. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence or in failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on patient risk. We first develop a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance-shaping factors. These factors are described here as the "state of the anesthesiologist", characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and compute the risk-reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
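
    The backbone of such a PRA model can be written compactly: the per-operation probability of a severe outcome is a sum over accident types of the triggering probability times the probability that timely recovery fails, both conditioned on the anesthesiologist's state. All numbers below are invented placeholders, not the study's estimates.

```python
# States of the anesthesiologist with illustrative prevalence.
states = {"alert_competent": 0.90, "impaired": 0.10}

# P(trigger accident type | state), per operation (invented values).
p_trigger = {
    "alert_competent": {"airway": 1e-4, "drug_error": 2e-4},
    "impaired":        {"airway": 8e-4, "drug_error": 1e-3},
}

# P(timely recovery fails | state) (invented values).
p_no_recovery = {"alert_competent": 0.05, "impaired": 0.25}

# Total risk = sum over states and accident types of
# P(state) * P(trigger | state) * P(recovery fails | state).
risk = sum(
    p_state * p_acc * p_no_recovery[state]
    for state, p_state in states.items()
    for p_acc in p_trigger[state].values()
)
print(f"per-operation risk of severe outcome: {risk:.2e}")
```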

  3. The probabilistic nature of preferential choice.

    PubMed

    Rieskamp, Jörg

    2008-11-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies across situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. Testing the theories against data from three experimental studies makes the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk evident. When the probabilistic theories are tested against each other, decision field theory provides the best account of the observed behavior.
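
    The deterministic/probabilistic contrast can be made concrete with a logit (softmax) choice rule, a standard way of making a utility theory probabilistic: the probability of choosing option A over B grows smoothly with the utility difference instead of jumping from 0 to 1. The utilities and sensitivity parameter below are arbitrary.

```python
import math

def p_choose_a(u_a: float, u_b: float, theta: float = 1.0) -> float:
    """Logit choice rule: P(A) = 1 / (1 + exp(-theta * (u_a - u_b)))."""
    return 1.0 / (1.0 + math.exp(-theta * (u_a - u_b)))

# Nearly identical utilities: a deterministic theory predicts the same
# choice on every trial; the probabilistic version predicts the
# inconsistency rate via the sensitivity parameter theta.
print(f"P(A) for u_A=1.00, u_B=0.95: {p_choose_a(1.00, 0.95, theta=5):.2f}")
print(f"P(A) for u_A=2.00, u_B=0.50: {p_choose_a(2.00, 0.50, theta=5):.2f}")
```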

  4. Probabilistic framework for assessing the arsenic exposure risk from cooked fish consumption.

    PubMed

    Ling, Min-Pei; Wu, Chiu-Hua; Chen, Szu-Chieh; Chen, Wei-Yu; Chio, Chia-Pin; Cheng, Yi-Hsien; Liao, Chung-Min

    2014-12-01

    Geogenic arsenic (As) contamination of groundwater is a major ecological and human health problem in the southwestern and northeastern coastal areas of Taiwan. Here, we present a probabilistic framework for assessing the human health risks of consuming raw and cooked fish cultured in groundwater As-contaminated ponds in Taiwan, by linking a physiologically based pharmacokinetic model with a Weibull dose-response model. Results indicate that As levels in baked, fried, and grilled fish were higher than those of raw fish. Frying resulted in the greatest increase in As concentration, followed by grilling, with baking affecting the As concentration the least. Simulation results show that, following consumption of baked As-contaminated fish, the health risk to humans is below the 10^-6 excess bladder cancer risk level for lifetime exposure; as incidence ratios of liver and lung cancers are generally acceptable in the risk range of 10^-6 to 10^-4, the consumption of baked As-contaminated fish is unlikely to pose a significant risk to human health. However, contaminated fish cooked by frying resulted in significant health risks, showing the highest cumulative incidence ratios of liver cancer. We also show that males have a higher cumulative incidence ratio of liver cancer than females. We found that although cooking increased As levels in As-contaminated fish, the risk to human health of consuming baked fish is nevertheless acceptable. We suggest the adoption of baking as a cooking method and warn against frying As-contaminated fish. We conclude that the concentration of contaminants after cooking should be taken into consideration when assessing the risk to human health.
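
    The dose-response end of such a framework is compact enough to sketch: a Weibull model maps a Monte Carlo distribution of internal dose to lifetime cancer risk. Both the dose distribution and the Weibull parameters below are invented placeholders, not the study's fitted values.

```python
import numpy as np

def weibull_risk(dose, b, a):
    """Weibull dose-response: lifetime risk = 1 - exp(-b * dose**a)."""
    return 1.0 - np.exp(-b * np.power(dose, a))

rng = np.random.default_rng(3)

# Stand-in for a PBPK-derived internal dose distribution (mg/kg/day)
# after a given cooking method; parameters are illustrative only.
doses = rng.lognormal(mean=np.log(1e-4), sigma=0.6, size=100_000)

risks = weibull_risk(doses, b=0.02, a=1.1)
print(f"median risk:     {np.median(risks):.2e}")
print(f"95th pct risk:   {np.percentile(risks, 95):.2e}")
print(f"fraction > 1e-6: {(risks > 1e-6).mean():.2%}")
```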

  5. Developing a Malaysia flood model

    NASA Astrophysics Data System (ADS)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers need models to help them assess their exposure to flood losses. The need for improved management of flood risk has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30 m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return-period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been validated against historic flood events in Malaysia and is now being used successfully by insurance companies in the Malaysian market to obtain reinsurance cover.
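
    The loss exceedance curve output mentioned above can be derived from a stochastic event set in a few lines: sample event counts per simulated year from Poisson rates, sum sampled severities, and read off the losses exceeded at chosen return periods. The event table is invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy stochastic event set: (annual rate, mean loss) per event class.
events = [(0.50, 2e6), (0.10, 2e7), (0.02, 1e8), (0.005, 4e8)]

n_years = 100_000
annual_loss = np.zeros(n_years)
for rate, mean_loss in events:
    counts = rng.poisson(rate, size=n_years)
    for year in np.nonzero(counts)[0]:
        # Lognormal severity around the class mean (invented spread).
        annual_loss[year] += rng.lognormal(
            np.log(mean_loss), 0.5, size=counts[year]).sum()

# Loss exceeded with annual probability 1/RP, plus the average
# annual loss, the two headline outputs of such models.
for rp in (10, 50, 100, 500):
    loss = np.percentile(annual_loss, 100.0 * (1.0 - 1.0 / rp))
    print(f"{rp:4d}-year loss: {loss / 1e6:10.1f} M")
print(f"average annual loss: {annual_loss.mean() / 1e6:.1f} M")
```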

  6. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  7. Assessing State Nuclear Weapons Proliferation: Using Bayesian Network Analysis of Social Factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Garill A.; Brothers, Alan J.; Olson, Jarrod

    A Bayesian network (BN) model of social factors can support proliferation assessments by estimating the likelihood that a state will pursue a nuclear weapon. Social factors, including political, economic, nuclear capability, security, and national identity and psychology factors, may play as important a role in whether a state pursues nuclear weapons as more physical factors. This paper shows how Bayesian reasoning applied to a generic case of a would-be proliferator state can be used to combine evidence that supports proliferation assessment. Theories and analysis by political scientists can be leveraged in a quantitative and transparent way to indicate proliferation risk. BN models facilitate diagnosis and inference in a probabilistic environment by using a network of nodes and acyclic directed arcs, where the presence or absence of a connection indicates probabilistic relevance or independence. We propose a BN model that would use information from both traditional safeguards and the strengthened safeguards associated with the Additional Protocol to indicate countries with a high risk of proliferating nuclear weapons. This model could be used in a variety of applications, such as a prioritization tool and as a component of state safeguards evaluations. This paper discusses the benefits of BN reasoning, the development of Pacific Northwest National Laboratory's (PNNL) BN state proliferation model and how it could be employed as an analytical tool.
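
    The inference pattern of such a BN can be shown with a naive-Bayes toy version: a binary "pursues weapon" hypothesis node with social-factor indicators treated as conditionally independent evidence. This is a drastic simplification of the PNNL model, and every probability below is invented.

```python
# Prior that a generic state pursues a nuclear weapon (invented).
prior = 0.05

# For each observed indicator: (P(indicator | pursues),
#                               P(indicator | does not pursue)).
evidence = {
    "security_threat_high":  (0.80, 0.30),
    "treaty_compliance_low": (0.60, 0.10),
    "economy_strained":      (0.50, 0.40),
}

# Bayes update assuming conditional independence of indicators
# given the hypothesis (the naive-Bayes simplification).
num = prior
den = 1.0 - prior
for p_given_h, p_given_not_h in evidence.values():
    num *= p_given_h
    den *= p_given_not_h

posterior = num / (num + den)
print(f"P(pursues | evidence) = {posterior:.2f}")
```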

  8. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale, and the results are expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers is the research area where future effort is most needed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Comparative risk assessment of the first-generation anticoagulant rodenticide diphacinone to raptors

    USGS Publications Warehouse

    Rattner, Barnett A.; Lazarus, Rebecca S.; Eisenreich, Karen M.; Horak, Katherine E.; Volker, Steven F.; Campton, Christopher M.; Eisemann, John D.; Meteyer, Carol U.; Johnson, John J.

    2012-01-01

    New regulatory restrictions have been placed on the use of some second-generation anticoagulant rodenticides in the United States, and in some situations this action may be offset by expanded use of first-generation compounds. We have recently conducted several studies with captive adult American kestrels and eastern screech-owls examining the toxicity of diphacinone (DPN) using both acute oral and short-term dietary exposure regimens. Diphacinone evoked overt signs of intoxication and lethality in these raptors at exposure doses that were 20 to 30 times lower than reported for traditionally used wildlife test species (mallard and northern bobwhite). Sublethal exposure of kestrels and owls resulted in prolonged clotting time, reduced hematocrit, and/or gross and histological evidence of hemorrhage at daily doses as low as 0.16 mg DPN/kg body weight. Findings also demonstrated that DPN was far more potent in short-term 7-day dietary studies than in single-day acute oral exposure studies. Incorporating these kestrel and owl data into deterministic and probabilistic risk assessments indicated that the risks associated with DPN exposure for raptors are far greater than predicted in analyses using data from mallards and bobwhite. These findings can assist natural resource managers in weighing the costs and benefits of anticoagulant rodenticide use in pest control and eradication programs.

  10. Modeling Flight Attendants’ Exposures to Pesticide in Disinsected Aircraft Cabins

    PubMed Central

    Zhang, Yong; Isukapalli, Sastry; Georgopoulos, Panos; Weisel, Clifford

    2014-01-01

    Aircraft cabin disinsection is required by some countries to kill insects that may pose risks to public health and native ecological systems. A probabilistic model has been developed, considering the microenvironmental dynamics of the pesticide in conjunction with the activity patterns of flight attendants, to assess their exposures and risks from pesticide in disinsected aircraft cabins under three scenarios of pesticide application. The main processes considered in the model are microenvironmental transport and deposition, volatilization, and transfer of pesticide when passengers and flight attendants come into contact with cabin surfaces. The simulated airborne pesticide mass concentrations and surface mass loadings captured the measured ranges reported in the literature. The medians (means ± standard deviations) of daily total exposure intakes were 0.24 (3.8 ± 10.0), 1.4 (4.2 ± 5.7) and 0.15 (2.1 ± 3.2) μg/(day kg BW) for the Residual Application, Preflight and Top-of-Descent spraying scenarios, respectively. Exposure estimates were sensitive to parameters corresponding to pesticide deposition, body surface area and weight, surface-to-body transfer efficiencies, and efficiency of adherence to skin. Preflight spray posed 2.0 and 3.1 times higher pesticide exposure risk levels for flight attendants in disinsected aircraft cabins than Top-of-Descent spray and Residual Application, respectively. PMID:24251734

  11. Consumption of cosmetic products by the French population. Third part: Product exposure amount.

    PubMed

    Dornic, N; Ficheux, A S; Roudot, A C

    2017-08-01

    A recent study in France provided valuable data on the frequency and amount of use of cosmetic products (Ficheux et al., 2015, 2016a). The aim of the present study was to generate Product Exposure Amount data, i.e. the amounts of cosmetics applied to the skin among the French population using the raw data collected during the previous enquiry. These data are useful to derive Consumer exposure level data which are paramount for skin sensitization risk assessments. Exposure data were generated for 69 different cosmetics, classified as products for the hair, face, buccal hygiene, hands, feet, body, shaving and depilation, sunscreens as well as products specifically intended for babies. Exposure was calculated using a probabilistic Monte Carlo method. The main strength of this work was the break-down of data by age and sex. The results showed that some data used by the International Fragrance Association in skin sensitization risk assessments, in particular facial care products and deodorants, could be unsuitable for the protection of French consumers. For the first time, data were also generated for products intended for babies' nappy area. These data will be useful for the implementation of the Quantitative Risk Assessment for skin sensitization among the French population. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures, where components are removed for cause or after a set operating time in the system. Issues of liability and the cost of component removal become of paramount importance. Deterministic design with factors of safety and probabilistic design each address, but lack, the essential characteristics needed for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost. Many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are bookkept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific member of that class cannot be predicted.

  13. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews about their difficulties in solving the problems. These data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories. The first relates to difficulties in understanding the probabilistic problem; the second, to choosing and using appropriate strategies for solving it; and the third, to the computational process involved in solving it. The results suggest that students still have difficulties in solving probabilistic problems, meaning they are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  14. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, and domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed in which all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes, thereby enabling the straightforward inclusion of the probabilistic features into present codes.
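
    For orientation, the deterministic functional that the PHWVP randomizes treats the displacement u, strain ε and stress σ as independent fields. A standard small-strain linear-elastic form (written here for reference, not copied from the paper) is:

```latex
\Pi_{HW}(u,\varepsilon,\sigma) =
  \int_{\Omega} \left[ \tfrac{1}{2}\,\varepsilon : C : \varepsilon
    + \sigma : \left( \nabla^{s} u - \varepsilon \right)
    - b \cdot u \right] d\Omega
  \;-\; \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
```

    Stationarity with respect to σ enforces compatibility (ε = ∇^s u), with respect to ε the constitutive law (σ = C : ε), and with respect to u equilibrium with the body force b and prescribed traction t̄. In the PHWVP, C, b, t̄ and the domain data become random variables or fields.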

  15. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River delta region in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with maximum sustained wind speeds of >= 34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed central Vietnam as a category 2 typhoon, causing significant damage to property. As tropical cyclone risk is expected to increase with growth in exposure and population along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  16. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group, and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained the additive. Since each of the three model components allowed two possible modes of input, the validity of eight (2^3) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distribution of intake estimates from the models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
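
    One of the eight (2^3) model combinations can be sketched directly: lognormal food intake, Bernoulli presence of the additive (entered as the per cent of brands containing it), and lognormal concentration, multiplied per simulated consumer and compared with the conservative MPL-based point estimate. All parameter values are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000  # simulated consumers

# Component 1: daily intake of the food group (g/day), lognormal mode.
food_intake = rng.lognormal(mean=np.log(150.0), sigma=0.5, size=n)

# Component 2: additive presence, entered as per cent of brands.
present = rng.random(n) < 0.35

# Component 3: additive concentration (mg/kg food), lognormal mode.
conc = rng.lognormal(mean=np.log(40.0), sigma=0.4, size=n)

intake_mg = food_intake / 1000.0 * conc * present  # mg/day per person

# Conservative point estimate: high-percentile food intake combined
# with the maximum permitted level (MPL) in all foods.
mpl = 100.0  # mg/kg, invented
conservative = np.percentile(food_intake, 97.5) / 1000.0 * mpl

print(f"modelled P97.5 intake:   {np.percentile(intake_mg, 97.5):6.2f} mg/day")
print(f"conservative (MPL) est.: {conservative:6.2f} mg/day")
```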

  17. Bayesian Estimation of Small Effects in Exercise and Sports Science.

    PubMed

    Mengersen, Kerrie L; Drovandi, Christopher C; Robert, Christian P; Pyne, David B; Gore, Christopher J

    2016-01-01

    The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens, live high-train low (LHTL) and intermittent hypoxic exposure (IHE), on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and was able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy, and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using the 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.

  18. Importance of a canteen lunch on the dietary intake of acrylamide.

    PubMed

    Mestdagh, Frédéric; Lachat, Carl; Baert, Katleen; Moons, Emmanuelle; Kolsteren, Patrick; Van Peteghem, Carlos; De Meulenaer, Bruno

    2007-05-01

    A food and drink intake survey was carried out among university students and staff members. Consumption data were collected on days when the participants took hot lunch in a university canteen. The dietary acrylamide exposure was calculated through a probabilistic approach and revealed a median intake of 0.40 microg/kg bw/day [90% confidence interval: 0.36-0.44], which is in accordance with previous exposure calculations. Biscuits (35.4%), French fries (29.9%), bread (23.5%), and chocolate (11.2%) were identified to be the main sources of dietary acrylamide. Foodstuffs consumed in between the three main meals of the day (so called snack type foods) contributed the most to the intake (42.2%). The exposure was lower in an intervention group which received free portions of fruit and vegetables, indicating that a nutritionally balanced diet may contribute to a decreased acrylamide intake. French fries had a significant impact on the acrylamide intake, due to the frequent consumption in the canteen. This demonstrates the important responsibility of caterers and canteen kitchens in the mitigation of acrylamide exposure through reduction of acrylamide in their prepared products, in particular in French fries.

  19. Risk assessment for adult butterflies exposed to the mosquito control pesticide naled

    USGS Publications Warehouse

    Bargar, Timothy A.

    2012-01-01

    A prospective risk assessment was conducted for adult butterflies potentially exposed to the mosquito control insecticide naled. Published acute mortality data, exposure data collected during field studies, and morphometric data (total surface area and fresh body weight) for adult butterflies were combined in a probabilistic estimate of the likelihood that adult butterfly exposure to naled following aerial applications would exceed levels associated with acute mortality. Adult butterfly exposure was estimated based on the product of (1) naled residues on samplers and (2) an exposure metric that normalized the total surface area of adult butterflies to their fresh weight. The likelihood that the 10th percentile refined effect estimate for adult butterflies exposed to naled would be exceeded following aerial naled applications was 67 to 80%. The greatest risk would be for butterflies in the family Lycaenidae, and the lowest risk would be for those in the family Hesperiidae, assuming equivalent sensitivity to naled. A range of potential guideline naled deposition levels is presented that, if not exceeded, would reduce the risk of adult butterfly mortality. The results of this risk assessment were compared with other risk estimates for butterflies, and the implications for adult butterflies in areas targeted by aerial naled applications are discussed.

  20. Use of computer models to assess exposure to agricultural chemicals via drinking water.

    PubMed

    Gustafson, D I

    1995-10-27

    Surveys of drinking water quality throughout the agricultural regions of the world have revealed the tendency of certain crop protection chemicals to enter water supplies. Fortunately, the trace concentrations that have been detected are generally well below the levels thought to have any negative impact on human health or the environment. However, the public expects drinking water to be pristine and seems willing to bear the costs involved in further regulating agricultural chemical use in such a way as to eliminate the potential for such materials to occur at any detectable level. Of all the tools available to assess exposure to agricultural chemicals via drinking water, computer models are among the most cost-effective. Although not sufficiently predictive to be used in the absence of any field data, such computer programs can be used with some degree of certainty to perform quantitative extrapolations and thereby quantify regional exposure from field-scale monitoring information. Specific models and modeling techniques for performing such exposure analyses are discussed. Improvements in computer technology have recently made it practical to use Monte Carlo and other probabilistic techniques as routine tools for estimating human exposure. Such methods make it possible, at least in principle, to prepare exposure estimates with known confidence intervals and sufficient statistical validity to be used in the regulatory management of agricultural chemicals.

  1. Contribution of inorganic arsenic sources to population exposure risk on a regional scale.

    PubMed

    Chou, Wei-Chun; Chen, Jein-Wen; Liao, Chung-Min

    2016-07-01

    Chronic exposure to inorganic arsenic (iAs) in the human population is associated with various internal cancers and other adverse outcomes. The purpose of this study was to estimate a population-scale exposure risk attributable to iAs consumption by linking a stochastic physiologically based pharmacokinetic (PBPK) model and biomonitoring data for iAs in urine. The urinary As concentrations were obtained from a total of 1,043 subjects living in an industrial area of Taiwan. The results showed that the study subjects had an iAs exposure risk of 27 % (the daily iAs intake of 27 % of study subjects exceeded the WHO-recommended value of 2.1 μg iAs day^-1 kg^-1 body weight). Moreover, drinking water and cooked rice contributed to the iAs exposure risk by 10 and 41 %, respectively. The predicted risks for the mid-range exposures reported in the literature were 4.82, 27.21, 34.69, and 64.17 % for Mexico, Taiwan (this study), Korea, and Bangladesh, respectively. In conclusion, we developed a population-scale risk model that covers a broad range of iAs exposure by integrating stochastic PBPK modeling and reverse dosimetry to generate the probabilistic distribution of As intake corresponding to the urinary As measured in the cohort study. The model can also be updated as new urinary As information becomes available.

  2. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-integrated Integral Risk Index (IRI) to assess human health risks in areas with environmental pollutants present. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved to allow the entry of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals increased notably once a probability density function was used to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not represent a relevant source of health risk for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.

  3. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
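
    The Noisy-OR variant has a particularly compact form: if independent knowledge sources support an interaction with strengths q_i, the prior probability of the corresponding edge is one minus the product of the failure probabilities. A minimal sketch with invented support values:

```python
def noisy_or(supports, leak=0.0):
    """P(edge) = 1 - (1 - leak) * prod(1 - q_i) over source supports."""
    p_off = 1.0 - leak
    for q in supports:
        p_off *= 1.0 - q
    return 1.0 - p_off

# Support for one candidate interaction from three knowledge sources
# (say, a pathway database, GO similarity, and protein domain data);
# the values are invented for illustration.
print(f"prior edge probability: {noisy_or([0.6, 0.3, 0.1]):.3f}")
```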

  4. Incorporating High-Throughput Exposure Predictions With Dosimetry-Adjusted In Vitro Bioactivity to Inform Chemical Toxicity Testing

    PubMed Central

    Wetmore, Barbara A.; Wambaugh, John F.; Allen, Brittany; Ferguson, Stephen S.; Sochaski, Mark A.; Setzer, R. Woodrow; Houck, Keith A.; Strope, Cory L.; Cantwell, Katherine; Judson, Richard S.; LeCluyse, Edward; Clewell, Harvey J.; Thomas, Russell S.; Andersen, Melvin E.

    2015-01-01

    We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast HTS data by translating in vitro bioactivity concentrations to the oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compared against regulatory exposure estimates, providing an activity-to-exposure ratio (AER) useful for a risk-based ranking strategy. As ToxCast efforts expand (i.e., Phase II) beyond food-use pesticides toward a wider chemical domain that lacks exposure and toxicity information, prediction tools become increasingly important. In this study, in vitro hepatic clearance and plasma protein binding were measured to estimate OEDs for a subset of Phase II chemicals. OEDs were compared against high-throughput (HT) exposure predictions generated using probabilistic modeling and Bayesian approaches developed by the U.S. Environmental Protection Agency (EPA) ExpoCast program. This approach incorporated chemical-specific use and national production volume data with biomonitoring data to inform the exposure predictions. The HT exposure modeling approach provided predictions for all Phase II chemicals assessed in this study, whereas estimates from regulatory sources were available for only 7% of chemicals. Of the 163 chemicals assessed in this study, 3 and 13 chemicals possessed AERs < 1 and < 100, respectively. Diverse bioactivities across a range of assays and concentrations were also noted across the wider chemical space surveyed. The availability of HT exposure estimation and bioactivity screening tools provides an opportunity to incorporate a risk-based strategy for use in testing prioritization. PMID:26251325

  5. Modeling population exposures to silver nanoparticles present in consumer products

    NASA Astrophysics Data System (ADS)

    Royce, Steven G.; Mukherjee, Dwaipayan; Cai, Ting; Xu, Shu S.; Alexander, Jocelyn A.; Mi, Zhongyuan; Calderon, Leonardo; Mainelis, Gediminas; Lee, KiBum; Lioy, Paul J.; Tetley, Teresa D.; Chung, Kian Fan; Zhang, Junfeng; Georgopoulos, Panos G.

    2014-11-01

    Exposures of the general population to manufactured nanoparticles (MNPs) are expected to keep rising due to increasing use of MNPs in common consumer products (PEN 2014). The present study focuses on characterizing ambient and indoor population exposures to silver MNPs (nAg). For situations where detailed, case-specific exposure-related data are not available, as in the present study, a novel tiered modeling system, Prioritization/Ranking of Toxic Exposures with GIS (geographic information system) Extension (PRoTEGE), has been developed: it employs a product life cycle analysis (LCA) approach coupled with basic human life stage analysis (LSA) to characterize potential exposures to chemicals of current and emerging concern. The PRoTEGE system has been implemented for ambient and indoor environments, utilizing available MNP production, usage, and properties databases, along with laboratory measurements of potential personal exposures from consumer spray products containing nAg. Modeling of environmental and microenvironmental levels of MNPs employs probabilistic material flow analysis combined with product LCA to account for releases during manufacturing, transport, usage, disposal, etc. Human exposure and dose characterization further employ screening microenvironmental modeling and intake fraction methods combined with LSA for potentially exposed populations, to assess differences associated with gender, age, and demographics. Population distributions of intakes, estimated using the PRoTEGE framework, are consistent with published individual-based intake estimates, demonstrating that PRoTEGE is capable of capturing realistic exposure scenarios for the US population. Distributions of intakes are also used to calculate biologically relevant population distributions of uptakes and target tissue doses through human airway dosimetry modeling that takes into account product MNP size distributions and age-relevant physiological parameters.

  6. Socioeconomic Status and Childhood Cancer Incidence: A Population-Based Multilevel Analysis.

    PubMed

    Kehm, Rebecca D; Spector, Logan G; Poynter, Jenny N; Vock, David M; Osypuk, Theresa L

    2018-05-01

    The etiology of childhood cancers remains largely unknown, especially regarding environmental and behavioral risk factors. Unpacking the association between socioeconomic status (SES) and incidence may offer insight into such etiology. We tested associations between SES and childhood cancer incidence in a population-based case-cohort study (source cohort: Minnesota birth registry, 1989-2014). Cases, ages 0-14 years, were linked from the Minnesota Cancer Surveillance System to birth records through probabilistic record linkage. Controls were 4:1 frequency matched on birth year (2,947 cases and 11,907 controls). We tested associations of individual-level (maternal education) and neighborhood-level (census tract composite index) SES using logistic mixed models. In crude models, maternal education was positively associated with incidence of acute lymphoblastic leukemia (odds ratio (OR) = 1.10, 95% confidence interval (CI): 1.02, 1.19), central nervous system tumors (OR = 1.12, 95% CI: 1.04, 1.21), and neuroblastoma (OR = 1.15, 95% CI: 1.02, 1.30). Adjustment for established risk factors, including race/ethnicity, maternal age, and birth weight, substantially attenuated these positive associations. Similar patterns were observed for neighborhood-level SES. Conversely, higher maternal education was inversely associated with hepatoblastoma incidence (adjusted OR = 0.70, 95% CI: 0.51, 0.98). Overall, beyond the social patterning of established demographic and pregnancy-related exposures, SES is not strongly associated with childhood cancer incidence.

  7. AN ALTERNATIVE METHOD FOR ESTABLISHING TEFS FOR DIOXIN-LIKE COMPOUNDS. PART 1. EVALUATION OF DECISION ANALYSIS METHODS FOR USE IN WEIGHTING RELATIVE POTENCY DATA

    EPA Science Inventory

    A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...

  8. An assessment of health risks associated with arsenic exposure via consumption of homegrown vegetables near contaminated glassworks sites.

    PubMed

    Uddh-Söderberg, Terese E; Gunnarsson, Sara J; Hogmalm, K Johan; Lindegård, M I Boel G; Augustsson, Anna L M

    2015-12-01

    The health risk posed by arsenic in vegetables grown in private gardens near 22 contaminated glassworks sites was investigated in this study. Firstly, vegetable (lettuce and potato) and soil samples were collected and arsenic concentrations measured to characterize the arsenic uptake in the selected crops. Secondly, a probabilistic exposure assessment was conducted to estimate the average daily intake (ADIveg), which was then evaluated against toxicological reference values by the calculation of hazard quotients (HQs) and cancer risks (CRs). The results show that elevated arsenic concentrations in residential garden soils are mirrored by elevated concentrations in vegetables, and that consumption of these vegetables alone may result in an unacceptable cancer risk; the calculated reasonable maximum exposure, for example, corresponded to a cancer incidence 20 times higher than the stated tolerance limit. However, the characterization of risk depends to a great extent on which toxicological reference value is used for comparison, as well as on how the exposure is determined. Based on the assumptions made in the present study, the threshold levels for chronic non-carcinogenic or acute effects were not exceeded, but the indicated cancer risks highlight the need for further exposure studies, as dietary intake involves more than just homegrown vegetables and total exposure is a function of more than just one exposure pathway. In addition, glassworks sites, and contaminated sites in general, contain multiple contaminants, affecting the final and total risk. Copyright © 2015. Published by Elsevier B.V.
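
    The risk characterization step reduces to two standard formulas: the hazard quotient HQ = ADI / RfD for non-carcinogenic effects and the incremental cancer risk CR = ADI x SF. The sketch below applies them to a Monte Carlo intake distribution; the distribution parameters are invented, and the arsenic toxicity values, although close to commonly cited ones, are used here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000

# Probabilistic average daily intake of arsenic from homegrown
# vegetables (mg/kg bw/day); invented lognormal parameters.
adi = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=n)

rfd = 3e-4  # oral reference dose, mg/kg/day (illustrative)
sf = 1.5    # oral cancer slope factor, (mg/kg/day)^-1 (illustrative)

hq = adi / rfd  # hazard quotient for non-cancer effects
cr = adi * sf   # incremental lifetime cancer risk

print(f"P(HQ > 1):    {(hq > 1).mean():.2%}")
print(f"median CR:    {np.median(cr):.1e}")
print(f"P(CR > 1e-5): {(cr > 1e-5).mean():.2%}")
```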

  9. Probabilistic simulation of the human factor in structural reliability

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Many structural failures have been attributed to human factors in engineering design, analysis, maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Factors such as societal, physical, professional, and psychological ones, among many others, introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and the associated uncertainties in structural reliability requires: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with human performance. This approach consists of a multifactor model in conjunction with direct Monte Carlo simulation.

  10. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  11. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  12. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components, such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. An analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk, at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard, were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems in Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
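    The regression step described above can be sketched as an ordinary least squares fit of log-losses on hazard, exposure and vulnerability drivers. The data below are synthetic and the coefficients assumed; the sketch only shows the mechanics of quantifying the weight of each risk component.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 9000  # roughly the number of events analysed

# Synthetic event-level data standing in for the GIS-derived footprints
# (hazard intensity, exposed population, poverty and governance proxies).
log_intensity = rng.normal(0.0, 1.0, n)
log_exposure  = rng.normal(0.0, 1.0, n)
poverty       = rng.uniform(0, 1, n)
governance    = rng.uniform(0, 1, n)

# Assumed data-generating process for log-losses (illustrative only).
log_loss = (1.2 * log_intensity + 0.9 * log_exposure
            + 1.5 * poverty - 0.8 * governance + rng.normal(0, 0.5, n))

# Ordinary least squares: quantify the weight of each risk component.
X = np.column_stack([np.ones(n), log_intensity, log_exposure,
                     poverty, governance])
beta, *_ = np.linalg.lstsq(X, log_loss, rcond=None)
for name, b in zip(["intercept", "log intensity", "log exposure",
                    "poverty", "governance"], beta):
    print(f"{name:>14}: {b:+.3f}")
```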

  13. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study focuses on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned among parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the input data, and its effect on model results has been assessed by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of employing MERLIN-Expo in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
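    The Morris screening method mentioned above can be illustrated with a minimal elementary-effects implementation on a toy exposure model. The model, factor names and all numbers below are assumptions for illustration, not MERLIN-Expo inputs or outputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def exposure_model(x):
    """Toy stand-in for a body-burden model: x = (half-life scale,
    body-weight scale, lipid fraction, intake rate), all in [0, 1]."""
    half_life, body_wt, lipid, intake = x
    return intake * lipid * np.exp(2.0 * half_life) / (0.5 + body_wt)

k, r, delta = 4, 50, 0.25          # factors, trajectories, step size
effects = [[] for _ in range(k)]

for _ in range(r):
    x = rng.uniform(0, 1 - delta, k)       # random base point
    y0 = exposure_model(x)
    for i in rng.permutation(k):           # one-at-a-time steps
        x_new = x.copy()
        x_new[i] += delta
        y1 = exposure_model(x_new)
        effects[i].append((y1 - y0) / delta)
        x, y0 = x_new, y1

for name, e in zip(["half-life", "body weight", "lipid", "intake"],
                   effects):
    e = np.asarray(e)
    # mu* (mean |effect|) ranks importance; sigma flags interactions.
    print(f"{name:>11}: mu*={np.abs(e).mean():6.2f}  sigma={e.std():6.2f}")
```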

  14. Physical Exposures, Work Tasks, and OSHA-10 Training Among Temporary and Payroll Construction Workers.

    PubMed

    Caban-Martinez, Alberto J; Santiago, Katerina M; Stillman, Jordan; Moore, Kevin J; Sierra, Danielle A; Chalmers, Juanita; Baniak, Melissa; Jordan, Melissa M

    2018-04-01

    We characterize and compare the self-reported physical exposures, work tasks, and OSHA-10 training in a non-probabilistic sample of temporary and payroll construction workers. In June 2016, a total of 250 payroll and temporary general laborers employed at Florida construction sites completed a survey at the job site as part of the Falls Reported Among Minority Employees (FRAME) study. Workers employed through temp agencies were significantly more likely than payroll workers to report moving or lifting materials heavier than 100 pounds (57.1% versus 38.5%; P < 0.01). Temporary construction workers with 10-hour OSHA training spent significantly less time on intense hand use/awkward hand postures than temporary workers without 10-hour OSHA training (22.2% versus 46.9%; P = 0.048). Overall, temporary construction workers with OSHA 10-hour training reported fewer hazardous physical postures than workers without the same training.

  15. Probabilistic classifiers with high-dimensional data

    PubMed Central

    Kim, Kyung In; Simon, Richard

    2011-01-01

    For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small n, large p classification problems, despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers - well-calibratedness and refinement - and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not “anticonservative”, using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
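    A minimal sketch of the calibration (well-calibratedness) check discussed above: bin the predicted probabilities and compare each bin's mean prediction with the observed event rate. The simulated classifier and data are hypothetical; the paper's actual evaluation measures and cross-validation scheme are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# Simulated two-class problem: true class probabilities p, outcomes y,
# and a classifier whose scores are deliberately overconfident.
p_true = rng.uniform(0, 1, n)
y = rng.random(n) < p_true
p_hat = np.clip(0.5 + 1.4 * (p_true - 0.5), 0.01, 0.99)  # overconfident

# Calibration check: within each predicted-probability bin, the
# observed event rate should match the mean predicted probability.
bins = np.linspace(0, 1, 11)
idx = np.digitize(p_hat, bins) - 1
print(" bin   mean p_hat   observed rate    n")
for b in range(10):
    m = idx == b
    if m.sum() == 0:
        continue
    print(f"{b:4d}   {p_hat[m].mean():10.3f}   {y[m].mean():13.3f}"
          f"   {m.sum():4d}")
```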

  16. Communicating weather forecast uncertainty: Do individual differences matter?

    PubMed

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
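    The expected-value advice condition described above can be sketched in a few lines: treat the roads whenever the probability of freezing times the penalty exceeds the certain treatment cost. The payoff values below are hypothetical, not those used in the studies.

```python
def expected_value_advice(p_freeze: float,
                          treatment_cost: float = 1_000.0,
                          penalty: float = 6_000.0) -> str:
    """Advise treating the roads when the expected penalty from icy
    conditions exceeds the certain cost of treatment. Costs here are
    hypothetical; the studies' actual payoffs may differ."""
    expected_penalty = p_freeze * penalty
    return "treat roads" if expected_penalty > treatment_cost else "do nothing"

# With these payoffs the break-even probability is 1000/6000 ~= 16.7%.
for p in (0.05, 0.15, 0.30, 0.60):
    print(f"P(freeze) = {p:.2f} -> {expected_value_advice(p)}")
```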

  17. Classical symmetric fourth degree potential systems in probabilistic evolution theoretical perspective: Most facilitative conicalization and squarification of telescope matrices

    NASA Astrophysics Data System (ADS)

    Gözükırmızı, Coşar; Kırkın, Melike Ebru

    2017-01-01

    Probabilistic evolution theory (PREVTH) provides a powerful framework for the solution of initial value problems of explicit ordinary differential equation sets with second-degree multinomial right-hand-side functions. The use of the recursion between squarified telescope matrices provides the opportunity to obtain accurate results without much effort. Convergence may be considered one of the drawbacks of PREVTH. It is related to many factors, of which the initial values and the coefficients in the right-hand-side functions are the most apparent. If a space extension is utilized before PREVTH, the convergence of PREVTH may also be affected by how the space extension is performed. There are previous works on implementations related to probabilistic evolution and on improving convergence by methods such as analytic continuation; these works were written before squarification was introduced. Since the recursion between squarified telescope matrices has given us the opportunity to obtain results corresponding to relatively higher truncation levels, it is important to obtain and analyze results for certain problems in different areas of engineering. This manuscript may be considered part of a series of papers and conference proceedings serving this purpose.

  18. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    NASA Astrophysics Data System (ADS)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) have developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located about 5 km from the southern boundary of Budapest. The quake caused serious damage in the epicentral area and in the southern districts of the capital. The epicentral area of the earthquake is located along the Danube River. Sand boils were observed in some locations, indicating the occurrence of liquefaction. Because their exact locations were recorded at the time of the earthquake, in situ geotechnical measurements (CPT and SPT) could be performed at two sites (Dunaharaszti and Taksony). The different types of measurements enabled probabilistic liquefaction hazard computations at the two studied sites. We have compared the return periods of liquefaction computed using the different built-in simplified stress-based methods.
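    The core of the performance-based calculation described above can be sketched by combining hazard-curve rate increments with a conditional probability of liquefaction, summed over PGA and magnitude bins. The hazard numbers and the liquefaction model below are toy stand-ins, not the Dunaharaszti hazard model or the Cetin/Boulanger-Idriss relations.

```python
import numpy as np

# Hypothetical seismic hazard disaggregation: annual exceedance rates
# for a grid of PGA levels, split across magnitude bins (illustrative
# numbers only).
pga = np.array([0.05, 0.10, 0.20, 0.40])            # g
annual_rate = np.array([1e-2, 4e-3, 1e-3, 2e-4])    # lambda(PGA > a)
mags = np.array([5.0, 6.0, 7.0])
mag_weights = np.array([0.6, 0.3, 0.1])             # from disaggregation

def p_liquefaction(pga_g, mag, crr=0.15):
    """Toy conditional probability of liquefaction given shaking;
    stands in for the SPT/CPT-based probabilistic models."""
    csr = 0.65 * pga_g * (mag / 7.5) ** 1.5   # crude magnitude scaling
    return 1.0 / (1.0 + np.exp(-8.0 * (csr - crr)))

# Rate increments per PGA bin (discrete derivative of the hazard curve).
d_lambda = -np.diff(np.append(annual_rate, 0.0))

# lambda_liq = sum over PGA and M of P(liq | a, m) * weight * d_lambda
lam = sum(p_liquefaction(a, m) * w * dl
          for a, dl in zip(pga, d_lambda)
          for m, w in zip(mags, mag_weights))
print(f"Annual rate of liquefaction: {lam:.2e}"
      f"  (return period ~ {1 / lam:,.0f} years)")
```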

  19. Approximate probabilistic cellular automata for the dynamics of single-species populations under discrete logisticlike growth with and without weak Allee effects.

    PubMed

    Mendonça, J Ricardo G; Gevorgyan, Yeva

    2017-05-01

    We investigate one-dimensional elementary probabilistic cellular automata (PCA) whose dynamics in first-order mean-field approximation yields discrete logisticlike growth models for a single-species unstructured population with nonoverlapping generations. Beginning with a general six-parameter model, we find constraints on the transition probabilities of the PCA that guarantee that the ensuing approximations make sense in terms of population dynamics and classify the valid combinations thereof. Several possible models display a negative cubic term that can be interpreted as a weak Allee factor. We also investigate the conditions under which a one-parameter PCA derived from the more general six-parameter model can generate valid population growth dynamics. Numerical simulations illustrate the behavior of some of the PCA found.
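    A minimal sketch of the PCA-versus-mean-field comparison underlying the paper: simulate a hypothetical totalistic probabilistic cellular automaton (occupation probability depending only on the two nearest neighbours, simpler than the elementary PCA actually studied) and compare its stationary density with the fixed point of the corresponding first-order mean-field map, a discrete logistic-like recursion.

```python
import numpy as np

rng = np.random.default_rng(3)
L, T = 10_000, 50

# Transition probabilities of a hypothetical totalistic PCA:
# probability that a cell is occupied given the number of occupied
# cells (0, 1 or 2) among its two nearest neighbours.
p_occ = {0: 0.0, 1: 0.8, 2: 0.55}

state = (rng.random(L) < 0.01).astype(np.int8)
densities = []
for _ in range(T):
    neigh = np.roll(state, 1) + np.roll(state, -1)
    p = np.choose(neigh, [p_occ[0], p_occ[1], p_occ[2]])
    state = (rng.random(L) < p).astype(np.int8)
    densities.append(state.mean())

# First-order mean-field approximation: treat neighbours as independent
# with density rho, giving a discrete logistic-like growth map.
rho = 0.01
for _ in range(T):
    rho = (p_occ[0] * (1 - rho) ** 2 + p_occ[1] * 2 * rho * (1 - rho)
           + p_occ[2] * rho ** 2)
print(f"PCA stationary density ~ {densities[-1]:.3f},"
      f" mean-field fixed point ~ {rho:.3f}")
```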

  20. Probabilistic Structures Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The basic formulation for probabilistic finite element analysis is described and demonstrated on a few sample problems. This formulation is based on iterative perturbation, which uses the factorized stiffness of the unperturbed system as the iteration preconditioner for obtaining the solution to the perturbed problem. This approach eliminates the need to compute, store, and manipulate explicit partial derivatives of the element matrices and force vector, which not only reduces memory usage considerably but also greatly simplifies the coding and validation tasks. All aspects of the proposed formulation were combined in a demonstration problem using a simplified model of a curved turbine blade discretized with 48 shell elements, having random pressure and temperature fields with partial correlation, random uniform thickness, and random stiffness at the root.
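    The iterative perturbation scheme described above can be sketched as follows: factorize the unperturbed stiffness once and reuse the factorization as a preconditioner when solving each perturbed system, so no derivative matrices are ever formed. The matrices below are random stand-ins for finite element stiffnesses.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(5)
n = 200

# Unperturbed SPD "stiffness" K0 and a small symmetric perturbation dK.
A = rng.normal(size=(n, n))
K0 = A @ A.T + n * np.eye(n)
B = rng.normal(size=(n, n))
dK = 0.05 * (B + B.T) / 2
f = rng.normal(size=n)

# Factorize K0 once; reuse it as the preconditioner for the
# perturbed system.
factor = cho_factor(K0)

u = cho_solve(factor, f)            # start from the unperturbed solution
for it in range(50):
    r = f - (K0 + dK) @ u           # residual of the perturbed system
    du = cho_solve(factor, r)       # preconditioned correction
    u += du
    if np.linalg.norm(du) < 1e-12 * np.linalg.norm(u):
        break

exact = np.linalg.solve(K0 + dK, f)
print(f"iterations: {it + 1}, error: {np.linalg.norm(u - exact):.2e}")
```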

  1. Probabilistic assessment of roadway departure risk in a curve

    NASA Astrophysics Data System (ADS)

    Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.

    2011-10-01

    Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though a strict policy against speeding has helped reduce accidents, other factors obviously exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled as random variables. These parameters are selected through a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Structural reliability methods are then employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement warning systems alerting the driver in case of a dangerous situation.

  2. A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’

    PubMed Central

    2017-01-01

    Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. PMID:29593362

  3. Risk trade-offs in fish consumption: a public health perspective.

    PubMed

    Rheinberger, Christoph M; Hammitt, James K

    2012-11-20

    Fish consumption advisories instruct vulnerable consumers to avoid high mercury fish and to limit total fish intake to reduce neurotoxic risk. Consumption data from the U.S. suggest that nontarget consumers also respond to such advice. These consumers reduce exposure to mercury and other toxicants at the cost of reduction in cardioprotective fatty acids. We present a probabilistic model to assess these risk trade-offs. We use NHANES consumption data to simulate exposure to contaminants and nutrients in fish, employ dose-response relationships to convert exposure to health end points, and monetize them using benefit transfer. Our results suggest that newborns gained on average 0.033 IQ points from their mothers' compliance with the prominent FDA/EPA advisory. The welfare gain for a birth cohort is estimated at $386 million. This gain could be fully offset by increments in cardiovascular risk if 0.6% of consumers aged 40 and older reduced fish intake by one monthly meal until they reached the age of 60 or if 0.1% of them permanently reduced fish intake.

  4. Filling gaps in large ecological databases: consequences for the study of global-scale plant functional trait patterns

    NASA Astrophysics Data System (ADS)

    Schrodt, Franziska; Shan, Hanhuai; Fazayeli, Farideh; Karpatne, Anuj; Kattge, Jens; Banerjee, Arindam; Reichstein, Markus; Reich, Peter

    2013-04-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has become progressively more data-intensive. However, in contrast to other disciplines, statistical methods for handling these large data sets, and especially the gaps inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power and to induce biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization-based matrix completion was developed in the recommender-system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Advanced Hierarchical Probabilistic Matrix Factorization (aHPMF), on the other hand, includes climate and soil information in the matrix factorization by regressing the environmental variables against the residuals of the HPMF. One unique opportunity opened up by aHPMF is out-of-sample prediction, where traits can be predicted for specific species at locations different from those sampled in the past. This has potentially far-reaching consequences for the study of global-scale plant functional trait patterns. We test the accuracy and effectiveness of HPMF and aHPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. However, despite unprecedented coverage, the TRY database is still very sparse, severely limiting joint trait analyses. Plant traits are the key to understanding how plants as primary producers adjust to changes in environmental conditions and in turn influence them. Forming the basis for Dynamic Global Vegetation Models (DGVMs), plant traits are also fundamental in global change studies for predicting future ecosystem changes. It is thus imperative that missing data be imputed in as accurate and precise a way as possible. In this study, we show the advantages and disadvantages of applying probabilistic matrix factorization techniques that incorporate hierarchical and environmental information for the prediction of missing plant traits, as compared to conventional imputation techniques such as the complete case and mean approaches. We discuss the implications of using gap-filled data for global-scale studies of plant functional trait-environment relationships, using examples of out-of-sample predictions of foliar nitrogen across several species' ranges and biomes.
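    The core of probabilistic matrix factorization (without HPMF's phylogenetic hierarchy or aHPMF's environmental regression) can be sketched as alternating ridge regressions, which compute the MAP estimate under a Gaussian likelihood with Gaussian priors on the latent factors. The species-by-trait matrix below is synthetic; only the mechanics of gap filling are illustrated.

```python
import numpy as np

rng = np.random.default_rng(11)
n_sp, n_tr, r = 300, 20, 5

# Synthetic low-rank species x trait matrix, 70% missing, standing in
# for a sparse TRY-like table (illustrative, not real trait data).
X = rng.normal(size=(n_sp, r)) @ rng.normal(size=(r, n_tr))
X += 0.1 * rng.normal(size=X.shape)
obs = rng.random(X.shape) < 0.30          # True where a trait was measured

# MAP estimate for probabilistic matrix factorization via alternating
# ridge regressions on the observed entries.
U = 0.1 * rng.normal(size=(n_sp, r))
V = 0.1 * rng.normal(size=(n_tr, r))
lam = 0.1
for _ in range(15):
    for i in range(n_sp):                 # update each species vector
        m = obs[i]
        U[i] = np.linalg.solve(V[m].T @ V[m] + lam * np.eye(r),
                               V[m].T @ X[i, m])
    for j in range(n_tr):                 # update each trait vector
        m = obs[:, j]
        V[j] = np.linalg.solve(U[m].T @ U[m] + lam * np.eye(r),
                               U[m].T @ X[m, j])

rmse = np.sqrt(((U @ V.T - X)[~obs] ** 2).mean())
print(f"RMSE on held-out (gap) entries: {rmse:.3f}")
```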

  5. Taking a gamble or playing by the rules: Dissociable prefrontal systems implicated in probabilistic versus deterministic rule-based decisions

    PubMed Central

    Bhanji, Jamil P.; Beer, Jennifer S.; Bunge, Silvia A.

    2014-01-01

    A decision may be difficult because complex information processing is required to evaluate choices according to deterministic decision rules and/or because it is not certain which choice will lead to the best outcome in a probabilistic context. Factors that tax decision making such as decision rule complexity and low decision certainty should be disambiguated for a more complete understanding of the decision making process. Previous studies have examined the brain regions that are modulated by decision rule complexity or by decision certainty but have not examined these factors together in the context of a single task or study. In the present functional magnetic resonance imaging study, both decision rule complexity and decision certainty were varied in comparable decision tasks. Further, the level of certainty about which choice to make (choice certainty) was varied separately from certainty about the final outcome resulting from a choice (outcome certainty). Lateral prefrontal cortex, dorsal anterior cingulate cortex, and bilateral anterior insula were modulated by decision rule complexity. Anterior insula was engaged more strongly by low than high choice certainty decisions, whereas ventromedial prefrontal cortex showed the opposite pattern. These regions showed no effect of the independent manipulation of outcome certainty. The results disambiguate the influence of decision rule complexity, choice certainty, and outcome certainty on activity in diverse brain regions that have been implicated in decision making. Lateral prefrontal cortex plays a key role in implementing deterministic decision rules, ventromedial prefrontal cortex in probabilistic rules, and anterior insula in both. PMID:19781652

  6. Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Hoffman, William; Sen, Sonat

    2015-10-01

    The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically decrease run times.
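    The probabilistic step layered on top of the deterministic fracture solution can be sketched as a simple Monte Carlo comparison of an uncertain fracture toughness against the computed stress intensity factor. The distribution family and all parameter values below are assumptions for illustration, not values from the report or from FAVOR.

```python
import numpy as np

rng = np.random.default_rng(13)
N = 1_000_000

# Applied stress intensity factor at the flaw during the PTS transient
# (a single deterministic value here, matching the report's limited
# scope; the number itself is hypothetical).
K_applied = 60.0  # MPa*sqrt(m)

# Uncertain fracture toughness K_Ic: embrittlement shifts the mean
# downward; a shifted Weibull distribution is an assumed choice.
shape, scale, loc = 4.0, 50.0, 30.0      # illustrative parameters
K_Ic = loc + scale * rng.weibull(shape, N)

# Probability of crack initiation = P(K_Ic < K_applied).
p_init = np.mean(K_Ic < K_applied)
print(f"P(crack initiation) ~ {p_init:.4f}")
```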

  7. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material and geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms, and Monte Carlo simulation is added as an alternative. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.

  8. Integrated presentation of ecological risk from multiple stressors

    PubMed Central

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-01-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed, as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171

  9. Vitamin D status by sociodemographic factors and body mass index in Mexican women at reproductive age.

    PubMed

    Contreras-Manzano, Alejandra; Villalpando, Salvador; Robledo-Pérez, Ricardo

    2017-01-01

    To describe the prevalence of vitamin D deficiency (VDD) and insufficiency (VDI), and the main dietary sources of vitamin D (VD), in a probabilistic sample of Mexican women of reproductive age participating in Ensanut 2012, stratified by sociodemographic factors and body mass index (BMI) categories. Serum concentrations of 25-hydroxyvitamin D (25-OH-D) were determined using an ELISA technique in 4162 women participating in Ensanut 2012 and classified as VDD, VDI or optimal VD status. Sociodemographic, anthropometric and dietary data were also collected. The association between VDD/VDI and sociodemographic and anthropometric factors was assessed, adjusting for potential confounders, through the estimation of a multinomial logistic regression model. The prevalence of VDD was 36.8%, and that of VDI was 49.8%. The mean dietary intake of VD was 2.56 μg/d. The relative risk ratios (RRR) of VDD or VDI, calculated by a multinomial logistic regression model in the 4162 women, were significantly higher in women with overweight (RRR: 1.85 and 1.44, p<0.05), obesity (RRR: 2.94 and 1.93, p<0.001), urban dwelling (RRR: 1.68 and 1.31, p<0.06), belonging to the 3rd tertile of income (RRR: 5.32 and 2.22, p<0.001), or of indigenous ethnicity (RRR: 2.86 and 1.70, p<0.05), respectively. The high prevalence of VDD/VDI in Mexican women calls for stronger action from the health authorities, strengthening the current policy of food supplementation and recommending a reasonable amount of sun exposure.

  10. Earthquake parametrics based protection for microfinance disaster management in Indonesia

    NASA Astrophysics Data System (ADS)

    Sedayo, M. H.; Damanik, R.

    2017-07-01

    Financial institutions, including microfinance institutions that lend money to individuals, face risk when a catastrophic event hits their area of operation. Liquidity risk arises when withdrawals and non-performing loans (NPL) rise sharply at the same time, straining their cash flow. Products exist in the market that provide backup funds for this kind of situation. Microfinance institutions also need guidelines for making contingency plans within their disaster management programs. We develop a probabilistic seismic hazard map, an index, and a zonation map as tools to support financial disaster-impact reduction programs for microfinance in Indonesia. A ground motion prediction equation (GMPE) was used to estimate PGA at each kabupaten (regency) point. PGA-to-MMI conversion was done by applying an empirical relationship. We used loan distribution data from the Financial Services Authority and Bank Indonesia as the exposure in the indexing. The index levels from this study can be used to rank urgency. The probabilistic hazard map was used to price two backup scenarios and to construct a zonation. We propose three zones, with annual average costs of 0.0684‰, 0.4236‰ and 1.4064‰ for the first scenario and 0.3588‰, 2.6112‰ and 6.0816‰ for the second scenario.

  11. Ecotoxicologically based marine acute water quality criteria for metals intended for protection of coastal areas.

    PubMed

    Durán, I; Beiras, R

    2013-10-01

    Acute water quality criteria (WQC) for the protection of coastal ecosystems are developed on the basis of short-term ecotoxicological data using the most sensitive life stages of representative species from the main taxa of marine water column organisms. A probabilistic approach based on species sensitivity distribution (SSD) curves has been chosen and compared to the WQC obtained by applying an assessment factor to the critical toxicity values, i.e. the 'deterministic' approach. The criteria obtained from HC5 values (5th percentile of the SSD) were 1.01 μg/l for Hg, 1.39 μg/l for Cu, 3.83 μg/l for Cd, 25.3 μg/l for Pb and 8.24 μg/l for Zn. Using sensitive early life stages and very sensitive endpoints allowed calculation of WQC for marine coastal ecosystems. These probabilistic WQC, intended to protect 95% of the species in 95% of the cases, were calculated on the basis of a limited ecotoxicological dataset, avoiding the use of large and uncertain assessment factors. Copyright © 2013 Elsevier B.V. All rights reserved.
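    A minimal sketch of the SSD-based derivation described above: fit a lognormal distribution to species-level toxicity values and read off the 5th percentile (HC5). The toxicity values are invented for illustration; the paper's dataset and fitting details may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical acute toxicity values (ug/L) for one metal across the
# most sensitive life stages of several marine taxa (illustrative,
# not the paper's dataset).
ec50 = np.array([2.1, 3.5, 4.8, 6.0, 9.3, 12.7, 20.4, 35.9, 60.2])

# Fit a lognormal species sensitivity distribution (SSD).
mu, sigma = stats.norm.fit(np.log(ec50))

# HC5: the concentration expected to protect 95% of species.
hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma))
print(f"HC5 = {hc5:.2f} ug/L")

# The deterministic alternative divides the lowest toxicity value by a
# large assessment factor, which is usually far more conservative.
print(f"Deterministic WQC (AF = 100): {ec50.min() / 100:.3f} ug/L")
```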

  12. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.

  13. The Power of Implicit Social Relation in Rating Prediction of Social Recommender Systems

    PubMed Central

    Reafee, Waleed; Salim, Naomie; Khan, Atif

    2016-01-01

    The explosive growth of social networks in recent times has presented a powerful source of information that can be utilized as an extra input for assisting with social recommendation problems. Social recommendation methods based on probabilistic matrix factorization have improved recommendation accuracy and partly solved the cold-start and data sparsity problems. However, these methods exploit only the explicit social relations and almost completely ignore the implicit social relations. In this article, we first propose an algorithm to extract the implicit relations in the undirected graphs of social networks by exploiting link prediction techniques. Furthermore, we propose a new probabilistic matrix factorization method that alleviates the data sparsity problem by incorporating both explicit and implicit friendship. We evaluate our proposed approach on two real datasets, Last.Fm and Douban. The experimental results show that our method performs much better than state-of-the-art approaches, which indicates the importance of incorporating implicit social relations in the recommendation process to address poor prediction accuracy. PMID:27152663

  14. Knowledge gaps in host-parasite interaction preclude accurate assessment of meat-borne exposure to Toxoplasma gondii.

    PubMed

    Crotta, M; Limon, G; Blake, D P; Guitian, J

    2017-11-16

    Toxoplasma gondii is recognized as a widely prevalent zoonotic parasite worldwide. Although several studies have clearly identified meat products as an important source of T. gondii infections in humans, quantitative understanding of the risk posed to humans through the food chain is surprisingly scant. While probabilistic risk assessments for pathogens such as Campylobacter jejuni, Listeria monocytogenes or Escherichia coli are well established, attempts to quantify the probability of human exposure to T. gondii through consumption of food products of animal origin are at an early stage. The biological complexity of the life cycle of T. gondii and limited understanding of several fundamental aspects of the host/parasite interaction require the adoption of numerous critical assumptions and significant simplifications. In this study, we present a hypothetical quantitative model for the assessment of human exposure to T. gondii through meat products. The model has been conceptualized to capture the dynamics leading to the presence of the parasite in meat and, for illustrative purposes, used to estimate the probability of at least one viable cyst occurring in 100 g of fresh pork meat in England. Available data, including the results of a serological survey of pigs raised in England, were used as a starting point to implement a probabilistic model and assess the fate of the parasite along the food chain. Uncertainty distributions were included to describe and account for the lack of knowledge where necessary. To quantify the impact of the key model inputs, sensitivity and scenario analyses were performed. The overall probability of 100 g of a hypothetical edible tissue containing at least 1 cyst was 5.54%. Sensitivity analysis indicated that the variables exerting the greatest effect on the output mean were the number of cysts and the number of bradyzoites per cyst. Under the best and worst scenarios, the probability of a single portion of fresh pork meat containing at least 1 viable cyst was 1.14% and 9.97%, respectively, indicating that the uncertainty and lack of data surrounding key input parameters of the model preclude accurate estimation of T. gondii exposure through consumption of meat products. The hypothetical model conceptualized here is coherent with current knowledge of the biology of the parasite. Simulation outputs clearly identify the key gaps in our knowledge of the host-parasite interaction that, when filled, will support quantitative assessments and much-needed accurate estimates of the risk of human exposure. Copyright © 2017 Elsevier B.V. All rights reserved.
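    The portion-level exposure logic described above can be sketched as a Monte Carlo calculation: sample whether a carcass is infected, sample a cyst burden, and, assuming cysts are randomly distributed through the edible tissue, compute the Poisson probability that a 100 g portion contains at least one cyst. All parameter values below are assumptions, not the study's fitted inputs.

```python
import numpy as np

rng = np.random.default_rng(17)
N = 500_000
portion_g, carcass_g = 100.0, 30_000.0   # portion and edible-tissue mass

# Assumed inputs (illustrative only): prevalence of infection and a
# lognormal cyst burden in infected carcasses.
prevalence = 0.11
infected = rng.random(N) < prevalence
n_cysts = np.where(infected,
                   rng.lognormal(mean=4.0, sigma=1.5, size=N), 0.0)

# If cysts are distributed randomly through the edible tissue, the
# number in a 100 g portion is Poisson with this expected count.
expected_in_portion = n_cysts * portion_g / carcass_g
p_portion = 1.0 - np.exp(-expected_in_portion)   # P(>= 1 cyst | carcass)

print(f"P(portion contains >= 1 cyst) ~ {p_portion.mean():.4%}")
```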

  15. A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.

    Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.

  16. Demonstration of the Application of Composite Load Spectra (CLS) and Probabilistic Structural Analysis (PSAM) Codes to SSME Heat Exchanger Turnaround Vane

    NASA Technical Reports Server (NTRS)

    Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George

    2000-01-01

    This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that has features to model the decay phenomena along the flow and across the flow, with the capability to introduce a phase delay. The analytical results are compared using two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), and with experimentally observed strain gage data. The computer code NESSUS, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis. A fatigue code was used to calculate the fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, the convection velocity coefficient, the stress concentration factor, structural damping, and the thickness of the inner and outer vanes. The need for an appropriate correlation model, in addition to the magnitude of the PSD, is emphasized. The study demonstrates that correlation characteristics, even under random pressure loads, are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternating stress response and drive the fatigue damage for the new design. Since the alternating stress for the redesign is less than the endurance limit for the material, the damage due to high-cycle fatigue is negligible.

  17. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    NASA Technical Reports Server (NTRS)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle main engine blade geometry, using a special-purpose code based on the finite element approach. The analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  18. Prediction of frequency and exposure level of solar particle events.

    PubMed

    Kim, Myung-Hee Y; Hayat, Matthew J; Feiveson, Alan H; Cucinotta, Francis A

    2009-07-01

    For future space missions outside of the Earth's magnetic field, the risk of radiation exposure from solar particle events (SPEs) during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern when designing radiation protection, including the determination of sufficient shielding requirements for astronauts and hardware. While the expected frequency of SPEs is strongly influenced by solar modulation, SPE occurrences themselves are chaotic in nature. We report on a probabilistic modeling approach in which a cumulative expected occurrence curve of SPEs for a typical solar cycle was formed from a non-homogeneous Poisson process model fitted to a database of proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19-23) and of large SPEs identified from impulsive nitrate enhancements in polar ice. From the fitted model, we then estimated the expected frequency of SPEs at any given proton fluence threshold with energy >30 MeV (Phi(30)) during a defined space mission period. Analytic energy spectra of 34 large SPEs observed in the space era were fitted over broad energy ranges extending to GeV, and subsequently used to calculate the distribution of mGy equivalent (mGy-Eq) dose for a typical blood-forming organ (BFO) inside a spacecraft as a function of total Phi(30) fluence. This distribution was combined with a simulation of SPE events using the Poisson model to estimate the probability of the BFO dose exceeding the NASA limit of 250 mGy-Eq per 30 d. These results will be useful in implementing probabilistic risk assessment approaches at NASA and in developing guidelines for protection systems for astronauts on future space exploration missions.
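    A minimal sketch of the final step described above: simulate SPE occurrences from a Poisson model with heavy-tailed per-event BFO doses, then estimate the probability of exceeding the 250 mGy-Eq limit. The event rate and dose distribution below are assumptions; the paper uses a fitted non-homogeneous Poisson model and measured fluence spectra.

```python
import numpy as np

rng = np.random.default_rng(19)
N = 50_000
mission_days = 30.0

# Assumed SPE rate near solar maximum and an assumed lognormal
# per-event BFO dose distribution (illustrative values only).
events_per_year = 12.0
lam = events_per_year * mission_days / 365.25

n_events = rng.poisson(lam, N)
total_dose = np.zeros(N)
for i in np.nonzero(n_events)[0]:
    # Per-event BFO dose in mGy-Eq, heavy-tailed.
    total_dose[i] = rng.lognormal(mean=2.0, sigma=1.6,
                                  size=n_events[i]).sum()

limit = 250.0  # NASA BFO limit, mGy-Eq per 30 d
print(f"P(BFO dose > {limit:.0f} mGy-Eq in 30 d) ~ "
      f"{(total_dose > limit).mean():.4f}")
```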

  19. Environmental risk assessment of polycyclic musks HHCB and AHTN in consumer product chemicals in China.

    PubMed

    Fan, Ming; Liu, Zhengtao; Dyer, Scott; Xia, Pu; Zhang, Xiaowei

    2017-12-01

    An environmental risk assessment (ERA) framework was recently developed for consumer product chemicals in China using a tiered approach, applying an existing Chinese regulatory qualitative method in Tier Zero and then utilizing deterministic and probabilistic methods in Tiers One and Two. The exposure assessment methodology in the framework applied conditions specific to China, including the physical setting, infrastructure, and consumers' habits and practices. Furthermore, two scenarios were identified for quantitatively assessing environmental exposure: (1) urban, with wastewater treatment, and (2) rural, without wastewater treatment (i.e., direct discharge of wastewater). Following a brief discussion of the framework methodology, this paper primarily presents a case study conducted using this new approach for assessing two fragrance chemicals, the polycyclic musks HHCB (Galaxolide, 1,3,4,6,7,8-hexahydro-4,6,6,7,8,8-hexamethylcyclopenta-[gamma]-2-benzopyran) and AHTN (Tonalide, 7-acetyl-1,1,3,4,4,6-hexamethyl-1,2,3,4-tetrahydronaphthalene). Both HHCB and AHTN are widely used as fragrances in a variety of consumer products in China, and occurrences of both compounds have been reported in wastewater influents, effluents, and sludge, in addition to surface water and sediments, across several major metropolitan regions throughout China. This case study illustrated the very conservative nature of Tier Zero, which indicated a high risk potential of the fragrances to receiving-water aquatic communities due to the fragrances' non-ready biodegradability and ecotoxicity profiles. However, the higher-tiered assessments (both deterministic and site-specific probabilistic) demonstrated greater environmental realism, concluding that HHCB and AHTN pose minimal risk, consistent with local monitoring data as well as a recent similar study conducted in the United States. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Cost-Effectiveness of Pre-exposure HIV Prophylaxis During Pregnancy and Breastfeeding in Sub-Saharan Africa

    PubMed Central

    Wheeler, Stephanie B.; Stranix-Chibanda, Lynda; Hosek, Sybil G.; Watts, D. Heather; Siberry, George K.; Spiegel, Hans M. L.; Stringer, Jeffrey S.; Chi, Benjamin H.

    2016-01-01

    Introduction: Antiretroviral pre-exposure prophylaxis (PrEP) for the prevention of HIV acquisition is cost-effective when delivered to those at substantial risk. Despite a high incidence of HIV infection among pregnant and breastfeeding women in sub-Saharan Africa (SSA), a theoretical increased risk of preterm birth on PrEP could outweigh the HIV prevention benefit. Methods: We developed a decision analytic model to evaluate a strategy of daily oral PrEP during pregnancy and breastfeeding in SSA. We approached the analysis from a health care system perspective across a lifetime time horizon. Model inputs were derived from existing literature and local sources. The incremental cost-effectiveness ratio (ICER) of PrEP versus no PrEP was calculated in 2015 U.S. dollars per disability-adjusted life year (DALY) averted. We evaluated the effect of uncertainty in baseline estimates through one-way and probabilistic sensitivity analyses. Results: PrEP administered to pregnant and breastfeeding women in SSA was cost-effective. In a base case of 10,000 women, the administration of PrEP averted 381 HIV infections but resulted in 779 more preterm births. PrEP was more costly per person ($450 versus $117) but resulted in fewer DALYs (3.15 versus 3.49). The ICER of $965/DALY averted was below the recommended regional cost-effectiveness threshold of $6462/DALY. Probabilistic sensitivity analyses demonstrated the robustness of the model. Conclusions: Providing PrEP to pregnant and breastfeeding women in SSA is likely cost-effective, although more data are needed about adherence and safety. For populations at high risk of HIV acquisition, PrEP may be considered as part of a broader combination HIV prevention strategy. PMID:27355502
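    The ICER arithmetic reported above can be reproduced (up to rounding of the published inputs) from the base-case per-person costs and DALYs:

```python
# ICER arithmetic using the base-case values reported above (per-person
# costs and DALYs; threshold from the paper).
cost_prep, cost_no_prep = 450.0, 117.0
daly_prep, daly_no_prep = 3.15, 3.49

icer = (cost_prep - cost_no_prep) / (daly_no_prep - daly_prep)
threshold = 6462.0  # regional cost-effectiveness threshold, $/DALY

print(f"ICER = ${icer:,.0f} per DALY averted")
print("cost-effective" if icer < threshold else "not cost-effective")
```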

  1. Probabilistic pharmacokinetic models of decompression sickness in humans, part 1: Coupled perfusion-limited compartments.

    PubMed

    Murphy, F Gregory; Hada, Ethan A; Doolette, David J; Howle, Laurens E

    2017-07-01

    Decompression sickness (DCS) is a disease caused by gas bubbles forming in body tissues following a reduction in ambient pressure, such as occurs in scuba diving. Probabilistic models for quantifying the risk of DCS are typically composed of a collection of independent, perfusion-limited theoretical tissue compartments which describe gas content or bubble volume within these compartments. It has been previously shown that 'pharmacokinetic' gas content models, with compartments coupled in series, show promise as predictors of the incidence of DCS. The mechanism of coupling can be through perfusion or diffusion. This work examines the application of five novel pharmacokinetic structures with compartments coupled by perfusion to the prediction of the probability and time of onset of DCS in humans. We optimize these models against a training set of human dive trial data consisting of 4335 exposures with 223 DCS cases. Further, we examine the extrapolation quality of the models on an additional set of human dive trial data consisting of 3140 exposures with 147 DCS cases. We find that pharmacokinetic models describe the incidence of DCS for single air bounce dives better than a single-compartment, perfusion-limited model. We further find the U.S. Navy LEM-NMRI98 is a better predictor of DCS risk for the entire training set than any of our pharmacokinetic models. However, one of the pharmacokinetic models we consider, the CS2T3 model, is a better predictor of DCS risk for single air bounce dives and oxygen decompression dives. Additionally, we find that LEM-NMRI98 outperforms CS2T3 on the extrapolation data. Copyright © 2017 Elsevier Ltd. All rights reserved.
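    A minimal sketch of two perfusion-coupled tissue compartments of the kind described above, with DCS risk accumulated through a simple supersaturation hazard. The time constants, dive profile, and risk gain are hypothetical, not the paper's fitted parameters.

```python
import math

# Two tissue compartments coupled in series by perfusion: inert gas
# exchanges blood <-> compartment 1 <-> compartment 2.
k_blood, k_series = 1 / 5.0, 1 / 30.0   # exchange rates, 1/min (assumed)
dt, risk_gain = 0.1, 1e-4               # Euler step (min), hazard gain

def ambient(t_min):
    """Square air bounce dive: 30 min at 2.8 atm abs, then surface."""
    return 2.8 if t_min < 30.0 else 1.0

p1 = p2 = 0.79          # initial inert-gas tensions (atm, N2 at surface)
hazard = 0.0
for step in range(int(240 / dt)):        # 4 h including surface interval
    t = step * dt
    p_art = 0.79 * ambient(t)            # arterial inert-gas tension
    dp1 = k_blood * (p_art - p1) + k_series * (p2 - p1)
    dp2 = k_series * (p1 - p2)
    p1 += dt * dp1
    p2 += dt * dp2
    # Risk accumulates while the slow compartment is supersaturated.
    hazard += dt * risk_gain * max(0.0, p2 - ambient(t))

print(f"P(DCS) ~ {1.0 - math.exp(-hazard):.4f}")
```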

  2. Development of probabilistic multimedia multipathway computer codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; LePoire, D.; Gnanapragasam, E.

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for the evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils), and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include: (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  3. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  4. A probabilistic approach for shallow rainfall-triggered landslide modeling at basin scale. A case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at the basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to the geotechnical and hydrological factors that directly affect slope stability. The natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by treating the soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally varying FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of the parameters' statistical properties. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin and the most probable depth of failure for each soil column. The proposed probabilistic approach for shallow landslide prediction is able to reveal and quantify landslide risk at slopes assessed as stable by simpler deterministic methods.
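    The FOSM step described above can be sketched for an infinite-slope factor of safety: propagate the means and variances of cohesion and friction angle through first-order derivatives and evaluate P(FS < 1) under an assumed normal FS distribution. The slope geometry and parameter statistics below are illustrative, not values from the Rio Mameyes application, and the suction term is omitted for brevity.

```python
import numpy as np
from scipy import stats

# Infinite-slope factor of safety (pore pressure u fixed; no suction):
#   FS = [c + (gamma*z*cos^2(beta) - u)*tan(phi)]
#        / (gamma*z*sin(beta)*cos(beta))
gamma, z, beta, u = 18.0, 2.0, np.radians(35), 5.0  # kN/m^3, m, rad, kPa

def fs(c, phi):
    num = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
    return num / (gamma * z * np.sin(beta) * np.cos(beta))

# Means and standard deviations of the random soil parameters
# (illustrative values; treated as independent here).
mc, sc = 8.0, 2.0                       # cohesion, kPa
mphi, sphi = np.radians(30), np.radians(3)

# FOSM: first-order mean and variance from finite-difference gradients.
f0 = fs(mc, mphi)
dc = (fs(mc + 1e-5, mphi) - f0) / 1e-5
dphi = (fs(mc, mphi + 1e-5) - f0) / 1e-5
var = (dc * sc) ** 2 + (dphi * sphi) ** 2

# Assume FS is (approximately) normally distributed.
p_fail = stats.norm.cdf(1.0, loc=f0, scale=np.sqrt(var))
print(f"mean FS = {f0:.2f}, sd = {np.sqrt(var):.2f},"
      f" P(FS < 1) = {p_fail:.3f}")
```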

  5. Spatially explicit exposure assessment for small streams in catchments of the orchard growing region 'Lake Constance'

    NASA Astrophysics Data System (ADS)

    Golla, B.; Bach, M.; Krumpe, J.

    2009-04-01

    1. Introduction. Small streams differ greatly from the standardised water body used in aquatic risk assessment for the regulation of plant protection products in Germany. The standard water body is static, with a depth of 0.3 m and a width of 1.0 m. No dilution or water replacement takes place. Spray drift is always assumed to occur in the direction of the water body, with no variability in the drift deposition rate (90th percentile spray drift deposition values [2]) and no spray drift filtering by vegetation. The application takes place directly adjacent to the water body. In order to establish a more realistic risk assessment procedure, the Federal Office for Consumer Protection and Food Safety (BVL) and the Federal Environment Agency (UBA) agreed to replace deterministic assumptions with data distributions and spatially explicit data and to introduce probabilistic methods [3, 4, 5]. To consider the spatial and temporal variability in the exposure situations of small streams, the hydraulic and morphological characteristics of catchments need to be described, as well as the spatial distribution of fields treated with pesticides. As small streams are the dominant type of water body in most German orchard regions, we use the growing region Lake Constance as the pilot region. 2. Materials and methods. During field surveys we derived basic morphological parameters for small streams in the Lake Constance region. The mean water width/depth ratio is 13, with a mean depth of 0.12 m; the average residence time is 5.6 s/m (n=87) [1]. Orchards are mostly located in the upper parts of the catchments. Based on an authoritative dataset on rivers and streams of Germany (ATKIS DLM25), we constructed a directed network topology for the Lake Constance region. The gradient of the riverbed is calculated for river stretches of > 500 m length. The network for the pilot region consists of 2000 km of rivers and streams, of which 500 km lie within a distance of 150 m to orchards; within this distance, spray drift exposure with adverse effects is theoretically possible [6]. The network is segmented into approx. 80,000 segments of 25 m length; one segment is the basic element of the exposure assessment. Based on the Manning-Strickler formula and empirically determined relations, two equations were developed to express the width and depth of the streams and the flow velocity [7] (a minimal sketch of the Manning-Strickler step follows the references below). Using Java programming and spatial network analysis within an Oracle 10g/Spatial DBMS, we developed a tool to simulate concentration over time for every single 25 m segment of the stream network. The analysis considers the spatially explicit upstream exposure situations due to the locations of orchards and recovery areas in the catchments. The application on a specific orchard is simulated according to realistic application patterns or under the simplistic assumption that all orchards are sprayed on the same day. 3. Results. The results of the analysis are distributions of time-averaged concentrations (mPEC) for every single stream segment of the network. The averaging time window can be defined flexibly between 1 h (mPEC1h) and 24 h (mPEC24h). Spatial network analysis based on georeferenced hydraulic and morphological parameters proved to be a suitable approach for analysing the exposure situation of streams under more realistic assumptions. The time-varying concentration of single stream segments can be analysed over a vegetation period or a single day.
Stream segments which exceed a trigger concentration, or segments with a specific pulse concentration pattern in given time windows, can be identified and addressed by, e.g., implementing additional drift mitigation measures. References: [1] Golla, B., Krumpe, J., Strassemeyer, J., and Gutsche, V. (2008): Refined exposure assessment of small streams in German orchard regions. Part 1. Results of a hydromorphological survey. Journal für Kulturpflanzen (submitted). [2] Rautmann, D., Streloke, M., and Winkler, R. (1999): New basic drift values in the authorization procedure for plant protection products, pp. 133-141. In: Workshop on risk management and risk mitigation measures in the context of authorization of plant protection products. [3] Klein, A. W., Dechet, F., and Streloke, M. (2003): Probabilistic Assessment Method for Risk Analysis in the Framework of Plant Protection Product Authorisation. Industrieverband Agrar (IVA), Frankfurt/Main. [4] Schulz, R., Stehle, S., Elsaesser, F., Matezki, S., Müller, A., Neumann, M., Ohliger, R., Wogram, J., Zenker, K. (2008): Geodata-based Probabilistic Risk Assessment and Management of Pesticides in Germany, a Conceptual Framework. IEAM_2008-032R. [5] Kubiak, R., Hommen, Bach, M., Classen, G., Fent, H.-G. Frede, A. Gergs, B. Golla, M. Klein, J. Krumpe, S. Matetzki, A. Müller, M. Neumann, T. G. Preuss, H. T. Ratte, M. Roß-Nickoll, S. Reichenberger, C. Schäfers, T. Strauss, A. Toschki, M. Trapp, J. Wogram (2009): A new GIS-based approach for the assessment and management of environmental risks of plant protection. SETAC Europe, Göteborg. [6] Enzian, S., Golla, B. (2006): A method for the identification and classification of "safe distance" cropland with respect to the potential drift exposure of pesticides towards surface waters. UBA-Texte. [7] Bach, M., Träbing, K., and Frede, H.-G. (2004): Morphological characteristics of small rivers in the context of probabilistic exposure assessment. Nachrichtenblatt des Deutschen Pflanzenschutzdienstes 56.
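
    A minimal sketch of the Manning-Strickler relation named above, assuming a rectangular cross-section and an illustrative Strickler coefficient (the study's empirically fitted width/depth/velocity equations are not reproduced here):

    # Mean flow velocity from the Manning-Strickler formula:
    # v = k_st * R^(2/3) * S^(1/2), with hydraulic radius R.
    # Geometry and k_st below are illustrative assumptions.
    def manning_strickler_velocity(width_m, depth_m, slope, k_st=30.0):
        area = width_m * depth_m
        wetted_perimeter = width_m + 2.0 * depth_m
        hydraulic_radius = area / wetted_perimeter
        return k_st * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    # Example segment near the surveyed means (width/depth ratio ~ 13,
    # mean depth 0.12 m); the slope value is assumed.
    v = manning_strickler_velocity(width_m=1.56, depth_m=0.12, slope=0.005)
    q = v * 1.56 * 0.12  # discharge in m^3/s
    print(f"velocity ~ {v:.2f} m/s, discharge ~ {q:.3f} m^3/s")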

  6. Analysis and probabilistic risk assessment of bioaccessible arsenic in polished and husked jasmine rice sold in Bangkok.

    PubMed

    Hensawang, Supanad; Chanpiwat, Penradee

    2018-09-01

    Food is one of the major sources of arsenic (As) exposure in humans. The objectives of this study were to determine the bioaccessible concentration of As in rice grain sold in Bangkok and to evaluate the potential health risks associated with rice consumption. Polished (n = 32) and husked (n = 17) jasmine rice were collected from local markets. In vitro digestion was performed to determine the bioaccessible As concentrations, which were used for probabilistic health risk assessments in different age groups of the population. Approximately 43.0% and 44.4% of the total As in the grain of polished and husked rice, respectively, was in the form of bioaccessible As. Significantly higher bioaccessible As concentrations were found in husked rice than in polished rice (1.5-3.8 times greater). The concentrations of bioaccessible As in polished and husked rice were lower than the Codex standard for As in rice. The average daily dose of As via rice consumption is equivalent to the daily ingestion of 2 L of water containing approximately 3.2-7.2 μg L⁻¹ of As. Approximately 0.2%-13.7% and 10.7%-55.3% of the population may experience non-carcinogenic effects from polished and husked rice consumption, respectively. Approximately 1%-11.6% of children and 74.1%-99.8% of adults were at risk of cancer. The maximum cancer probabilities were approximately 3 per 10,000 for children and 6 per 10,000 for adults. The probabilistic risk results indicated that children and adults were at risk of both non-carcinogenic and carcinogenic effects from consumption of both types of rice.
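
    For illustration, a hedged sketch of the Monte Carlo non-carcinogenic step such a study performs: sample concentration, intake and body weight, form the average daily dose (ADD), and express it as a hazard quotient against a reference dose. All distribution parameters below are assumptions, not the study's fitted values; RfD = 0.0003 mg/kg-day is the U.S. EPA IRIS oral reference dose for inorganic arsenic.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    conc = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)    # mg As/kg rice (assumed)
    intake = rng.normal(loc=0.25, scale=0.06, size=n).clip(0.05)  # kg rice/day (assumed)
    bw = rng.normal(loc=60.0, scale=10.0, size=n).clip(20.0)      # kg body weight (assumed)

    add = conc * intake / bw   # average daily dose, mg/kg-day
    hq = add / 3e-4            # hazard quotient against the RfD

    print(f"P(HQ > 1) ~ {np.mean(hq > 1):.1%}")
    print(f"95th percentile ADD ~ {np.percentile(add, 95):.2e} mg/kg-day")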

  7. Using probabilistic modeling to evaluate human exposure to organotin in drinking water transported by polyvinyl chloride pipe.

    PubMed

    Fristachi, Anthony; Xu, Ying; Rice, Glenn; Impellitteri, Christopher A; Carlson-Lynch, Heather; Little, John C

    2009-11-01

    The leaching of organotin (OT) heat stabilizers from polyvinyl chloride (PVC) pipes used in residential drinking water systems may affect the quality of drinking water. These OTs, principally mono- and di-substituted species of butyltins and methyltins, are a potential health concern because they belong to a broad class of compounds that may be immune, nervous, and reproductive system toxicants. In this article, we develop probability distributions of U.S. population exposures to mixtures of OTs encountered in drinking water transported by PVC pipes. We employed a family of mathematical models to estimate OT leaching rates from PVC pipe as a function of both surface area and time. We then integrated the distribution of estimated leaching rates into an exposure model that estimated the probability distribution of OT concentrations in tap waters and the resulting potential human OT exposures via tap water consumption. Our study results suggest that human OT exposures through tap water consumption are likely to be considerably lower than the World Health Organization (WHO) "safe" long-term concentration in drinking water (150 µg/L) for dibutyltin (DBT), the most toxic of the OTs considered in this article. The 90th percentile average daily dose (ADD) estimate of 0.034 ± 2.92 × 10⁻⁴ µg/kg-day is approximately 120 times lower than the WHO-based ADD for DBT (4.2 µg/kg-day).

  8. Workplace secondhand smoke exposure: a lingering hazard for young adults in California.

    PubMed

    Holmes, Louisa M; Ling, Pamela M

    2017-03-01

    To examine occupational differences in workplace exposure to secondhand smoke (SHS) among young adults in California. Data are taken from the 2014 Bay Area Young Adult Health Survey, a probabilistic multimode cross-sectional household survey of young adults, aged 18-26, in Alameda and San Francisco Counties. Respondents were asked whether they had been exposed to SHS 'indoors' or 'outdoors' at their workplace in the previous 7 days and also reported their current employment status, industry and occupation. Sociodemographic characteristics and measures of health perception and behaviour were included in the final model. Young adults employed in service (p<0.001), construction and maintenance (p<0.01), and transportation and material moving (p<0.05) sectors were more likely to report workplace SHS exposure while those reporting very good or excellent self-rated health were less likely (p<0.001). Despite California's clean indoor air policy, 33% of young adults in the San Francisco Bay Area still reported workplace SHS exposure in the past week, with those in lower income occupations and working in non-office environments experiencing the greatest exposure. Closing the gaps that exempt certain types of workplaces from the Smoke-Free Workplace Act may be especially beneficial for young adults.

  9. The influence of number line estimation precision and numeracy on risky financial decision making.

    PubMed

    Park, Inkyung; Cho, Soohyun

    2018-01-10

    This study examined whether different aspects of mathematical proficiency influence one's ability to make adaptive financial decisions. "Numeracy" refers to the ability to process numerical and probabilistic information and is commonly reported as an important factor which contributes to financial decision-making ability. The precision of mental number representation (MNR), measured with the number line estimation (NLE) task, has been reported to be another critical factor. This study aimed to examine the contribution of these mathematical proficiencies while controlling for the influence of fluid intelligence, math anxiety and personality factors. In our decision-making task, participants chose between two options offering probabilistic monetary gain or loss. Sensitivity to expected value (EV) was measured as an index of the ability to discriminate between optimal and suboptimal options. Partial correlation and hierarchical regression analyses revealed that NLE precision explained EV sensitivity better than numeracy did, after controlling for all covariates. These results suggest that individuals with more precise MNR are capable of making more rational financial decisions. We also propose that the measurement of "numeracy," which is commonly used interchangeably with general mathematical proficiency, should include more diverse aspects of mathematical cognition, including a basic understanding of number magnitude.

  10. Exploring Global Exposure Factors Resources for Use in Consumer Exposure Assessments.

    PubMed

    Zaleski, Rosemary T; Egeghy, Peter P; Hakkinen, Pertti J

    2016-07-22

    This publication serves as a global comprehensive resource for readers seeking exposure factor data and information relevant to consumer exposure assessment. It describes the types of information that may be found in various official surveys and online and published resources. The relevant exposure factors cover a broad range, including general exposure factor data found in published compendia and databases and resources about specific exposure factors, such as human activity patterns and housing information. Also included are resources on exposure factors related to specific types of consumer products and the associated patterns of use, such as for a type of personal care product or a type of children's toy. Further, a section on using exposure factors for designing representative exposure scenarios is included, along with a look into the future for databases and other exposure science developments relevant for consumer exposure assessment.

  11. Exploring Global Exposure Factors Resources for Use in Consumer Exposure Assessments

    PubMed Central

    Zaleski, Rosemary T.; Egeghy, Peter P.; Hakkinen, Pertti J.

    2016-01-01

    This publication serves as a global comprehensive resource for readers seeking exposure factor data and information relevant to consumer exposure assessment. It describes the types of information that may be found in various official surveys and online and published resources. The relevant exposure factors cover a broad range, including general exposure factor data found in published compendia and databases and resources about specific exposure factors, such as human activity patterns and housing information. Also included are resources on exposure factors related to specific types of consumer products and the associated patterns of use, such as for a type of personal care product or a type of children’s toy. Further, a section on using exposure factors for designing representative exposure scenarios is included, along with a look into the future for databases and other exposure science developments relevant for consumer exposure assessment. PMID:27455300

  12. Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System

    DTIC Science & Technology

    2014-06-01

    Reasoning in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The work presents a reference architecture for probabilistic ontologies in a terrorist identification decision support system. Keywords: probabilistic ontology, terrorism, inferential reasoning, architecture.

  13. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at a 0.999 probability of structural failure and substantial damage tolerance at a 0.01 probability.

  14. Exposure Factors Handbook Chapter 19

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  15. Exposure Factors Handbook Chapter 4

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  16. Exposure Factors Handbook Chapter 6

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  17. Exposure Factors Handbook Chapter 2

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  18. Exposure Factors Handbook Chapter 9

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  19. Exposure Factors Handbook Chapter 11

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  20. Exposure Factors Handbook Chapter 14

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  1. Exposure Factors Handbook Chapter 1

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  2. Exposure Factors Handbook Chapter 15

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  3. Exposure Factors Handbook Chapter 12

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  4. Exposure Factors Handbook Chapter 18

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  5. Exposure Factors Handbook Chapter 16

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  6. Exposure Factors Handbook Chapter 7

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  7. Exposure Factors Handbook Chapter 5

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  8. Exposure Factors Handbook Chapter 3

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  9. Exposure Factors Handbook Chapter 8

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  10. Exposure Factors Handbook Chapter 10

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  11. Exposure Factors Handbook Chapter 13

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  12. Exposure Factors Handbook Chapter 17

    EPA Pesticide Factsheets

    Exposure Factors Handbook: 2011 Edition. The Exposure Factors Handbook provides information on various physiological and behavioral factors commonly used in assessing exposure to environmental chemicals.

  13. Chronic Exposure to Methamphetamine Disrupts Reinforcement-Based Decision Making in Rats.

    PubMed

    Groman, Stephanie M; Rich, Katherine M; Smith, Nathaniel J; Lee, Daeyeol; Taylor, Jane R

    2018-03-01

    The persistent use of psychostimulant drugs, despite the detrimental outcomes associated with continued drug use, may be due to disruptions in the reinforcement-learning processes that enable behavior to remain flexible and goal-directed in dynamic environments. To identify the reinforcement-learning processes that are affected by chronic exposure to the psychostimulant methamphetamine (MA), the current study used computational and biochemical analyses to characterize decision-making processes, assessed by probabilistic reversal learning, in rats before and after they were exposed to an escalating-dose regimen of MA (or saline control). The ability of rats to use flexible and adaptive decision-making strategies after changes in stimulus-reward contingencies was significantly impaired following exposure to MA. Computational analyses of parameters that track choice and outcome behavior indicated that exposure to MA significantly impaired the ability of rats to use negative outcomes effectively. These MA-induced changes in decision making were similar to those observed in rats following administration of a dopamine D2/3 receptor antagonist. These computational analyses provide insight into drug-induced maladaptive decision making and may ultimately identify novel targets for the treatment of psychostimulant addiction. We suggest that disrupted utilization of negative outcomes to adaptively guide dynamic decision making is a new behavioral mechanism by which MA rigidly biases choice behavior.
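
    A loose sketch, not the authors' exact model, of how a reinforcement-learning analysis of probabilistic reversal learning can separate sensitivity to positive and negative outcomes: a delta-rule learner with distinct learning rates for positive and negative prediction errors, where lowering the negative-outcome rate mimics the reported deficit in using negative feedback. All task and agent parameters are assumptions.

    import numpy as np

    def simulate(alpha_pos=0.4, alpha_neg=0.4, beta=5.0, trials=400, seed=0):
        rng = np.random.default_rng(seed)
        q = np.zeros(2)                  # action values for two stimuli
        p_reward = np.array([0.8, 0.2])  # reward probabilities, reversed mid-session
        correct = 0
        for t in range(trials):
            if t == trials // 2:
                p_reward = p_reward[::-1]                         # contingency reversal
            p_choose = np.exp(beta * q) / np.exp(beta * q).sum()  # softmax choice
            a = rng.choice(2, p=p_choose)
            r = float(rng.random() < p_reward[a])
            delta = r - q[a]                                      # prediction error
            q[a] += (alpha_pos if delta > 0 else alpha_neg) * delta
            correct += int(p_reward[a] == p_reward.max())
        return correct / trials

    print("intact alpha_neg :", simulate(alpha_neg=0.4))
    print("blunted alpha_neg:", simulate(alpha_neg=0.05))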

  14. Tyramine and histamine risk assessment related to consumption of dry fermented sausages by the Spanish population.

    PubMed

    Latorre-Moratalla, M L; Comas-Basté, O; Bover-Cid, S; Vidal-Carou, M C

    2017-01-01

    Tyramine and histamine are the main dietary bioactive amines related to acute adverse health effects. Dry fermented sausages can easily accumulate high levels of these hazards and are frequently consumed in Spain. The present work aims to assess the exposure to tyramine and histamine from the consumption of dry fermented sausages by the Spanish population and to assess the risk of suffering acute health effects from this exposure. A probabilistic estimate of the exposure to these hazards was derived by combining probability distributions of these amines in dry fermented sausages (n = 474) with their consumption by the Spanish population. The mean dietary exposure to tyramine and histamine was 6.2 and 1.39 mg/meal, respectively. The risk of the healthy population suffering a hypertensive crisis or histamine intoxication due to tyramine or histamine intake, respectively, exclusively from dry fermented sausages, can be considered negligible. For individuals under treatment with MAOI drugs, the probability of surpassing the safe threshold dose (6 mg/meal) was estimated at 34%. For patients with histamine intolerance, even the presence of this amine in food is not tolerable, and it could be estimated that 7,000 individuals per million could be at risk of suffering the related symptoms after consuming dry fermented sausages.
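
    A minimal sketch of the probabilistic exposure step described above: combine an assumed distribution of tyramine concentrations with an assumed distribution of portion sizes, and estimate the probability of exceeding the 6 mg/meal threshold relevant to MAOI-treated patients. The distribution parameters are illustrative, not the study's fitted values.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    tyramine = rng.lognormal(mean=np.log(120.0), sigma=1.0, size=n)  # mg/kg sausage (assumed)
    portion = rng.lognormal(mean=np.log(0.04), sigma=0.5, size=n)    # kg per meal (assumed)

    dose = tyramine * portion  # mg tyramine per meal
    print(f"mean dose ~ {dose.mean():.1f} mg/meal")
    print(f"P(dose > 6 mg/meal) ~ {np.mean(dose > 6.0):.1%}")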

  15. Risk assessment for adult butterflies exposed to the mosquito control pesticide naled.

    PubMed

    Bargar, Timothy A

    2012-04-01

    A prospective risk assessment was conducted for adult butterflies potentially exposed to the mosquito control insecticide naled. Published acute mortality data, exposure data collected during field studies, and morphometric data (total surface area and fresh body weight) for adult butterflies were combined in a probabilistic estimate of the likelihood that adult butterfly exposure to naled following aerial applications would exceed levels associated with acute mortality. Adult butterfly exposure was estimated based on the product of (1) naled residues on samplers and (2) an exposure metric that normalized the total surface area of adult butterflies to their fresh weight. The likelihood that the 10th percentile refined effect estimate for adult butterflies exposed to naled would be exceeded following aerial naled applications was 67 to 80%. The greatest risk would be for butterflies in the family Lycaenidae, and the lowest risk would be for those in the family Hesperiidae, assuming equivalent sensitivity to naled. A range of potential guideline naled deposition levels is presented that, if not exceeded, would reduce the risk of adult butterfly mortality. The results of this risk assessment were compared with other risk estimates for butterflies, and the implications for adult butterflies in areas targeted by aerial naled applications are discussed.

  16. Cancer risk of polycyclic aromatic hydrocarbons (PAHs) in the soils from Jiaozhou Bay wetland.

    PubMed

    Yang, Wei; Lang, Yinhai; Li, Guoliang

    2014-10-01

    To estimate the cancer risk from exposure to PAHs in Jiaozhou Bay wetland soils, a probabilistic health risk assessment was conducted based on Monte Carlo simulations. A sensitivity analysis was performed to determine the input variables that contribute most to the cancer risk estimate. Three age groups were selected to estimate the cancer risk via four exposure pathways (soil ingestion, food ingestion, dermal contact and inhalation). The results revealed that the 95th percentile cancer risks for children, teens and adults were 9.11×10⁻⁶, 1.04×10⁻⁵ and 7.08×10⁻⁵, respectively. The cancer risks for the three age groups were within the acceptable range (10⁻⁶-10⁻⁴), indicating no cancer risk of concern. Among the exposure pathways, food ingestion was the major contributor. Among the 7 carcinogenic PAHs, the cancer risk caused by BaP was the highest. Sensitivity analysis demonstrated that exposure duration (ED) and the sum of the 7 carcinogenic PAH concentrations in soil converted to BaP equivalents (CSsoil) contribute most to the total uncertainty. This study provides a comprehensive risk assessment of carcinogenic PAHs in Jiaozhou Bay wetland soils and might be useful in informing strategies for cancer risk prevention and control.
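
    A hedged sketch of the Monte Carlo plus sensitivity-analysis pattern used in assessments like this one: sample exposure inputs, compute an incremental lifetime cancer risk (ILCR), and rank inputs by their Spearman rank correlation with the output. The distributions are assumptions; the slope factor of 7.3 (mg/kg-day)⁻¹ is the commonly cited oral value for BaP.

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n = 50_000

    inputs = {
        "CSsoil_mg_kg": rng.lognormal(np.log(0.5), 0.6, n),  # BaPeq in soil (assumed)
        "IR_mg_day": rng.lognormal(np.log(100.0), 0.4, n),   # soil ingestion rate (assumed)
        "ED_years": rng.uniform(6.0, 30.0, n),               # exposure duration (assumed)
        "BW_kg": rng.normal(60.0, 10.0, n).clip(20.0),       # body weight (assumed)
    }
    SF, AT_days = 7.3, 70 * 365  # oral slope factor for BaP, averaging time

    ilcr = (inputs["CSsoil_mg_kg"] * inputs["IR_mg_day"] * 1e-6
            * inputs["ED_years"] * 365 * SF) / (inputs["BW_kg"] * AT_days)

    print(f"95th percentile ILCR ~ {np.percentile(ilcr, 95):.2e}")
    for name, x in inputs.items():
        rho, _ = spearmanr(x, ilcr)  # rank correlation as a sensitivity index
        print(f"{name:14s} rho = {rho:+.2f}")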

  17. Development of a new modelling tool (FACET) to assess exposure to chemical migrants from food packaging.

    PubMed

    Oldring, P K T; O'Mahony, C; Dixon, J; Vints, M; Mehegan, J; Dequatre, C; Castle, L

    2014-01-01

    The approach used to obtain European Union-wide data on the usage and concentration of substances in different food packaging materials is described. Statistics were collected on pack sizes and market shares for the different materials used to package different food groups. The packaging materials covered were plastics (both flexible and rigid), metal containers, light metal packaging, and paper and board, as well as the adhesives and inks used on them. We explain how these data are linked in the FACET exposure modelling tool and give an overview of the software, with examples of the intermediate data tables. The example of bisphenol A (BPA), used in resins that may be incorporated into some coatings for canned foodstuffs, illustrates how the data in FACET are combined to produce concentration distributions. Such concentration distributions are then linked probabilistically to the amounts of each food item consumed, as recorded in national food consumption survey diaries, in order to estimate exposure to packaging migrants. Estimates of exposure are at the level of the individual consumer and thus can be expressed for various percentiles of the different populations and subpopulations covered by the national dietary surveys.

  18. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary W.

    2014-01-01

    The International Space Station (ISS) end-of-life (EOL) is currently scheduled for 2020, although there are ongoing efforts to extend the ISS life cycle through 2028. At EOL the ISS will have to be deorbited. It will be the largest manmade object ever deorbited, so safely deorbiting the station will be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered, such as target corridors, orbits, altitude, drag, and maneuvering capabilities. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the deorbit maneuvering.

  19. A Probabilistic Approach to Interior Regularity of Fully Nonlinear Degenerate Elliptic Equations in Smooth Domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou Wei, E-mail: zhoux123@umn.edu

    2013-06-15

    We consider the value function of a stochastic optimal control of degenerate diffusion processes in a domain D. We study the smoothness of the value function under the assumption of non-degeneracy of the diffusion term along the normal to the boundary and an interior condition weaker than non-degeneracy of the diffusion term. When the diffusion term, drift term, discount factor, running payoff and terminal payoff are all in the class $C^{1,1}(\bar{D})$, the value function turns out to be the unique solution in the class $C^{1,1}_{\mathrm{loc}}(D) \cap C^{0,1}(\bar{D})$ to the associated degenerate Bellman equation with Dirichlet boundary data. Our approach is probabilistic.

  20. Effects of additional data on Bayesian clustering.

    PubMed

    Yamazaki, Keisuke

    2017-10-01

    Hierarchical probabilistic models, such as mixture models, are used for cluster analysis. These models have two types of variables: observable and latent. In cluster analysis, the latent variable is estimated, and it is expected that additional information will improve the accuracy of the estimation of the latent variable. Many proposed learning methods are able to use additional data; these include semi-supervised learning and transfer learning. However, from a statistical point of view, a complex probabilistic model that encompasses both the initial and additional data might be less accurate due to having a higher-dimensional parameter. This paper presents a theoretical analysis of the accuracy of such a model and clarifies which factor has the greatest effect on its accuracy, the advantages of obtaining additional data, and the disadvantages of increasing the complexity.

  1. FRAT-up, a Web-based fall-risk assessment tool for elderly people living in the community.

    PubMed

    Cattelani, Luca; Palumbo, Pierpaolo; Palmerini, Luca; Bandinelli, Stefania; Becker, Clemens; Chesani, Federico; Chiari, Lorenzo

    2015-02-18

    About 30% of people over 65 are subject to at least one unintentional fall a year. Fall prevention protocols and interventions can decrease the number of falls. To be effective, a prevention strategy requires a prior step to evaluate the fall risk of the subjects. Despite extensive research, existing assessment tools for fall risk have been insufficient for predicting falls. The goal of this study is to present a novel web-based fall-risk assessment tool (FRAT-up) and to evaluate its accuracy in predicting falls among community-dwelling persons aged 65 and over. FRAT-up is based on the assumption that a subject's fall risk is given by the contribution of their exposure to each of the known fall-risk factors. Many scientific studies have investigated the relationship between falls and risk factors. The majority of these studies adopted statistical approaches, usually providing quantitative information such as odds ratios. FRAT-up exploits these numerical results to compute how each single factor contributes to the overall fall risk. FRAT-up is based on a formal ontology that enlists a number of known risk factors, together with quantitative findings in terms of odds ratios. From such information, an automatic algorithm generates a rule-based probabilistic logic program, that is, a set of rules for each risk factor. The rule-based program takes the health profile of the subject (in terms of exposure to the risk factors) and computes the fall risk. A web-based interface allows users to input health profiles and to visualize the risk assessment for the given subject. FRAT-up has been evaluated on the InCHIANTI Study dataset, a representative population-based study of older persons living in the Chianti area (Tuscany, Italy). We compared reported falls with predicted ones and computed performance indicators. The area under the receiver operating characteristic curve was 0.642 (95% CI 0.614-0.669), while the Brier score was 0.174. The Hosmer-Lemeshow test indicated statistically significant miscalibration. FRAT-up is a web-based tool for evaluating the fall risk of people aged 65 or over living in the community. Validation results show that its performance is comparable to externally validated state-of-the-art tools. A prototype is freely available through a web-based interface. ClinicalTrials.gov NCT01331512 (The InChianti Follow-Up Study); http://clinicaltrials.gov/show/NCT01331512 (Archived by WebCite at http://www.webcitation.org/6UDrrRuaR).
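
    A loose sketch of the general idea of aggregating published odds ratios into an individual risk estimate: start from a baseline fall probability and scale the odds by the odds ratio (OR) of each risk factor the subject is exposed to. This textbook odds-multiplication heuristic is not FRAT-up's actual rule-based probabilistic logic program; the baseline probability, factor names and ORs are illustrative.

    def fall_risk(baseline_p, exposures, odds_ratios):
        """baseline_p: fall probability with no risk factors present;
        exposures: dict factor -> bool; odds_ratios: dict factor -> OR."""
        odds = baseline_p / (1.0 - baseline_p)
        for factor, exposed in exposures.items():
            if exposed:
                odds *= odds_ratios[factor]  # scale odds per exposed factor
        return odds / (1.0 + odds)

    odds_ratios = {"previous_falls": 2.8, "gait_problems": 2.1, "sedatives": 1.5}
    subject = {"previous_falls": True, "gait_problems": False, "sedatives": True}
    print(f"estimated 1-year fall risk ~ {fall_risk(0.15, subject, odds_ratios):.1%}")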

  2. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    NASA Astrophysics Data System (ADS)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  3. Exposure Factors Resources: Contrasting EPA’s Exposure Factors Handbook with International Sources (Journal Article)

    EPA Science Inventory

    Efforts to compile and standardize human exposure factors have resulted in the development of a variety of resources available to the scientific community. For example, the U.S. EPA developed the Exposure Factors Handbook and Child-specific Exposure Factors Handbook to promote c...

  4. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids, together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches, abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM), to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  5. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids, together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches, abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM), to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  6. Exploring Global Exposure Factors Resources for Use in ...

    EPA Pesticide Factsheets

    This publication serves as a global comprehensive resource for readers seeking exposure factor data and information relevant to consumer exposure assessment. It describes the types of information that may be found in various official surveys and online and published resources. The relevant exposure factors cover a broad range, including general exposure factor data found in published compendia and databases and resources about specific exposure factors, such as human activity patterns and housing information. Also included are resources on exposure factors related to specific types of consumer products and the associated patterns of use, such as for a type of personal care product or a type of children’s toy. Further, a section on using exposure factors for designing representative exposure scenarios is included, along with a look into the future for databases and other exposure science developments relevant for consumer exposure assessment. Review article in the International Journal of Environmental Research and Public Health

  7. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  8. NASA Glenn Research Center Overview

    NASA Technical Reports Server (NTRS)

    Sehra, Arun K.

    2002-01-01

    This viewgraph presentation provides information on the NASA Glenn Research Center. The presentation is a broad overview, including the chain of command at the center, its aeronautics facilities, and the factors which shape aerospace product line integration at the center. Special attention is given to the future development of high fidelity probabilistic methods, and NPSS (Numerical Propulsion System Simulation).

  9. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  10. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  11. Moving Aerospace Structural Design Practice to a Load and Resistance Factor Approach

    NASA Technical Reports Server (NTRS)

    Larsen, Curtis E.; Raju, Ivatury S.

    2016-01-01

    Aerospace structures are traditionally designed using the factor of safety (FOS) approach: the limit load on the structure is determined, and the structure is then designed for FOS times the limit load, the ultimate load. Probabilistic approaches utilize distributions for loads and strengths; failures are predicted to occur in the region where the two distributions intersect. The load and resistance factor design (LRFD) approach judiciously combines these two approaches through intensive calibration studies of loads and strengths, resulting in structures that are both efficient and reliable. This paper discusses these three approaches.
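
    The probabilistic view can be made concrete with a small sketch: for independent normal load L and resistance R, the failure probability is P(L > R) = Φ(−β), with reliability index β = (μ_R − μ_L) / √(σ_R² + σ_L²). The numbers below are illustrative assumptions.

    from math import sqrt
    from statistics import NormalDist

    mu_L, sigma_L = 100.0, 15.0  # load (e.g., kN), assumed
    mu_R, sigma_R = 180.0, 20.0  # resistance (e.g., kN), assumed

    beta = (mu_R - mu_L) / sqrt(sigma_R**2 + sigma_L**2)  # reliability index
    pf = NormalDist().cdf(-beta)                          # failure probability

    print(f"reliability index beta ~ {beta:.2f}")
    print(f"failure probability   ~ {pf:.2e}")
    print(f"deterministic FOS     ~ {mu_R / mu_L:.2f}")   # for comparison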

  12. Children's Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making.

    PubMed

    Zartarian, Valerie; Xue, Jianping; Tornero-Velez, Rogelio; Brown, James

    2017-09-12

    Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)'s National Drinking Water Advisory Council (NDWAC) recommended establishment of a "health-based, household action level" for lead in drinking water based on children's exposure. The primary objective was to develop a coupled exposure-dose modeling approach that can be used to determine what drinking water lead concentrations keep children's blood lead levels (BLLs) below specified values, considering exposures from water, soil, dust, food, and air. Related objectives were to evaluate the coupled model estimates using real-world blood lead data, to quantify relative contributions by the various media, and to identify key model inputs. A modeling approach using the EPA's Stochastic Human Exposure and Dose Simulation (SHEDS)-Multimedia and Integrated Exposure Uptake and Biokinetic (IEUBK) models was developed using available data. This analysis for the U.S. population of young children probabilistically simulated multimedia exposures and estimated relative contributions of media to BLLs across all population percentiles for several age groups. Modeled BLLs compared well with nationally representative BLLs (0-23% relative error). Analyses revealed relative importance of soil and dust ingestion exposure pathways and associated Pb intake rates; water ingestion was also a main pathway, especially for infants. This methodology advances scientific understanding of the relationship between lead concentrations in drinking water and BLLs in children. It can guide national health-based benchmarks for lead and related community public health decisions. https://doi.org/10.1289/EHP1605.

  13. Assessing the public health benefits of reduced ozone concentrations.

    PubMed Central

    Levy, J I; Carrothers, T J; Tuomisto, J T; Hammitt, J K; Evans, J S

    2001-01-01

    In this paper we examine scientific evidence and related uncertainties in two steps of benefit-cost analyses of ozone reduction: estimating the health improvements attributable to reductions in ozone and determining the appropriate monetary values of these improvements. Although substantial evidence exists on molecular and physiologic impacts, the evidence needed to establish concentration-response functions is somewhat limited. Furthermore, because exposure to ozone depends on factors such as air conditioning use, past epidemiologic studies may not be directly applicable in unstudied settings. To evaluate the evidence likely to contribute significantly to benefits, we focus on four health outcomes: premature mortality, chronic asthma, respiratory hospital admissions, and minor restricted activity days. We determine concentration-response functions for these health outcomes for a hypothetical case study in Houston, Texas, using probabilistic weighting reflecting our judgment of the strength of the evidence and the possibility of confounding. We make a similar presentation for valuation, where uncertainty is due primarily to the lack of willingness-to-pay data for the population affected by ozone. We estimate that the annual monetary value of health benefits from reducing ozone concentrations in Houston is approximately $10 per person per microgram per cubic meter (24-hr average) reduced (95% confidence interval, $0.70-$40). The central estimate exceeds past estimates by approximately a factor of five, driven by the inclusion of mortality. We discuss the implications of our findings for future analyses and determine areas of research that might help reduce the uncertainties in benefit estimation. PMID:11748028

  14. Learning Sparse Feature Representations using Probabilistic Quadtrees and Deep Belief Nets

    DTIC Science & Technology

    2015-04-24

    Learning sparse feature representations is a useful instrument for solving an array of machine learning tasks. This report presents a novel framework for the classification of handwritten digits that learns sparse representations using probabilistic quadtrees and Deep Belief Nets.

  15. Hierarchical Probabilistic Inference of the Color-Magnitude Diagram and Shrinkage of Stellar Distance Uncertainties

    NASA Astrophysics Data System (ADS)

    Leistedt, Boris; Hogg, David W.

    2017-12-01

    We present a hierarchical probabilistic model for improving geometric stellar distance estimates using color-magnitude information. This is achieved with a data-driven model of the color-magnitude diagram, not relying on stellar models but instead on the relative abundances of stars in color-magnitude cells, which are inferred from very noisy magnitudes and parallaxes. While the resulting noise-deconvolved color-magnitude diagram can be useful for a range of applications, we focus on deriving improved stellar distance estimates relying on both parallax and photometric information. We demonstrate the efficiency of this approach on the 1.4 million stars of the Gaia TGAS sample that also have AAVSO Photometric All Sky Survey magnitudes. Our hierarchical model has 4 million parameters in total, most of which are marginalized out numerically or analytically. We find that distance estimates are significantly improved for the noisiest parallaxes and densest regions of the color-magnitude diagram. In particular, the average distance signal-to-noise ratio (S/N) and uncertainty improve by 19% and 36%, respectively, with 8% of the objects improving in S/N by a factor greater than 2. This computationally efficient approach fully accounts for both parallax and photometric noise and is a first step toward a full hierarchical probabilistic model of the Gaia data.

  16. A Probabilistic Analysis of Surface Water Flood Risk in London.

    PubMed

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2018-06-01

    Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively.
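
    A hedged sketch of how an expected annual damage (EAD) figure like those above is typically obtained: integrate the damage-versus-annual-exceedance-probability curve (damage estimated at several return periods) with the trapezoidal rule. The return periods and damages below are illustrative assumptions, not the study's values.

    import numpy as np

    return_periods = np.array([2, 5, 10, 30, 100, 1000])    # years (assumed)
    damages = np.array([5, 40, 120, 400, 900, 2000]) * 1e6  # GBP per event (assumed)

    aep = 1.0 / return_periods                  # annual exceedance probabilities
    order = np.argsort(aep)                     # integrate over increasing AEP
    ead = np.trapz(damages[order], aep[order])  # area under the loss-probability curve

    print(f"expected annual damage ~ GBP {ead / 1e6:.0f} million/year")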

  17. A Hybrid Probabilistic Model for Unified Collaborative and Content-Based Image Tagging.

    PubMed

    Zhou, Ning; Cheung, William K; Qiu, Guoping; Xue, Xiangyang

    2011-07-01

    The increasing availability of large quantities of user-contributed images with labels has provided opportunities to develop automatic tools to tag images to facilitate image search and retrieval. In this paper, we present a novel hybrid probabilistic model (HPM) which integrates low-level image features and high-level user-provided tags to automatically tag images. For images without any tags, HPM predicts new tags based solely on the low-level image features. For images with user-provided tags, HPM jointly exploits both the image features and the tags in a unified probabilistic framework to recommend additional tags to label the images. The HPM framework makes use of the tag-image association matrix (TIAM). However, since the number of images is usually very large and user-provided tags are diverse, TIAM is very sparse, thus making it difficult to reliably estimate tag-to-tag co-occurrence probabilities. We developed a collaborative filtering method based on nonnegative matrix factorization (NMF) for tackling this data sparsity issue. Also, an L1-norm kernel method is used to estimate the correlations between image features and semantic concepts. The effectiveness of the proposed approach has been evaluated using three databases containing 5,000 images with 371 tags, 31,695 images with 5,587 tags, and 269,648 images with 5,018 tags, respectively.
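
    A loose sketch of the collaborative-filtering idea described above: factor a sparse tag-image association matrix with non-negative matrix factorization (NMF) and read smoothed tag-tag affinities off the low-rank reconstruction. The toy matrix and rank are assumptions; this is not the paper's exact HPM formulation.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    n_tags, n_images, rank = 50, 200, 8

    # Sparse binary tag-image association matrix (TIAM), ~5% filled (toy data).
    tiam = (rng.random((n_tags, n_images)) < 0.05).astype(float)

    model = NMF(n_components=rank, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(tiam)  # tags x rank
    H = model.components_          # rank x images

    smoothed = W @ H               # dense, denoised tag-image affinities
    tag_cooc = smoothed @ smoothed.T  # smoothed tag-tag co-occurrence
    print("top partners of tag 0:", np.argsort(tag_cooc[0])[::-1][1:6])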

  18. Saul: Towards Declarative Learning Based Programming

    PubMed Central

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-01-01

    We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction. PMID:26635465

  19. Saul: Towards Declarative Learning Based Programming.

    PubMed

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-07-01

    We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction.

  20. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is therefore important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty in criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied in a pilot study of Gucheng County, central China, which was heavily affected by a severe flood in 2012. A GIS database of six geomorphological and hydrometeorological factors was created for the evaluation of susceptibility. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. The proposed approach is therefore an improvement over the conventional deterministic method and provides a more rational, objective, and unbiased tool for flood susceptibility evaluation.
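
    A minimal sketch of the core Monte Carlo/OWA combination, assuming Dirichlet-distributed criteria weights and a fixed set of order weights encoding the analyst's risk attitude; the raster values and distributions are illustrative, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy raster: 3 cells x 4 standardized criteria in [0, 1] (illustrative).
    criteria = np.array([[0.9, 0.4, 0.7, 0.2],
                         [0.3, 0.8, 0.5, 0.6],
                         [0.1, 0.2, 0.9, 0.4]])

    # Order weights emphasizing the smallest values: a pessimistic, risk-averse attitude.
    order_weights = np.array([0.1, 0.2, 0.3, 0.4])

    n_draws = 1000
    scores = np.empty((n_draws, criteria.shape[0]))
    for k in range(n_draws):
        w = rng.dirichlet(np.ones(4))                 # uncertain criteria weights
        weighted = criteria * w * 4                   # x n, so a neutral OWA reduces to WLC
        ordered = np.sort(weighted, axis=1)[:, ::-1]  # descending per cell
        scores[k] = ordered @ order_weights           # OWA aggregation

    mean_susceptibility = scores.mean(axis=0)  # ensemble flood-susceptibility score
    std_susceptibility = scores.std(axis=0)    # per-cell uncertainty from weight variation
    ```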

  1. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components of these systems are subjected to stochastic thermomechanical launch loads. Uncertainty or randomness also occurs in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
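
    The basic computation can be caricatured with plain Monte Carlo sampling: draw random material properties and loads, evaluate the response, and count exceedances. A minimal sketch, with illustrative distributions that are not PSAM-specific:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Illustrative distributions (not the PSAM values): lognormal load-driven
    # stress and normally distributed yield strength, both in MPa.
    stress = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=n)
    yield_strength = rng.normal(loc=420.0, scale=30.0, size=n)

    p_fail = np.mean(stress > yield_strength)  # probability of exceeding yield
    print(f"estimated P(failure) = {p_fail:.4f}")
    ```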

  2. Processing of probabilistic information in weight perception and motor prediction.

    PubMed

    Trampenau, Leif; van Eimeren, Thilo; Kuhtz-Buschbeck, Johann

    2017-02-01

    We studied the effects of probabilistic cues, i.e., information of limited certainty, in the context of an action task (GL: grip-lift) and a perceptual task (WP: weight perception). Normal subjects (n = 22) saw four different probabilistic visual cues, each of which announced the likely weight of an object. In the GL task, the object was grasped and lifted with a pinch grip, and the peak force rates indicated that the grip and load forces were scaled predictively according to the probabilistic information. The WP task assessed the expected heaviness associated with each probabilistic cue; the participants gradually adjusted the object's weight until its heaviness matched the expected weight for a given cue. Subjects were randomly assigned to two groups: one started with the GL task and the other with the WP task. The four probabilistic cues influenced weight adjustments in the WP task and peak force rates in the GL task in a similar manner. The interpretation and utilization of the probabilistic information were critically influenced by the initial task. Participants who started with the WP task classified the four probabilistic cues into four distinct categories and applied these categories to the subsequent GL task. In contrast, participants who started with the GL task applied three distinct categories to the four cues and retained this classification in the following WP task. The initial strategy, once established, determined how the probabilistic information was interpreted and implemented.

  3. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    PubMed Central

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A.; Weinberger, Daniel R.

    2010-01-01

    Background: While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods: A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results: Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between the percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions: These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502
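
    The reported poor-learner rates imply the "moderate relative risk" directly; a quick check using only the proportions given in the abstract:

    ```python
    # Poor-learner proportions reported in the abstract.
    p_siblings, p_healthy = 0.341, 0.132
    relative_risk = p_siblings / p_healthy
    print(f"relative risk (siblings vs. healthy) ~= {relative_risk:.2f}")  # ~2.58
    ```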

  4. Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A

    USGS Publications Warehouse

    Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.

    2010-01-01

    The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is the approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments, infilling ancient bedrock topography, that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.
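
    Under the usual Poisson occurrence assumption, the "2 percent probability of exceedance in 50 years" level corresponds to a return period of roughly 2,475 years; a short worked example (the amplification factor and map value below are illustrative):

    ```python
    import math

    p_exceed, t_years = 0.02, 50.0
    return_period = -t_years / math.log(1.0 - p_exceed)  # Poisson occurrence model
    print(f"return period ~= {return_period:.0f} years")  # ~2475 years

    # A site amplification factor of ~2 then scales the 1-Hz spectral acceleration
    # read off the standard-site hazard curve at this return period.
    amplified_sa = 2.0 * 0.15  # e.g. 0.15 g on the standard-site map (illustrative)
    ```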

  5. Probabilistic objective functions for margin-less IMRT planning

    NASA Astrophysics Data System (ADS)

    Bohoslavsky, Román; Witte, Marnix G.; Janssen, Tomas M.; van Herk, Marcel

    2013-06-01

    We present a method to implement probabilistic treatment planning of intensity-modulated radiation therapy using custom software plugins in a commercial treatment planning system. Our method avoids the definition of safety-margins by directly including the effect of geometrical uncertainties during optimization when objective functions are evaluated. Because the shape of the resulting dose distribution implicitly defines the robustness of the plan, the optimizer has much more flexibility than with a margin-based approach. We expect that this added flexibility helps to automatically strike a better balance between target coverage and dose reduction for surrounding healthy tissue, especially for cases where the planning target volume overlaps organs at risk. Prostate cancer treatment planning was chosen to develop our method, including a novel technique to include rotational uncertainties. Based on population statistics, translations and rotations are simulated independently following a marker-based IGRT correction strategy. The effects of random and systematic errors are incorporated by first blurring and then shifting the dose distribution with respect to the clinical target volume. For simplicity and efficiency, dose-shift invariance and a rigid-body approximation are assumed. Three prostate cases were replanned using our probabilistic objective functions. To compare clinical and probabilistic plans, an evaluation tool was used that explicitly incorporates geometric uncertainties using Monte-Carlo methods. The new plans achieved similar or better dose distributions than the original clinical plans in terms of expected target coverage and rectum wall sparing. Plan optimization times were only about a factor of two higher than in the original clinical system. In conclusion, we have developed a practical planning tool that enables margin-less probability-based treatment planning with acceptable planning times, achieving the first system that is feasible for clinical implementation.
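
    The blur-then-shift evaluation described above can be sketched in a few lines: random (execution) errors blur the planned dose, and a systematic displacement is sampled per simulated patient before reading off target coverage. All geometry and error magnitudes below are illustrative assumptions, not the paper's clinical values.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, shift

    rng = np.random.default_rng(7)

    dose = np.zeros((64, 64))
    dose[24:40, 24:40] = 2.0  # toy planned dose distribution (Gy)

    # Random (execution) errors: blur the dose with the random-error SD (in voxels).
    blurred = gaussian_filter(dose, sigma=2.0)

    # Systematic (preparation) errors: shift the blurred dose relative to the CTV,
    # sampling one systematic displacement per simulated patient.
    n_patients = 200
    ctv_min_dose = np.empty(n_patients)
    for i in range(n_patients):
        dx, dy = rng.normal(scale=1.5, size=2)     # systematic SD in voxels
        shifted = shift(blurred, (dy, dx), order=1)
        ctv_min_dose[i] = shifted[28:36, 28:36].min()  # min dose in a toy CTV

    expected_coverage = np.mean(ctv_min_dose >= 1.9)  # P(min CTV dose >= 95% of 2 Gy)
    ```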

  6. The Use of Probabilistic Methods to Evaluate the Systems Impact of Component Design Improvements on Large Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Packard, Michael H.

    2002-01-01

    Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on the fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used to take test data related to particular failure modes and calculate failure-rate distributions of electronic and electromechanical components. How can these individual failure-time distributions of structural, electronic, and electromechanical failure modes be effectively combined into a top-level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools, such as the Quantitative Risk Assessment Software (QRAS). Hypothetical PSA results for a number of structural components, along with mitigation factors that would restrict a failure mode from propagating to a Loss of Mission (LOM) failure, were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to overall Mission Success (MS) are also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, the inclusion of new sensor detection of faults, and other upgrades were evaluated in determining overall turbine engine reliability.
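
    A minimal sketch of the combination step, assuming illustrative Weibull/exponential failure-time models and per-mode mitigation probabilities; none of these values come from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    mission_hours = 5_000.0

    # Illustrative component failure-time models (hours):
    blade_fatigue = rng.weibull(2.0, n) * 20_000.0  # structural, Weibull
    controller    = rng.exponential(80_000.0, n)    # electronic, constant rate
    pump_seal     = rng.weibull(1.5, n) * 30_000.0  # electromechanical

    fails = np.stack([blade_fatigue, controller, pump_seal]) < mission_hours

    # Mitigation: each failure mode propagates to loss of mission (LOM) only with
    # some probability (containment or redundancy catches it otherwise).
    mitigation = np.array([0.3, 0.8, 0.5])[:, None]
    lom = (fails & (rng.random(fails.shape) < mitigation)).any(axis=0)

    print(f"P(LOM) over the mission ~= {lom.mean():.4f}")
    ```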

  7. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.

  8. A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography

    DTIC Science & Technology

    2010-04-01

    A global probabilistic fiber tracking approach, inspired by the voting procedure provided by the Hough transform, is introduced, together with criteria for aligning curves, and particularly tracts, across multiple subjects.

  9. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
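
    As a flavor of why methods such as adaptive importance sampling matter for the rare failure probabilities such programs target, the sketch below estimates a small tail probability by sampling from a density shifted toward the failure region and reweighting; the limit state is a textbook example, not a NESSUS model.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    n = 20_000

    # Limit state g(x) = beta - x for a standard normal variable: failure
    # (g < 0) is a rare event (~3.4e-6), hard to hit by naive sampling.
    beta = 4.5

    # Sample from an importance density centered on the failure boundary.
    x = rng.normal(loc=beta, scale=1.0, size=n)
    weights = stats.norm.pdf(x) / stats.norm.pdf(x, loc=beta)  # likelihood ratio

    p_fail = np.mean((x > beta) * weights)
    print(f"IS estimate = {p_fail:.2e}, exact = {stats.norm.sf(beta):.2e}")
    ```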

  10. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  11. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  12. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    NASA Astrophysics Data System (ADS)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping are fundamental components of hazard assessment and are of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among the methods presented in the literature, indirect quantitative probabilistic methods have been used extensively. In this work, different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used to compute the spatial probability of landslide occurrence, using the pixel as the mapping unit. The methods are based on linear relationships between landslides and nine conditioning factors (altimetry, slope angle, aspect, curvature, distance to streams, wetness index, contributing area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9.5 km2 mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions of Furnas Volcano. These materials, together with the steepness of the slopes (38.9% of the area has slope angles above 35°, reaching a maximum of 87.5°), make the area very prone to landslide activity. A total of 1,495 shallow landslides were mapped (at 1:5,000 scale) and included in a GIS database. The total affected area is 401,744 m2 (4.5% of the study area). Most slope movements are translational slides that frequently evolve into debris flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1.5 m and the volume is usually smaller than 700 m3. For modelling purposes, the landslides were randomly divided into two sub-datasets: a modelling dataset with 748 events (2.2% of the study area) and a validation dataset with 747 events (2.3% of the study area). The susceptibility models obtained with the different probabilistic techniques were rated individually using success-rate and prediction-rate curves. The best performance was obtained with logistic regression, although the results from the different methods do not show significant differences in either success-rate or prediction-rate curves. This indicates that: (1) the modelling landslide dataset is representative of the characteristics of the entire landslide population; and (2) increasing the complexity and robustness of the probabilistic methodology did not produce a significant increase in success or prediction rates. It was therefore concluded that the resolution and quality of the input variables are much more important than the choice of probabilistic model for assessing landslide susceptibility. This work was developed within the VOLCSOILRISK project (Volcanic Soils Geotechnical Characterization for Landslide Risk Mitigation), supported by Direcção Regional da Ciência e Tecnologia - Governo Regional dos Açores.
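
    A minimal sketch of the logistic-regression branch of such an analysis, on synthetic per-pixel data (the factors, labels, and sizes are invented for illustration), including the construction of a success-rate curve:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)

    # Toy per-pixel conditioning factors: e.g. slope angle, curvature, wetness index.
    n_pixels = 5_000
    X = rng.normal(size=(n_pixels, 3))
    # Synthetic "landslide" labels, driven mainly by the first factor (slope).
    y = (X[:, 0] + 0.3 * rng.normal(size=n_pixels) > 1.2).astype(int)

    model = LogisticRegression().fit(X, y)
    susceptibility = model.predict_proba(X)[:, 1]  # spatial probability per pixel

    # Success-rate curve: fraction of landslides captured vs. fraction of the
    # area classified as most susceptible.
    order = np.argsort(-susceptibility)
    captured = np.cumsum(y[order]) / y.sum()
    area_fraction = np.arange(1, n_pixels + 1) / n_pixels
    ```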

  13. Space Radiation and Human Exposures, A Primer.

    PubMed

    Nelson, Gregory A

    2016-04-01

    The space radiation environment is a complex field composed primarily of charged particles spanning energies over many orders of magnitude. The principal sources of these particles are galactic cosmic rays, the Sun and the trapped radiation belts around the Earth. Superimposed on a steady influx of cosmic rays and a steady outward flux of low-energy solar wind are short-term ejections of higher energy particles from the Sun and an 11-year variation of solar luminosity that modulates cosmic ray intensity. Human health risks are estimated from models of the radiation environment for various mission scenarios, the shielding of associated vehicles and the human body itself. Transport models are used to propagate the ambient radiation fields through realistic shielding levels and materials to yield radiation field models inside spacecraft. Then, informed by radiobiological experiments and epidemiology studies, estimates are made for various outcome measures associated with impairments of biological processes, losses of function or mortality. Cancer-associated risks have been formulated in a probabilistic model, while management of non-cancer risks is based on permissible exposure limits. This article focuses on the various components of the space radiation environment and the human exposures that it creates.

  14. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  15. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
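
    A compact sketch of the match-weight and Bayes'-theorem steps described above, in the Fellegi-Sunter style; the m- and u-probabilities and the prior are illustrative assumptions.

    ```python
    import math

    # Agreement weight for one field, from its m- and u-probabilities:
    # m = P(agree | records match), u = P(agree | records do not match).
    def field_weight(agrees: bool, m: float, u: float) -> float:
        return math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))

    # Illustrative values for two fields on a candidate record pair.
    total = field_weight(True, m=0.95, u=0.01) + field_weight(False, m=0.90, u=0.30)

    # Convert the total weight to a posterior match probability with Bayes' theorem,
    # given a prior probability that a randomly chosen pair is a true match.
    prior = 1e-4
    odds = (prior / (1 - prior)) * 2 ** total
    posterior = odds / (1 + odds)
    print(f"total weight = {total:.2f}, posterior P(match) = {posterior:.4f}")
    ```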

  16. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  17. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    PubMed Central

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; Di, Zengru

    2016-01-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries. PMID:27597319
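
    The Boltzmann-factor mechanism can be sketched directly: the probability of migrating from source i to destination j is taken proportional to exp(-c_ij / T). The cost matrix and temperature below are illustrative, not the paper's estimates.

    ```python
    import numpy as np

    # Migration costs c[i, j] from source country i to destination j (illustrative).
    costs = np.array([[0.0, 2.0, 3.5],
                      [2.5, 0.0, 1.0],
                      [4.0, 1.5, 0.0]])

    T = 1.0  # "temperature" controlling how strongly cost deters migration

    # Boltzmann factor: P(j | i) is proportional to exp(-c[i, j] / T),
    # normalized over all destinations for each source.
    weights = np.exp(-costs / T)
    probabilities = weights / weights.sum(axis=1, keepdims=True)
    ```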

  18. Probabilistic reasoning under time pressure: an assessment in Italian, Spanish and English psychology undergraduates

    NASA Astrophysics Data System (ADS)

    Agus, M.; Hitchcott, P. K.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2016-11-01

    Many studies have investigated the features of probabilistic reasoning developed in relation to different formats of problem presentation, showing that it is affected by various individual and contextual factors. Incomplete understanding of the identity and role of these factors may explain the inconsistent evidence concerning the effect of problem presentation format. Thus, superior performance has sometimes been observed for graphically, rather than verbally, presented problems. The present study was undertaken to address this issue. Psychology undergraduates without any statistical expertise (N = 173 in Italy; N = 118 in Spain; N = 55 in England) were administered statistical problems in two formats (verbal-numerical and graphical-pictorial) under a condition of time pressure. Students also completed additional measures indexing several potentially relevant individual dimensions (statistical ability, statistical anxiety, attitudes towards statistics and confidence). Interestingly, a facilitatory effect of graphical presentation was observed in the Italian and Spanish samples but not in the English one. Significantly, the individual dimensions predicting statistical performance also differed between the samples, highlighting a different role of confidence. Hence, these findings confirm previous observations concerning problem presentation format while simultaneously highlighting the importance of individual dimensions.

  19. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    NASA Astrophysics Data System (ADS)

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; di, Zengru

    2016-09-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries.

  20. International Space Station End-of-Life Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Duncan, Gary

    2014-01-01

    Although there are ongoing efforts to extend the International Space Station (ISS) life cycle through 2028, ISS end-of-life (EOL) is currently scheduled for 2020. EOL will require de-orbiting the ISS, the largest manmade object ever to be de-orbited; safely de-orbiting the station will therefore be a very complex problem. This process is being planned by NASA and its international partners. Numerous factors will need to be considered, such as target corridors, orbits, altitude, drag, maneuvering capabilities, and debris mapping. The ISS EOL Probabilistic Risk Assessment (PRA) will play a part in this process by estimating the reliability of the hardware supplying the maneuvering capabilities. The PRA will model the probability of failure of the systems supplying and controlling the thrust needed to aid in the de-orbit maneuvering.
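
    A toy version of the kind of reliability question such a PRA addresses, assuming independent thrusters with an illustrative per-thruster reliability (the numbers are not from the ISS model):

    ```python
    from math import comb

    # Probability that at least k of n independent thrusters remain operable
    # through the de-orbit burn, each with reliability r.
    def at_least_k_of_n(n: int, k: int, r: float) -> float:
        return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

    print(f"P(>= 6 of 8 thrusters) = {at_least_k_of_n(8, 6, 0.97):.5f}")
    ```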
