A Medical Interviewing Curriculum Intervention for Medical Students' Assessment of Suicide Risk
ERIC Educational Resources Information Center
Fiedorowicz, Jess G.; Tate, Jodi; Miller, Anthony C.; Franklin, Ellen M.; Gourley, Ryan; Rosenbaum, Marcy
2013-01-01
Objective: Effective communication strategies are required to assess suicide risk. The authors determined whether a 2-hour simulated-patient activity during a psychiatry clerkship improved self-assessment of medical interviewing skills relevant to suicide risk-assessment. Methods: In the 2-hour simulated-patient intervention, at least one…
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. The main focus of this publication is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, a process failure mode and effects analysis (FMEA) of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
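A minimal sketch of the kind of quantitative reassessment described above: compute the coefficient of variation of simulated coating masses and fold it into an FMEA-style risk priority number. The per-tablet masses, score scales, and thresholds below are illustrative placeholders, not values from the study or from ICH Q9.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical per-tablet coating masses (mg) from a simulated pan-coating run.
coating_mass = rng.normal(loc=10.0, scale=0.6, size=5000)

# Coefficient of variation = coating mass uniformity metric used for reassessment.
cv = coating_mass.std(ddof=1) / coating_mass.mean()

# Illustrative FMEA-style scoring: map the CV onto an occurrence score and combine
# it with assumed severity and detection scores into a risk priority number (RPN).
def occurrence_score(cv_value, thresholds=(0.02, 0.04, 0.06, 0.08)):
    return 1 + sum(cv_value > t for t in thresholds) * 2   # 1, 3, 5, 7, or 9

severity, detection = 7, 4        # assumed scores for a coating-mass failure mode
rpn = severity * occurrence_score(cv) * detection

print(f"CV = {cv:.3f}, RPN = {rpn}")
```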
Risk assessment predictions of open dumping area after closure using Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Pauzi, Nur Irfah Mohd; Radhi, Mohd Shahril Mat; Omar, Husaini
2017-10-01
Currently, there are many abandoned open dumping areas that were left without any proper mitigation measures. These open dumping areas could pose a serious hazard to humans and pollute the environment. The objective of this paper is to determine the risk at an open dumping area after it has been closed, using the Monte Carlo simulation method. The risk assessment exercise is conducted at the Kuala Lumpur dumping area. The rapid urbanisation of Kuala Lumpur, coupled with its growing population, has led to increased waste generation and, in turn, to more dumping/landfill areas in Kuala Lumpur. The first stage of this study involved the assessment of the dumping area and sample collection, followed by measurement of the settlement of the dumped waste using an oedometer. The risk associated with the settlement is then predicted using the Monte Carlo simulation method, which calculates both the risk and the long-term settlement. The simulation results show that the risk level of the Kuala Lumpur open dumping area ranges from Level III to Level IV, i.e., from medium risk to high risk, with predicted settlements (ΔH) of between 3 and 7 meters. Since the risk is between medium and high, mitigation measures are required, such as replacing the top layer of waste soil with new sandy gravel soil. This will increase the strength of the soil and reduce the settlement.
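A minimal Monte Carlo sketch of a settlement-based risk classification of the kind described above. The input distributions, settlement formula, and risk-level thresholds are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000

# Hypothetical uncertain inputs propagated into a settlement estimate.
H0       = rng.triangular(10, 15, 20, size=n)   # waste thickness (m)
cc_1e0   = rng.uniform(0.15, 0.35, size=n)      # Cc / (1 + e0)
log_term = rng.uniform(0.8, 1.4, size=n)        # log10(sigma'_f / sigma'_0)

settlement = H0 * cc_1e0 * log_term             # one-dimensional consolidation estimate (m)

# Illustrative risk levels keyed to predicted settlement (m).
levels = np.digitize(settlement, bins=[1.0, 3.0, 7.0]) + 1   # 1 = low ... 4 = very high

for lvl, name in enumerate(["I", "II", "III", "IV"], start=1):
    print(f"Level {name}: {np.mean(levels == lvl):.1%}")
print(f"Median settlement: {np.median(settlement):.1f} m")
```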
Risk Assessment of Carbon Sequestration into A Naturally Fractured Reservoir at Kevin Dome, Montana
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Minh; Onishi, Tsubasa; Carey, James William
In this report, we describe risk assessment work done using the National Risk Assessment Partnership (NRAP) applied to CO2 storage at Kevin Dome, Montana. Geologic CO2 sequestration in saline aquifers poses certain risks including CO2/brine leakage through wells or non-sealing faults into groundwater or to the land surface. These risks are difficult to quantify due to data availability and uncertainty. One solution is to explore the consequences of these limitations by running large numbers of numerical simulations on the primary CO2 injection reservoir, shallow reservoirs/aquifers, faults, and wells to assess leakage risks and uncertainties. However, a large number of full-physics simulations is usually too computationally expensive. The NRAP integrated assessment model (NRAP-IAM) uses reduced order models (ROMs) developed from full-physics simulations to address this issue. A powerful stochastic framework allows NRAP-IAM to explore complex interactions among many uncertain variables and evaluate the likely performance of potential sequestration sites.
Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott
2015-01-01
We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...
2013-01-01
Background The validity of studies describing clinicians’ judgements based on their responses to paper cases is questionable, because - commonly used - paper case simulations only partly reflect real clinical environments. In this study we test whether paper case simulations evoke similar risk assessment judgements to the more realistic simulated patients used in high fidelity physical simulations. Methods 97 nurses (34 experienced nurses and 63 student nurses) made dichotomous assessments of risk of acute deterioration on the same 25 simulated scenarios in both paper case and physical simulation settings. Scenarios were generated from real patient cases. Measures of judgement ‘ecology’ were derived from the same case records. The relationship between nurses’ judgements, actual patient outcomes (i.e. ecological criteria), and patient characteristics were described using the methodology of judgement analysis. Logistic regression models were constructed to calculate Lens Model Equation parameters. Parameters were then compared between the modeled paper-case and physical-simulation judgements. Results Participants had significantly less achievement (ra) judging physical simulations than when judging paper cases. They used less modelable knowledge (G) with physical simulations than with paper cases, while retaining similar cognitive control and consistency on repeated patients. Respiration rate, the most important cue for predicting patient risk in the ecological model, was weighted most heavily by participants. Conclusions To the extent that accuracy in judgement analysis studies is a function of task representativeness, improving task representativeness via high fidelity physical simulations resulted in lower judgement performance in risk assessments amongst nurses when compared to paper case simulations. Lens Model statistics could prove useful when comparing different options for the design of simulations used in clinical judgement analysis. The approach outlined may be of value to those designing and evaluating clinical simulations as part of education and training strategies aimed at improving clinical judgement and reasoning. PMID:23718556
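A compact sketch of the Lens Model Equation computation implied above, using logistic regression as the policy-capturing model for both the ecological criterion and the judge, and the standard decomposition r_a = G·R_e·R_s + C·√(1−R_e²)·√(1−R_s²). The cue data are simulated placeholders, and correlating fitted probabilities with dichotomous variables is one common adaptation, not necessarily the exact procedure used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Simulated cues (e.g., respiration rate, heart rate, BP), a dichotomous
# ecological criterion (deterioration yes/no), and a nurse's judgements.
cues = rng.normal(size=(n, 3))
outcome   = (cues @ np.array([1.2, 0.4, 0.2]) + rng.normal(0, 1.0, n) > 0).astype(int)
judgement = (cues @ np.array([0.9, 0.6, 0.1]) + rng.normal(0, 1.5, n) > 0).astype(int)

# Ecological model and judge model (policy capturing via logistic regression).
env   = LogisticRegression().fit(cues, outcome)
judge = LogisticRegression().fit(cues, judgement)
Ye_hat = env.predict_proba(cues)[:, 1]     # modelled environment
Ys_hat = judge.predict_proba(cues)[:, 1]   # modelled judge

corr = lambda a, b: np.corrcoef(a, b)[0, 1]
ra = corr(judgement, outcome)              # achievement
Re = corr(outcome, Ye_hat)                 # environmental predictability
Rs = corr(judgement, Ys_hat)               # consistency / cognitive control
G  = corr(Ye_hat, Ys_hat)                  # modelled knowledge
C  = (ra - G * Re * Rs) / np.sqrt((1 - Re**2) * (1 - Rs**2))  # unmodelled knowledge

print(f"ra={ra:.2f}  Re={Re:.2f}  Rs={Rs:.2f}  G={G:.2f}  C={C:.2f}")
```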
Kleinmann, Joachim U; Wang, Magnus
2017-09-01
Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
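An illustrative, much-simplified sketch of the "probabilistic walk" idea: at each step the simulated animal moves to a neighbouring cell with probability proportional to an attractiveness score built from food, cover, and conspecific layers. The grid, weights, and field definition are invented placeholders, not the published parameterization.

```python
import numpy as np

rng = np.random.default_rng(7)
size = 50

# Hypothetical landscape layers: food resources, vegetation cover, and local
# density of conspecifics (higher density = less attractive).
food  = rng.random((size, size))
cover = rng.random((size, size))
consp = rng.random((size, size))
attract = 0.5 * food + 0.4 * cover - 0.3 * consp   # assumed weighting

def step(pos):
    """Move to one of the 8 neighbours, chosen with probability ~ attractiveness."""
    r, c = pos
    nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < size and 0 <= c + dc < size]
    w = np.array([max(attract[n], 1e-6) for n in nbrs])
    return nbrs[rng.choice(len(nbrs), p=w / w.sum())]

pos, track = (25, 25), []
for _ in range(1000):          # one simulated day of movement steps
    pos = step(pos)
    track.append(pos)

# Proportion of time spent in a hypothetical treated field (analogous to a PT value).
in_field = [p for p in track if 10 <= p[0] < 20 and 10 <= p[1] < 20]
print(f"PT (time in field): {len(in_field) / len(track):.2%}")
```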
USDA-ARS?s Scientific Manuscript database
Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...
Falls Risk and Simulated Driving Performance in Older Adults
Gaspar, John G.; Neider, Mark B.; Kramer, Arthur F.
2013-01-01
Declines in executive function and dual-task performance have been related to falls in older adults, and recent research suggests that older adults at risk for falls also show impairments on real-world tasks, such as crossing a street. The present study examined whether falls risk was associated with driving performance in a high-fidelity simulator. Participants were classified as high or low falls risk using the Physiological Profile Assessment and completed a number of challenging simulated driving assessments in which they responded quickly to unexpected events. High falls risk drivers had slower response times (~2.1 seconds) to unexpected events compared to low falls risk drivers (~1.7 seconds). Furthermore, when asked to perform a concurrent cognitive task while driving, high falls risk drivers showed greater costs to secondary task performance than did low falls risk drivers, and low falls risk older adults also outperformed high falls risk older adults on a computer-based measure of dual-task performance. Our results suggest that attentional differences between high and low falls risk older adults extend to simulated driving performance. PMID:23509627
Assessing risk-adjustment approaches under non-random selection.
Luft, Harold S; Dudley, R Adams
2004-01-01
Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.
Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A
2005-06-01
A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations for assessing parameter and risk probability distributions. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect the survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rats, white-throated woodrats, deer, and milfoil, the body burden concentrations observed fall within the distributions simulated at both sites.
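A minimal sketch of how such likelihoods of adverse effect can be derived from Monte Carlo draws: sample an exposure dose and an effect threshold from assumed distributions and report the fraction of iterations in which the dose exceeds the threshold. The distributions below are placeholders, not the YPG/APG data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical distributions for a receptor's daily DU dose and its
# toxicity reference value (both mg/kg-day); parameters are illustrative only.
dose = rng.lognormal(mean=np.log(0.05), sigma=0.9, size=n)
trv  = rng.lognormal(mean=np.log(0.20), sigma=0.5, size=n)

hq = dose / trv                      # hazard quotient per iteration
likelihood = np.mean(hq > 1.0)       # probability that adverse effects occur

print(f"Likelihood of adverse effect: {likelihood:.1%}")
print(f"HQ percentiles (5th, 50th, 95th): {np.percentile(hq, [5, 50, 95]).round(2)}")
```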
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCray, John
Capturing carbon dioxide (CO2) and injecting it into deep underground formations for storage (carbon capture and underground storage, or CCUS) is one way of reducing anthropogenic CO2 emissions. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Understanding the mechanisms and parameters that can contribute to leakage of the CO2 and the ultimate impact on shallow water aquifers that overlie injection formations is an important step in evaluating the efficacy and risks associated with long-term CO2 storage. Three students were supported on the grant Training Graduate and Undergraduate Students in Simulation and Risk Assessment for Carbon Sequestration. These three students each examined a different aspect of simulation and risk assessment related to carbon dioxide sequestration and the potential impacts of CO2 leakage. Two performed numerical simulation studies, one to assess leakage rates as a function of fault and deep reservoir parameters and one to develop a method for quantitative risk assessment in the event of a CO2 leak and subsequent changes in groundwater chemistry. A third student performed an experimental evaluation of the potential for metal release from sandstone aquifers under simulated leakage conditions. This study has resulted in two student first-authored published papers (Siirila, 2012; Kirsch, 2014) and one currently in preparation (Menke, in prep.).
Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra
2014-06-15
The use of process simulators can contribute to quantitative risk assessment (QRA) by reducing expert time and the large volume of data required; this is essential in the case of a plant that has not yet been built. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation with QRA to investigate the acceptability of a new methanol production technology in a given region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and societal risks. Compared to existing tolerance levels, the risks were considered tolerable at nominal operating conditions of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant. Copyright © 2014 Elsevier B.V. All rights reserved.
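A small sketch of how QRA outputs of the kind reported above are typically aggregated: individual risk as the frequency-weighted sum of fatality probabilities over outcome scenarios, and societal risk as an F-N curve. Scenario names, frequencies, and consequences are made-up placeholders, not values from the UNISIM/PHAST study.

```python
# Hypothetical outcome cases: (name, frequency per year, probability of fatality
# at the receptor location, expected number of fatalities if the event occurs).
scenarios = [
    ("small leak + jet fire",   1e-4, 0.01,  0),
    ("large leak + flash fire", 1e-5, 0.20,  2),
    ("vessel rupture + VCE",    1e-6, 0.80, 12),
]

# Location-specific individual risk (per year).
ir = sum(f * p_fat for _, f, p_fat, _ in scenarios)
print(f"Individual risk: {ir:.2e} /yr")

# Societal risk: cumulative frequency F of events causing N or more fatalities.
events = sorted(((n, f) for _, f, _, n in scenarios if n > 0), key=lambda x: x[0])
for n, _ in events:
    F = sum(f for m, f in events if m >= n)
    print(f"N >= {n:2d}: F = {F:.2e} /yr")
```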
Risk-Assessment for Equipment Operating on the Lunar Surface
NASA Technical Reports Server (NTRS)
Richmond, R. C.; Kusiak, A.; Ramachandran, N.
2008-01-01
Particle-size distribution of lunar dust simulant is evaluated using scanning electron microscopy in order to consider approaches to evaluating risk to individual mechanical components operating on the lunar surface. Assessing component risk and risk mitigation during actual operations will require noninvasive continuous data gathering on numerous parameters. Those data sets would best be evaluated using data-mining algorithms to assess risk, and recovery from risk, of individual mechanical components in real time.
NASA Astrophysics Data System (ADS)
Cheng, T.; Xu, Z.; Hong, S.
2017-12-01
Flood disasters have frequently struck the urban area of Jinan City in recent years, and the city faces severe road flooding that greatly threatens pedestrian safety. It is therefore of great significance to investigate pedestrian risk during floods under the specific topographic conditions of the city. In this study, a model coupling hydrological and hydrodynamic processes is developed for the study area to simulate the flood routing process on the roads during the "7.18" rainstorm, and it is validated against post-disaster damage survey information. Pedestrian risk is then estimated with a flood risk assessment model. The results show that the coupled model reproduces the rainstorm flood process well. On the basis of the simulation results, areas with extreme, medium, and mild risk are identified. Regions with high risk are generally located near the mountain-front area with steep slopes. This study will provide scientific support for flood control and disaster reduction in Jinan City.
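A brief sketch of a depth-velocity pedestrian hazard rating of the sort commonly used in such flood risk assessments. The HR = d·(v + 0.5) + DF form and the class thresholds follow a widely cited UK flood-hazard guidance formula and are assumptions here, not necessarily the formulation used in the Jinan study.

```python
def hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    """Pedestrian flood hazard rating HR = d * (v + 0.5) + DF (assumed formula)."""
    return depth_m * (velocity_ms + 0.5) + debris_factor

def risk_class(hr):
    # Illustrative class boundaries (mild / medium / extreme).
    if hr < 0.75:
        return "mild"
    if hr < 2.0:
        return "medium"
    return "extreme"

# Hypothetical road cells (depth in m, velocity in m/s) taken from the output of
# a coupled hydrological-hydrodynamic simulation.
cells = [(0.15, 0.6), (0.45, 1.2), (0.90, 2.0)]
for d, v in cells:
    hr = hazard_rating(d, v)
    print(f"d={d:.2f} m, v={v:.1f} m/s -> HR={hr:.2f} ({risk_class(hr)})")
```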
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutierrez, Marte
Colorado School of Mines conducted research and training in the development and validation of an advanced CO2 GS (Geological Sequestration) probabilistic simulation and risk assessment model. CO2 GS simulation and risk assessment is used to develop advanced numerical simulation models of the subsurface to forecast CO2 behavior and transport; optimize site operational practices; ensure site safety; and refine site monitoring, verification, and accounting efforts. As simulation models are refined with new data, the uncertainty surrounding the identified risks decreases, thereby providing more accurate risk assessment. The models considered the full coupling of multiple physical processes (geomechanical and fluid flow) and describe the effects of stochastic hydro-mechanical (H-M) parameters on the modeling of CO2 flow and transport in fractured porous rocks. Graduate students were involved in the development and validation of the model, which can be used to predict the fate, movement, and storage of CO2 in subsurface formations, and to evaluate the risk of potential leakage to the atmosphere and underground aquifers. The main contributions from the project include the development of: 1) an improved procedure to rigorously couple the simulations of hydro-thermo-mechanical processes involved in CO2 GS; 2) models for the hydro-mechanical behavior of fractured porous rocks with random fracture patterns; and 3) probabilistic methods to account for the effects of stochastic fluid flow and geomechanical properties on flow, transport, storage, and leakage associated with CO2 GS. The research project provided the means to educate and train graduate students in the science and technology of CO2 GS, with a focus on geologic storage. Specifically, the training included the investigation of an advanced CO2 GS simulation and risk assessment model that can be used to predict the fate, movement, and storage of CO2 in underground formations, and the evaluation of the risk of potential CO2 leakage to the atmosphere and underground aquifers.
NASA Astrophysics Data System (ADS)
Rumore, D.; Kirshen, P. H.; Susskind, L.
2014-12-01
Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged 115-170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.
SIMULATING INTEGRATED MULTIMEDIA CHEMICAL FATE AND TRANSPORT FOR NATIONAL RISK ASSESSMENTS
The site-based multimedia, multipathway and multireceptor risk assessment (3MRA) approach is comprised of source, fate and transport, exposure and risk modules. The main interconnected multimedia fate and transport modules are: watershed, air, surface water, vadose zone and sat...
Liebl, Hans; Garcia, Eduardo Grande; Holzner, Fabian; Noel, Peter B.; Burgkart, Rainer; Rummeny, Ernst J.; Baum, Thomas; Bauer, Jan S.
2015-01-01
Purpose: To experimentally validate a non-linear finite element analysis (FEA) modeling approach for assessing in-vitro fracture risk at the proximal femur, and to transfer the method to standard in-vivo multi-detector computed tomography (MDCT) data of the hip, aiming to predict additional hip fracture risk in subjects with and without osteoporosis-associated vertebral fractures, using bone mineral density (BMD) measurements as the gold standard. Methods: One fresh-frozen human femur specimen was mechanically tested and fractured simulating stance and clinically relevant fall loading configurations to the hip. After experimental in-vitro validation, the FEA simulation protocol was transferred to standard contrast-enhanced in-vivo MDCT images to calculate individual hip fracture risk for 4 subjects with and 4 without a history of osteoporotic vertebral fractures, matched by age and gender. In addition, FEA-based risk factor calculations were compared to manual femoral BMD measurements of all subjects. Results: In-vitro simulations showed good correlation with the experimentally measured strains both in stance (R2 = 0.963) and fall configuration (R2 = 0.976). The simulated maximum stress overestimated the experimental failure load (4743 N) by 14.7% (5440 N), while the simulated maximum strain overestimated it by 4.7% (4968 N). The simulated failed elements coincided precisely with the experimentally determined fracture locations. BMD measurements in subjects with a history of osteoporotic vertebral fractures did not differ significantly from subjects without fragility fractures (femoral head: p = 0.989; femoral neck: p = 0.366), but these subjects showed higher FEA-based risk factors for additional incident hip fractures (p = 0.028). Conclusion: FEA simulations were successfully validated by elastic and destructive in-vitro experiments. In the subsequent in-vivo analyses, MDCT-based FEA risk factor differences for additional hip fractures were not mirrored by the corresponding BMD measurements. Our data suggest that MDCT-derived FEA models may assess bone strength more accurately than BMD measurements alone, providing a valuable in-vivo fracture risk assessment tool. PMID:25723187
Simulation Modeling of Resilience Assessment in Indonesian Fertiliser Industry Supply Networks
NASA Astrophysics Data System (ADS)
Utami, I. D.; Holt, R. J.; McKay, A.
2018-01-01
Supply network resilience is a significant aspect of the performance of the Indonesian fertiliser industry. Decision makers use risk assessment and port management reports to evaluate the availability of infrastructure. An opportunity was identified to incorporate both types of data into an approach for the measurement of resilience. A framework, based on a synthesis of literature and interviews with industry practitioners, covering both social and technical factors is introduced. A simulation model was then built to allow managers to explore implications for resilience and predict levels of risk in different scenarios. Results of interviews with respondents from the Indonesian fertiliser industry indicated that the simulation model could be valuable in the assessment. This paper provides details of the simulation model for decision makers to explore levels of risk in supply networks. For practitioners, the model could be used by government to assess the current condition of supply networks in Indonesian industries. For academia, the approach provides a new application of agent-based models in research on supply network resilience and presents a real example of how agent-based modeling could be used to support the assessment approach.
Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel
2016-01-01
Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the "classical" risk assessment approach with the model-based approach. These comparisons showed that TK and TK-TD models can bring more realism to the risk assessment through the possibility to study realistic exposure scenarios and to simulate relevant mechanisms of effects (including delayed toxicity and recovery). Noticeably, using TK-TD models is currently the most relevant way to directly connect realistic exposure patterns to effects. We conclude with recommendations on how to properly use TK and TK-TD model in acute risk assessment for vertebrates. © 2015 SETAC.
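A compact sketch of the kind of TK-TD calculation described above: a one-compartment toxicokinetic model driven by a pulsed exposure profile, coupled to a simple stochastic-death toxicodynamic hazard (GUTS-SD-like). The rate constants, thresholds, and exposure pulses are illustrative placeholders, not calibrated values from the case study.

```python
import numpy as np

# Assumed parameters (illustrative only).
k_in, k_out = 0.8, 0.3            # uptake and elimination rate constants (1/d)
kk, z, hb   = 0.05, 2.0, 0.001    # killing rate, internal threshold, background hazard

dt, t_end = 0.01, 20.0
times = np.arange(0.0, t_end, dt)

# Pulsed external exposure: two 2-day pulses of the active ingredient.
def c_ext(t):
    return 5.0 if (1.0 <= t < 3.0) or (10.0 <= t < 12.0) else 0.0

c_int, hazard = 0.0, 0.0
survival = []
for t in times:
    # Toxicokinetics: dC_int/dt = k_in * C_ext - k_out * C_int (Euler step).
    c_int += dt * (k_in * c_ext(t) - k_out * c_int)
    # Toxicodynamics: hazard accrues while the internal dose exceeds threshold z.
    hazard += dt * (kk * max(0.0, c_int - z) + hb)
    survival.append(np.exp(-hazard))

print(f"Predicted survival after {t_end:.0f} d: {survival[-1]:.2%}")
```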
Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V
2017-03-21
The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario as the service provider's server contains a lot of valuable resources. LoBSs' users are very diverse as they may come from a wide range of locations with vastly different characteristics. The cost of joining could be low and, in many cases, intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs' risk dynamically based on both frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs' risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for assessing the risk of LoBSs for infrastructure of ESNs dynamically involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing.
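A minimal sketch of the historical-simulation value-at-risk step that QRAM adapts: sort observed (or simulated) loss values and read an empirical quantile. The loss series, window, and confidence level below are invented for illustration; the model's fusion of threat frequency and threat degree is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical per-window risk losses derived from logged malicious operations
# (e.g., weighted by threat degree); in QRAM these would come from fused behavior data.
losses = rng.gamma(shape=2.0, scale=1.5, size=500)

def historical_var(loss_sample, confidence=0.95):
    """Empirical VaR: the loss level not exceeded with the given confidence."""
    return np.quantile(loss_sample, confidence)

print(f"95% historical-simulation VaR: {historical_var(losses, 0.95):.2f}")

# A sliding time window (e.g., the 10 s window mentioned above) would simply
# restrict 'losses' to the most recent observations before calling the function.
recent = losses[-100:]
print(f"VaR over the most recent window: {historical_var(recent):.2f}")
```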
Rechmann, Peter; Featherstone, John D B
2014-09-01
The goal of this quality assurance study was to explore the decision making of clinical faculty members at the University of California, San Francisco School of Dentistry predoctoral dental clinic in terms of caries risk level assignment using the caries risk assessment (CRA) as part of the Caries Management by Risk Assessment (CAMBRA) concept. This research was done in part to determine whether additional training and calibration were needed for these faculty members. The study tested the reliability and reproducibility of the caries risk levels assigned by different clinical teachers who completed CRA forms for simulated patients. In the first step, five clinical teachers assigned caries risk levels for thirteen simulated patients. Six months later, the same five plus an additional nine faculty members assigned caries risk levels to the same thirteen simulated cases and nine additional cases. While the intra-examiner reliability, expressed as weighted kappa strength of agreement, was very high, the inter-examiner agreements with a gold standard were on average only moderate. In total, 20 percent of the presented high caries risk cases were underestimated, i.e., assigned risk levels that were too low, even when obvious caries disease indicators were present. This study suggests that more consistent training and calibration of clinical faculty members as well as students are needed.
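For readers wanting to reproduce agreement statistics of the kind reported above, a short sketch of a linearly weighted kappa computation on ordinal risk levels; the rating vectors are invented examples, not the study data.

```python
from sklearn.metrics import cohen_kappa_score

# Ordinal caries risk levels: 0 = low, 1 = moderate, 2 = high, 3 = extreme.
gold_standard = [3, 2, 0, 1, 3, 2, 1, 0, 3, 2, 1, 2, 3]
faculty_rater = [2, 2, 0, 1, 3, 1, 1, 0, 2, 2, 1, 2, 3]

# Linearly weighted kappa penalises larger disagreements more heavily,
# which suits ordered risk categories.
kappa = cohen_kappa_score(gold_standard, faculty_rater, weights="linear")
print(f"Weighted kappa vs. gold standard: {kappa:.2f}")
```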
NASA Astrophysics Data System (ADS)
Mishra, H.; Karmakar, S.; Kumar, R.
2016-12-01
Risk assessment does not remain simple when it involves multiple uncertain variables. Uncertainties in risk assessment mainly result from (1) the lack of knowledge of input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach is proposed for the simultaneous treatment of the random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) over various time horizons. The LandSim-simulated concentrations of six heavy metals in groundwater were then used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability, and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and the uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn, and Fe for the male and female populations has been quantified and found to be high (HI > 1) for all the considered time horizons, which indicates a possibility of adverse health effects for the population residing near the Turbhe landfill.
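A simplified sketch of the hybrid treatment described: probabilistic variables are sampled by Monte Carlo while fuzzy variables are represented by an alpha-cut interval, yielding an interval-valued hazard quotient per iteration. All parameter values are placeholders, not the Turbhe landfill inputs, and only a single metal is shown.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
alpha = 0.8   # alpha-cut level applied to all fuzzy (triangular) variables

def alpha_cut(low, mode, high, a):
    """Interval of a triangular fuzzy number at membership level a."""
    return low + a * (mode - low), high - a * (high - mode)

# Fuzzy exposure factors (triangular, illustrative): intake (L/d), frequency (d/yr),
# duration (yr), bioavailability (-); averaging time fixed for non-carcinogens.
IR_lo, IR_hi = alpha_cut(1.5, 2.0, 2.5, alpha)
EF_lo, EF_hi = alpha_cut(300, 350, 365, alpha)
ED_lo, ED_hi = alpha_cut(20, 30, 40, alpha)
BA_lo, BA_hi = alpha_cut(0.6, 0.8, 1.0, alpha)
AT = 30 * 365.0

# Probabilistic variables: metal concentration in groundwater (mg/L) and body weight (kg).
conc = rng.lognormal(np.log(0.5), 0.6, n)
bw   = rng.normal(60, 8, n)
RfD  = 0.02   # assumed oral reference dose (mg/kg-d)

hq_lo = conc * IR_lo * EF_lo * ED_lo * BA_lo / (bw * AT * RfD)
hq_hi = conc * IR_hi * EF_hi * ED_hi * BA_hi / (bw * AT * RfD)

print(f"P(HQ upper bound > 1) = {np.mean(hq_hi > 1):.1%}")
print(f"P(HQ lower bound > 1) = {np.mean(hq_lo > 1):.1%}")
```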
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
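A small sketch of the Bayesian updating ingredient the framework relies on, using a conjugate Beta-Binomial update of a component failure probability as new simulation or test evidence arrives. The prior parameters and evidence counts are invented for illustration.

```python
from scipy import stats

# Prior belief about a component's failure probability per demand (Beta prior).
a_prior, b_prior = 1.0, 49.0            # prior mean = 0.02

# New evidence from high-fidelity simulations or tests: failures out of trials.
failures, trials = 3, 60

# Conjugate update: Beta(a + failures, b + trials - failures).
posterior = stats.beta(a_prior + failures, b_prior + (trials - failures))

print(f"Posterior mean failure probability: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```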
The Analysis of Rush Orders Risk in Supply Chain: A Simulation Approach
NASA Technical Reports Server (NTRS)
Mahfouz, Amr; Arisha, Amr
2011-01-01
Satisfying customers by delivering orders on time, at competitive prices, and at a satisfactory quality level is a crucial requirement for supply chain survival. The incidence of risks in a supply chain often causes sudden disruptions in its processes and consequently leads to customers losing their trust in a company's competence. Rush orders are considered to be one of the main types of supply chain risk due to their negative impact on overall performance. Using integrated definition modeling approaches (i.e., IDEF0 and IDEF3) and simulation modeling techniques, a comprehensive integrated model has been developed to assess rush order risks and examine two risk mitigation strategies. Detailed function sequences and object flows were conceptually modeled to reflect the macro and micro levels of the studied supply chain. Discrete event simulation models were then developed to assess and investigate the mitigation strategies for rush order risks, with the objective of minimizing order cycle time and cost.
A simulation of probabilistic wildfire risk components for the continental United States
Mark A. Finney; Charles W. McHugh; Isaac C. Grenfell; Karin L. Riley; Karen C. Short
2011-01-01
This simulation research was conducted in order to develop a large-fire risk assessment system for the contiguous land area of the United States. The modeling system was applied to each of 134 Fire Planning Units (FPUs) to estimate burn probabilities and fire size distributions. To obtain stable estimates of these quantities, fire ignition and growth was simulated for...
ERIC Educational Resources Information Center
Smith, Douglas L.
1997-01-01
Describes a model for team developmental assessment of high-risk infants using a fiber-optic "distance learning" televideo network in south-central New York. An arena style transdisciplinary play-based assessment model was adapted for use across the televideo connection and close simulation of convention assessment procedures was…
NASA Astrophysics Data System (ADS)
Mokhov, I. I.
2018-04-01
We present results describing the ability of contemporary global and regional climate models not only to assess the risk of general trends of change but also to predict qualitatively new regional effects. In particular, model simulations predicted spatially inhomogeneous changes in the wind and wave conditions in the Arctic basins, which have been confirmed in recent years. According to satellite and reanalysis data, a qualitative transition to the regime predicted by model simulations occurred about a decade ago.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha
Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.
NASA Astrophysics Data System (ADS)
van Ginneken, Meike; Oron, Gideon
2000-09-01
This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on a definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of human consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (impreciseness in quality of input data) and variability due to diversity among populations. There is a 10-orders of magnitude difference in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Extra data are required to decrease uncertainty in the risk assessment. Future research needs to include definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data related to the passage of pathogens into and in the plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
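A minimal sketch of the Monte Carlo exposure-to-risk chain outlined above, using an assumed beta-Poisson dose-response model. Pathogen concentrations, die-off, serving size, and dose-response parameters are all illustrative placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000

# Hypothetical inputs: pathogen concentration on produce at harvest (org/g),
# first-order die-off between harvest and consumption, and serving size (g).
c_harvest = rng.lognormal(np.log(0.1), 1.0, n)
days_to_consumption = rng.uniform(1, 7, n)
k_decay = 0.5                                    # per day, assumed
serving = rng.normal(100, 20, n).clip(min=10)

dose = c_harvest * np.exp(-k_decay * days_to_consumption) * serving

# Beta-Poisson dose-response (illustrative alpha, beta for an enteric pathogen).
alpha, beta = 0.25, 16.0
p_infection = 1.0 - (1.0 + dose / beta) ** (-alpha)

print(f"Median risk of infection per exposure: {np.median(p_infection):.2e}")
print(f"95th percentile risk: {np.percentile(p_infection, 95):.2e}")
```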
BNSF San Bernardino case study : positive train control risk assessment.
DOT National Transportation Integrated Search
2014-09-01
The Federal Railroad Administration funded the BNSF San Bernardino Case Study to verify its Generalized Train Movement : Simulator (GTMS) risk assessment capabilities on a planned implementation of the I-ETMS PTC system. The analysis explicitly : sim...
A simulation model for risk assessment of turbine wheels
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Hage, Richard T.
1991-01-01
A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
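A toy sketch of the kind of inspection-policy simulation described: sample a crack-initiation time and growth rate per wheel, retire the wheel if the crack exceeds the rejection size at a scheduled inspection, and count a failure when the crack reaches a critical size in service before the life limit. All distributions and policy values are invented placeholders, not APU data.

```python
import numpy as np

rng = np.random.default_rng(2024)
n_wheels = 20_000

# Assumed policy and material parameters (illustrative only).
inspection_interval = 500                 # operating hours between inspections
life_limit          = 4000                # retirement limit (hours)
reject_size, crit_size = 0.5, 2.0         # crack sizes (mm)

t_init = rng.weibull(2.0, n_wheels) * 3000           # crack initiation time (h)
growth = rng.lognormal(np.log(2e-3), 0.5, n_wheels)  # crack growth rate (mm/h)

def crack_size(t, t0, rate):
    return max(0.0, t - t0) * rate

failures = 0
for t0, rate in zip(t_init, growth):
    t_fail = t0 + crit_size / rate                   # time crack reaches critical size
    if t_fail > life_limit:
        continue                                     # retired at life limit before failure
    # Detected (and rejected) if any scheduled inspection sees a crack >= reject size.
    detected = any(crack_size(t, t0, rate) >= reject_size
                   for t in np.arange(inspection_interval, t_fail, inspection_interval))
    failures += 0 if detected else 1

print(f"Estimated in-service failure probability: {failures / n_wheels:.2e}")
```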
Perceived breast cancer risk: heuristic reasoning and search for a dominance structure.
Katapodi, Maria C; Facione, Noreen C; Humphreys, Janice C; Dodd, Marylin J
2005-01-01
Studies suggest that people construct their risk perceptions by using inferential rules called heuristics. The purpose of this study was to identify heuristics that influence perceived breast cancer risk. We examined 11 interviews from women of diverse ethnic/cultural backgrounds who were recruited from community settings. Narratives in which women elaborated about their own breast cancer risk were analyzed with Argument and Heuristic Reasoning Analysis methodology, which is based on applied logic. The availability, simulation, representativeness, affect, and perceived control heuristics, and search for a dominance structure were commonly used for making risk assessments. Risk assessments were based on experiences with an abnormal breast symptom, experiences with affected family members and friends, beliefs about living a healthy lifestyle, and trust in health providers. Assessment of the potential threat of a breast symptom was facilitated by the search for a dominance structure. Experiences with family members and friends were incorporated into risk assessments through the availability, simulation, representativeness, and affect heuristics. Mistrust in health providers led to an inappropriate dependence on the perceived control heuristic. Identified heuristics appear to create predictable biases and suggest that perceived breast cancer risk is based on common cognitive patterns.
Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri
2018-04-01
The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct, and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration risk of bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. Of 1394 RCTs screened, 68 trials assessed an SBME intervention; they represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) is the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66% and 49% of trials, respectively. Blinding of participants and assessors was performed correctly in 19% and 68%, respectively. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4 out of 18. Only 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.
Jeffrey G. Borchers
2005-01-01
The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...
The useful field of view assessment predicts simulated commercial motor vehicle driving safety.
McManus, Benjamin; Heaton, Karen; Vance, David E; Stavrinos, Despina
2016-10-02
The Useful Field of View (UFOV) assessment, a measure of visual speed of processing, has been shown to be a predictive measure of motor vehicle collision (MVC) involvement in an older adult population, but it remains unknown whether UFOV predicts commercial motor vehicle (CMV) driving safety during secondary task engagement. The purpose of this study is to determine whether the UFOV assessment predicts simulated MVCs in long-haul CMV drivers. Fifty licensed CMV drivers (Mage = 39.80, SD = 8.38, 98% male, 56% Caucasian) were administered the 3-subtest version of the UFOV assessment, where lower scores measured in milliseconds indicated better performance. CMV drivers completed 4 simulated drives, each spanning approximately a 22.50-mile distance. Four secondary tasks were presented to participants in a counterbalanced order during the drives: (a) no secondary task, (b) cell phone conversation, (c) text messaging interaction, and (d) e-mailing interaction with an on-board dispatch device. The selective attention subtest significantly predicted simulated MVCs regardless of secondary task. Each 20 ms slower on subtest 3 was associated with a 25% increase in the risk of an MVC in the simulated drive. The e-mail interaction secondary task significantly predicted simulated MVCs with a 4.14 times greater risk of an MVC compared to the no secondary task condition. Subtest 3, a measure of visual speed of processing, significantly predicted MVCs in the email interaction task. Each 20 ms slower on subtest 3 was associated with a 25% increase in the risk of an MVC during the email interaction task. The UFOV subtest 3 may be a promising measure to identify CMV drivers who may be at risk for MVCs or in need of cognitive training aimed at improving speed of processing. Subtest 3 may also identify CMV drivers who are particularly at risk when engaged in secondary tasks while driving.
Liao, Zhi-Heng; Sun, Jia-Ren; Wu, Dui; Fan, Shao-Jia; Ren, Ming-Zhong; Lü, Jia-Yang
2014-06-01
The CALPUFF model was applied to simulate the ground-level atmospheric concentrations of Pb and Cd from municipal solid waste incineration (MSWI) plants, and a soil concentration model was used to estimate soil concentration increments after atmospheric deposition based on Monte Carlo simulation; ecological risk assessment was then conducted with the potential ecological risk index method. The results showed that the largest atmospheric concentrations of Pb and Cd were 5.59 x 10(-3) microg x m(-3) and 5.57 x 10(-4) microg x m(-3), respectively, while the maximum median soil concentration increments of Pb and Cd were 2.26 mg x kg(-1) and 0.21 mg x kg(-1), respectively. High-risk areas were located next to the incinerators, Cd contributed the most to the ecological risk, and Pb was basically free of pollution risk. A higher ecological hazard level was predicted at the most polluted point in urban areas with a 55.30% probability, while in rural areas the most polluted point was assessed at a moderate ecological hazard level with a 72.92% probability. In addition, a sensitivity analysis of the calculation parameters in the soil concentration model was conducted, which showed that the simulated results for urban and rural areas were most sensitive to soil mixing depth and dry deposition rate, respectively.
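A brief sketch of the potential ecological risk index calculation referenced above (Hakanson-style): single-metal risk factors E_r = T_r · C / C_n summed into RI. The toxic-response factors are the commonly used values for Pb and Cd; the concentrations and background levels are invented examples, not the study's results.

```python
# Hakanson-style potential ecological risk index (illustrative inputs).
toxicity   = {"Pb": 5, "Cd": 30}           # commonly used toxic-response factors
background = {"Pb": 25.0, "Cd": 0.10}      # assumed background soil levels (mg/kg)
measured   = {"Pb": 27.3, "Cd": 0.31}      # hypothetical post-deposition soil levels (mg/kg)

def risk_index(meas, bkg, tox):
    er = {m: tox[m] * meas[m] / bkg[m] for m in meas}
    return er, sum(er.values())

er, ri = risk_index(measured, background, toxicity)
for metal, value in er.items():
    print(f"E_r({metal}) = {value:.1f}")
print(f"RI = {ri:.1f}")    # compared against class thresholds (e.g., RI < 150 = low risk)
```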
Signorini, M L; Marín, V; Quinteros, C; Tarabla, H
2009-01-01
A quantitative risk assessment was developed for verocytotoxigenic Escherichia coli (VTEC) associated with hamburger consumption. The assessment (simulation model) considers the distribution, storage, and consumption patterns of hamburgers. The prevalence and concentration of VTEC were modelled at various stages along the agri-food beef production system using input derived from Argentinean data whenever possible. The model predicted an infection risk of 4.45 x 10(-4) per meal for adults. The risk values obtained for children were 2.6 x 10(-4), 1.38 x 10(-5) and 4.54 x 10(-7) for infection, Hemolytic Uremic Syndrome (HUS), and mortality, respectively. The risk of infection and HUS was positively correlated with bacterial concentration in meat (r = 0.664). There was a negative association between homemade hamburgers (r = -0.116) and the risk of illness; however, this association was attributed to differences between retail and domiciliary storage systems (r = -0.567) rather than to intrinsic characteristics of the product. The most sensitive points of the production system were identified through the risk assessment; these can therefore be used as a basis for applying different risk management policies in public health.
Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.
2010-01-01
In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918
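A minimal sketch of how drug-induced ion channel block is typically introduced into such multi-scale simulations: the channel's maximal conductance is scaled by a pore-block factor derived from the drug concentration, IC50, and Hill coefficient. The parameter values and the hERG/IKr example are illustrative assumptions, not figures drawn from the paper.

```python
def block_fraction(drug_conc, ic50, hill=1.0):
    """Fraction of channels blocked under a simple pore-block (Hill) model."""
    return 1.0 / (1.0 + (ic50 / drug_conc) ** hill)

def scaled_conductance(g_max, drug_conc, ic50, hill=1.0):
    """Conductance remaining after drug block: g = g_max * (1 - blocked)."""
    return g_max * (1.0 - block_fraction(drug_conc, ic50, hill))

# Example: hypothetical IKr block at three plasma concentrations (same units as IC50).
g_Kr = 0.153        # assumed baseline maximal conductance (arbitrary units)
for conc in (0.1, 1.0, 10.0):
    g = scaled_conductance(g_Kr, conc, ic50=1.0, hill=1.0)
    print(f"conc={conc:>5}: g_Kr scaled to {g:.4f} ({100 * g / g_Kr:.0f}% of control)")
```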
Falk, L E; Fader, K A; Cui, D S; Totton, S C; Fazil, A M; Lammerding, A M; Smith, B A
2016-10-01
Although infection by the pathogenic bacterium Listeria monocytogenes is relatively rare, consequences can be severe, with a high case-fatality rate in vulnerable populations. A quantitative, probabilistic risk assessment tool was developed to compare estimates of the number of invasive listeriosis cases in vulnerable Canadian subpopulations given consumption of contaminated ready-to-eat delicatessen meats and hot dogs, under various user-defined scenarios. The model incorporates variability and uncertainty through Monte Carlo simulation. Processes considered within the model include cross-contamination, growth, risk factor prevalence, subpopulation susceptibilities, and thermal inactivation. Hypothetical contamination events were simulated. Results demonstrated varying risk depending on the consumer risk factors and implicated product (turkey delicatessen meat without growth inhibitors ranked highest for this scenario). The majority (80%) of listeriosis cases were predicted in at-risk subpopulations comprising only 20% of the total Canadian population, with the greatest number of predicted cases in the subpopulation with dialysis and/or liver disease. This tool can be used to simulate conditions and outcomes under different scenarios, such as a contamination event and/or outbreak, to inform public health interventions.
Multimedia risk assessments require the temporal integration of atmospheric concentration and deposition with other media modules. However, providing an extended time series of estimates is computationally expensive. An alternative approach is to substitute long-term average a...
Kobayashi, Leo; Green, Traci C.; Bowman, Sarah E.; Ray, Madeline C.; McKenzie, Michelle S.; Rich, Josiah D.
2016-01-01
Introduction Investigators applied simulation to an experimental program that educated, trained and assessed at-risk, volunteering prisoners on opioid overdose (OD) prevention, recognition and layperson management with intranasal (IN) naloxone. Methods Consenting inmates were assessed for OD-related experience and knowledge then exposed on-site to standardized didactics and educational DVD (without simulation). Subjects were provided with IN naloxone kits at time of release and scheduled for post-release assessment. At follow-up, subjects were evaluated for their performance of layperson opioid OD resuscitative skills during video-recorded simulations. Two investigators independently scored each subject’s resuscitative actions with a 21-item checklist; post-hoc video reviews were separately completed to adjudicate subjects’ interactions for overall benefit or harm. Results One hundred and three prisoners completed the baseline assessment and study intervention then were prescribed IN naloxone kits. One-month follow-up and simulation data were available for 85 subjects (82.5% of trained recruits) who had been released and resided in the community. Subjects’ simulation checklist median score was 12.0 (IQR 11.0–15.0) out of 21 total indicated actions. Forty-four participants (51.8%) correctly administered naloxone; 16 additional subjects (18.8%) suboptimally administered naloxone. Non-indicated actions, primarily chest compressions, were observed in 49.4% of simulations. Simulated resuscitative actions by 80 subjects (94.1%) were determined post-hoc to be beneficial overall for patients overdosing on opioids. Conclusions As part of an opioid OD prevention research program for at-risk inmates, investigators applied simulation to 1-month follow-up assessments of knowledge retention and skills acquisition in post-release participants. Simulation supplemented traditional research tools for investigation of layperson OD management. PMID:28146450
Thompson, Matthew P; Scott, Joe; Helmbrecht, Don; Calkin, Dave E
2013-04-01
The financial, socioeconomic, and ecological impacts of wildfire continue to challenge federal land management agencies in the United States. In recent years, policymakers and managers have increasingly turned to the field of risk analysis to better manage wildfires and to mitigate losses to highly valued resources and assets (HVRAs). Assessing wildfire risk entails the interaction of multiple components, including integrating wildfire simulation outputs with geospatial identification of HVRAs and the characterization of fire effects to HVRAs. We present an integrated and systematic risk assessment framework that entails 3 primary analytical components: 1) stochastic wildfire simulation and burn probability modeling to characterize wildfire hazard, 2) expert-based modeling to characterize fire effects, and 3) multicriteria decision analysis to characterize preference structures across at-risk HVRAs. We demonstrate application of this framework for a wildfire risk assessment performed on the Little Belts Assessment Area within the Lewis and Clark National Forest in Montana, United States. We devote particular attention to our approach to eliciting and encapsulating expert judgment, in which we: 1) adhered to a structured process for using expert judgment in ecological risk assessment, 2) used as our expert base local resource scientists and fire/fuels specialists who have a direct connection to the specific landscape and HVRAs in question, and 3) introduced multivariate response functions to characterize fire effects to HVRAs that consider biophysical variables beyond fire behavior. We anticipate that this work will further the state of wildfire risk science and will lead to additional application of risk assessment to inform land management planning. Copyright © 2012 SETAC.
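The framework combines burn probabilities, expert-elicited response functions and HVRA weights; a toy sketch of the expected net value change calculation for a single pixel follows, with all probabilities, response values and weights hypothetical rather than the Little Belts values.

```python
import numpy as np

# Pixel-level burn probability partitioned by flame-length class (hypothetical values)
burn_prob = np.array([0.004, 0.003, 0.002, 0.001])      # P(pixel burns in each flame-length class)

# Expert-elicited response functions: percent value change per HVRA and flame-length class
response = {
    "structures":       np.array([-10.0, -30.0, -70.0, -100.0]),
    "wildlife_habitat": np.array([+20.0, +10.0, -20.0,  -60.0]),
}
# Relative-importance weights from the multicriteria decision analysis (hypothetical)
weights = {"structures": 0.7, "wildlife_habitat": 0.3}

# Expected net value change for the pixel: sum_i BP_i * RF_i, aggregated across HVRAs
envc = sum(weights[h] * float(burn_prob @ (response[h] / 100.0)) for h in response)
print(f"Expected net value change: {envc:+.4f} (share of HVRA value per fire season)")
```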
Johansson, Michael A.; Arana-Vizcarrondo, Neysarí; Biggerstaff, Brad J.; Gallagher, Nancy; Marano, Nina; Staples, J. Erin
2012-01-01
Yellow fever virus (YFV), a mosquito-borne virus endemic to tropical Africa and South America, is capable of causing large urban outbreaks of human disease. With the ease of international travel, urban outbreaks could lead to the rapid spread and subsequent transmission of YFV in distant locations. We designed a stochastic metapopulation model with spatiotemporally explicit transmissibility scenarios to simulate the global spread of YFV from a single urban outbreak by infected airline travelers. In simulations of a 2008 outbreak in Asunción, Paraguay, local outbreaks occurred in 12.8% of simulations and international spread in 2.0%. Using simple probabilistic models, we found that local incidence, travel rates, and basic transmission parameters are sufficient to assess the probability of introduction and autochthonous transmission events. These models could be used to assess the risk of YFV spread during an urban outbreak and identify locations at risk for YFV introduction and subsequent autochthonous transmission. PMID:22302873
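A minimal sketch of the "simple probabilistic model" idea is given below: it estimates the probability of introduction and of autochthonous transmission from local incidence and travel volume using a Poisson approximation; every input value is hypothetical.

```python
import math

# Hypothetical inputs for one destination city during an urban outbreak
daily_incidence = 2e-5    # new infections per resident per day in the outbreak city
infectious_days = 5       # days a case remains viremic while able to travel
departures_per_day = 300  # air travellers per day from outbreak city to the destination
outbreak_days = 30        # duration of the exposure window considered
p_onward = 0.05           # assumed P(an imported case seeds autochthonous transmission)

# Prevalence among travellers ~ incidence x infectious period; expected imported cases
prevalence = daily_incidence * infectious_days
expected_imports = prevalence * departures_per_day * outbreak_days

# Poisson approximation for "at least one" events
p_introduction = 1.0 - math.exp(-expected_imports)
p_autochthonous = 1.0 - math.exp(-expected_imports * p_onward)
print(f"P(>=1 imported case)          = {p_introduction:.3f}")
print(f"P(autochthonous transmission) = {p_autochthonous:.3f}")
```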
Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments
NASA Technical Reports Server (NTRS)
Manning, Ted A.; Lawrence, Scott L.
2014-01-01
As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).
Assessment of the risk due to release of carbon fiber in civil aircraft accidents, phase 2
NASA Technical Reports Server (NTRS)
Pocinki, L.; Cornell, M. E.; Kaplan, L.
1980-01-01
The risk associated with the potential use of carbon fiber composite material in commercial jet aircraft is investigated. A simulation model developed to generate risk profiles for several airports is described. The risk profiles show the probability that the cost due to accidents in any year exceeds a given amount. The computer model simulates aircraft accidents with fire, release of fibers, their downwind transport and infiltration of buildings, equipment failures, and resulting economic impact. The individual airport results were combined to yield the national risk profile.
Simulating Limb Formation in the U.S. EPA Virtual Embryo - Risk Assessment Project
The U.S. EPA’s Virtual Embryo project (v-Embryo™) is a computer model simulation of morphogenesis that integrates cell and molecular level data from mechanistic and in vitro assays with knowledge about normal development processes to assess in silico the effects of chemicals on d...
Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore
2014-04-01
Genetic modification of plants may result in unintended effects with potentially adverse consequences for the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
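As a rough illustration of the prospective power idea (not the paper's framework, which also handles blocks, multiple environments and repeated measures), one can simulate zero-inflated counts for GM and comparator plots and record how often a difference test rejects the null; the parameter values below are arbitrary assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulate_trial(n_plots=8, mean_conv=10.0, effect=0.7, zero_infl=0.2):
    """One comparative field trial: zero-inflated Poisson counts on GM and comparator plots."""
    lam = np.concatenate([np.full(n_plots, mean_conv * effect),   # GM plots
                          np.full(n_plots, mean_conv)])           # conventional comparator
    counts = rng.poisson(lam)
    counts[rng.random(counts.size) < zero_infl] = 0               # excess zeros
    return counts[:n_plots], counts[n_plots:]

def power(effect, n_sim=2000, alpha=0.05):
    """Proportion of simulated trials in which a difference test rejects H0."""
    rejections = 0
    for _ in range(n_sim):
        gm, conv = simulate_trial(effect=effect)
        _, p = stats.mannwhitneyu(gm, conv, alternative="two-sided")
        rejections += p < alpha
    return rejections / n_sim

print(f"Prospective power to detect a 30% reduction in abundance: {power(0.7):.2f}")
```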
Wildfire risk assessment in a typical Mediterranean wildland-urban interface of Greece.
Mitsopoulos, Ioannis; Mallinis, Giorgos; Arianoutsou, Margarita
2015-04-01
The purpose of this study was to assess spatial wildfire risk in a typical Mediterranean wildland-urban interface (WUI) in Greece and the potential effect of three different burning condition scenarios on the following four major wildfire risk components: burn probability, conditional flame length, fire size, and source-sink ratio. We applied the Minimum Travel Time fire simulation algorithm using the FlamMap and ArcFuels tools to characterize the potential response of the wildfire risk to a range of different burning scenarios. We created site-specific fuel models of the study area by measuring the field fuel parameters in representative natural fuel complexes, and we determined the spatial extent of the different fuel types and residential structures in the study area using photointerpretation procedures of large scale natural color orthophotographs. The results included simulated spatially explicit fire risk components along with wildfire risk exposure analysis and the expected net value change. Statistical significance differences in simulation outputs between the scenarios were obtained using Tukey's significance test. The results of this study provide valuable information for decision support systems for short-term predictions of wildfire risk potential and inform wildland fire management of typical WUI areas in Greece.
2007-04-01
Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit ... (U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869). [Abstract fragment: individuals prone to judgmental self-doubt, depression, and causal uncertainty tend to take fewer risks and have lower self-esteem; results from two studies (Nygren, 2000, ...) are reported.]
Quantitative risk assessment of E. coli in street-vended cassava-based delicacies in the Philippines
NASA Astrophysics Data System (ADS)
Mesias, I. C. P.
2018-01-01
In the Philippines, rootcrop-based food products are gaining popularity in street food trade. However, a number of street-vended food products in the country are reported to be contaminated with E. coli, posing a possible risk to consumers. In this study, information on quantitative risk assessment of E. coli in street-vended cassava-based delicacies was generated. The assessment started with the prevalence and concentration of E. coli at post-production in packages of the cassava-based delicacies. The ComBase growth predictor was used to trace the microbial population of E. coli in each step of the food chain. The @Risk software package, version 6 (Palisade, USA), was used to run the simulations. Scenarios in the post-production to consumption pathway were simulated. The effect was then assessed in relation to exposure to the defined infective dose. In the worst-case scenario, minimum and most likely concentrations of 6.3 and 7.8 log CFU of E. coli per serving, respectively, were observed. The simulation revealed that lowering the temperature in the chain considerably decreased the E. coli concentration prior to consumption and subsequently decreased the percentage of exposure to the infective dose. Exposure to the infective dose, however, increased with longer lag times from post-production to consumption.
Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin
We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...
Sequential use of simulation and optimization in analysis and planning
Hans R. Zuuring; Jimmie D. Chew; J. Greg Jones
2000-01-01
Management activities are analyzed at landscape scales employing both simulation and optimization. SIMPPLLE, a stochastic simulation modeling system, is initially applied to assess the risks associated with a specific natural process occurring on the current landscape without management treatments, but with fire suppression. These simulation results are input into...
Abbasi, Mitra; Small, Ben G; Patel, Nikunjkumar; Jamei, Masoud; Polak, Sebastian
2017-02-01
The aim was to determine the predictive performance of in silico models using drug-specific preclinical cardiac electrophysiology data to investigate drug-induced arrhythmia risk (e.g. Torsade de pointes (TdP)) in virtual human subjects. To assess drug proarrhythmic risk, we used a set of in vitro electrophysiological measurements describing ion channel inhibition triggered by the investigated drugs. The Cardiac Safety Simulator version 2.0 (CSS; Simcyp, Sheffield, UK) platform was used to simulate human left ventricular cardiac myocyte action potential models. This study shows the impact of drug concentration changes on particular ionic currents using available experimental data. The simulation results are expressed as a safety threshold in terms of the drug concentration threshold and log(threshold concentration / effective therapeutic plasma concentration (ETPC)). We reproduced the underlying biophysical characteristics of cardiac cells and the drug-induced effects associated with cardiac arrhythmias (action potential duration (APD) and QT prolongation and TdP) that were observed in published 3D simulations, yet with much less computational burden.
Demirdöğen Çetinoğlu, Ezgi; Görek Dilektaşlı, Aslı; Demir, Nefise Ateş; Özkaya, Güven; Acet, Nilüfer Aylin; Durmuş, Eda; Ursavaş, Ahmet; Karadağ, Mehmet; Ege, Ercüment
2015-09-01
Driving performance is known to be very sensitive to cognitive-psychomotor impairment. The aim of the study was to determine the relationship between obesity, risk of obstructive sleep apnoea (OSA), daytime sleepiness, history of road traffic accidents (RTA), and performance on a driving simulator among commercial drivers. We examined commercial vehicle drivers admitted to the Psycho-Technical Assessment System (PTAS), which is a computer-aided system that includes a driving simulator test and tests assessing psychomotor-cognitive skills required for driving. Risk of OSA and daytime sleepiness were assessed by the Berlin Questionnaire and the Epworth Sleepiness Scale (ESS), respectively. A total of 282 commercial vehicle drivers were consecutively enrolled. The age range was 29-76 years. Thirty drivers were at high risk of OSA. The median ESS of the group was 2 (0-20). Forty-seven percent of the subjects at high risk of OSA failed the early reaction time test, while 28% of the drivers with low risk of OSA failed (p = 0.03). Obese drivers failed the peripheral vision test more often than non-obese drivers (p = 0.02). ESS was higher for drivers with a history of RTA when compared to those without RTA (p = 0.02). Cognitive-psychomotor functions can be impaired in obese patients and those at high risk of OSA. In our opinion, requiring obese and/or high-OSA-risk drivers to take PTAS tests that assess driving skills and the psychomotor-cognitive functions crucial to those skills would significantly improve road traffic safety, which is of considerable importance to public health.
Hydrological Modelling using HEC-HMS for Flood Risk Assessment of Segamat Town, Malaysia
NASA Astrophysics Data System (ADS)
Romali, N. S.; Yusop, Z.; Ismail, A. Z.
2018-03-01
This paper presents an assessment of the applicability of the Hydrologic Modelling System developed by the Hydrologic Engineering Center (HEC-HMS) for hydrological modelling of the Segamat River. The objective of the model application is to assist in the assessment of flood risk by providing the peak flows of the 2011 Segamat flood for the generation of flood maps of Segamat town. The capability of the model was evaluated by comparing historical observed data with the simulation results for the selected flood events. Model calibration and validation performance was verified using the Nash-Sutcliffe model efficiency coefficient. The results support implementing the hydrological model for assessing flood risk, as the simulated peak flows are in agreement with the historical observations. The model efficiencies for the calibration and validation exercises are 0.90 and 0.76, respectively, which is acceptable.
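For reference, the Nash-Sutcliffe efficiency used here for calibration and validation is computed as below; the hydrograph values in the example are made up.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical event hydrograph (m3/s), observed versus simulated
obs = np.array([120, 340, 890, 1450, 1220, 760, 430, 250])
sim = np.array([110, 360, 850, 1380, 1300, 800, 400, 230])
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```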
Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott
2008-01-01
A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
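A schematic sketch of how such failure bins might be rolled up into crew risk, combining failure likelihood, environment severity versus crew-module robustness, and abort success given warning time; the bins and all numbers are hypothetical and do not come from the paper.

```python
# Hypothetical failure bins: (name, P(failure per ascent),
#   P(failure environment defeats crew module | failure),
#   P(successful abort given available warning time))
failure_bins = [
    ("turbopump burst",    2e-3, 0.35, 0.90),
    ("structural breakup", 8e-4, 0.80, 0.60),
    ("engine shutdown",    5e-3, 0.05, 0.98),
]

p_loss_of_crew = 0.0
for name, p_fail, p_env, p_abort in failure_bins:
    # Crew is lost if the failure occurs AND (abort fails OR environment defeats the module)
    p_loc_given_fail = 1.0 - p_abort * (1.0 - p_env)
    contribution = p_fail * p_loc_given_fail
    p_loss_of_crew += contribution
    print(f"{name:20s} contribution: {contribution:.2e}")

print(f"Total P(loss of crew) per ascent: {p_loss_of_crew:.2e}")
```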
NASA Astrophysics Data System (ADS)
Rosso, R.; Rulli, M. C.
The influence of land use changes on flood occurrence and severity in the Bisagno River (Tyrrhenian Liguria, NW Italy) is investigated using a Monte Carlo simulation approach (Rulli and Rosso, 2002). High resolution land-use maps for the area were reconstructed and scenario simulations were made for a pre-industrial (1878), an intermediate (1930) and a current (1980) year. Land-use effects were explored to assess the consequences of distributed changes in land use due to agricultural practice and urbanisation. Hydraulic conveyance effects were considered to assess the consequences of channel modifications associated with engineering works in the lower Bisagno River network. Flood frequency analyses of the annual flood series, retrieved from the simulations, were used to examine the effect of land-use change and river conveyance on flood regime. The impact of these effects proved to be negligible in the upper Bisagno River, moderate in the downstream river and severe in the small tributaries in the lower Bisagno valley that drain densely populated urban areas. The simulation approach is shown to be capable of incorporating historical data on landscape and river patterns into quantitative methods for risk assessment.
Assessment and management of the performance risk of a pilot reclaimed water disinfection process.
Zhou, Guangyu; Zhao, Xinhua; Zhang, Lei; Wu, Qing
2013-10-01
Chlorination disinfection has been widely used in reclaimed water treatment plants to ensure water quality. In order to assess the downstream quality risk of a running reclaimed water disinfection process, a set of dynamic equations was developed to simulate reactions in the disinfection process concerning variables of bacteria, chemical oxygen demand (COD), ammonia and monochloramine. The model was calibrated by the observations obtained from a pilot disinfection process which was designed to simulate the actual process in a reclaimed water treatment plant. A Monte Carlo algorithm was applied to calculate the predictive effluent quality distributions that were used in the established hierarchical assessment system for the downstream quality risk, and the key factors affecting the downstream quality risk were defined using the Regional Sensitivity Analysis method. The results showed that the seasonal upstream quality variation caused considerable downstream quality risk; the effluent ammonia was significantly influenced by its upstream concentration; the upstream COD was a key factor determining the process effluent risk of bacterial, COD and residual disinfectant indexes; and lower COD and ammonia concentrations in the influent would mean better downstream quality.
EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.
This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...
Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses
Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan
2016-01-01
This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazards of C. perfringens on cheese were identified through the literature, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage time, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data and the simulation result was used to estimate the probability of C. perfringens foodborne illness by cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10^-11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed for exposure assessment under simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10^-14 and 3.58×10^-14, respectively. These results indicate that the probability of C. perfringens foodborne illness from cheese consumption is low, and they can be used to establish microbial criteria for C. perfringens on natural and processed cheeses. PMID:26954204
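Using the reported dose-response parameter and consumption statistics, a simplified Monte Carlo sketch of the risk characterization step might look like the following; the distributional choices and the daily-intake interpretation are guesses, so the output is not expected to reproduce the paper's 9.57×10^-14 estimate.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

r = 1.82e-11                                               # exponential dose-response parameter (from abstract)
conc_cfu_g = rng.beta(1, 91, n) * rng.uniform(0, 2, n)     # initial level, natural cheese (sketch of beta x uniform)
intake_g = np.clip(rng.normal(12.40, 19.43, n), 0, None)   # g natural cheese; treated here as per-person daily intake

dose = conc_cfu_g * intake_g                               # CFU ingested per person per day
p_illness = 1.0 - np.exp(-r * dose)                        # exponential dose-response model
print(f"Mean daily probability of illness: {p_illness.mean():.2e}")
```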
NASA Astrophysics Data System (ADS)
Doroszkiewicz, J. M.; Romanowicz, R. J.
2016-12-01
The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps.
Green, Linda E; Dinh, Tuan A; Hinds, David A; Walser, Bryan L; Allman, Richard
2014-04-01
Tamoxifen therapy reduces the risk of breast cancer but increases the risk of serious adverse events including endometrial cancer and thromboembolic events. The cost effectiveness of using a commercially available breast cancer risk assessment test (BREVAGen™) to inform the decision of which women should undergo chemoprevention by tamoxifen was modeled in a simulated population of women who had undergone biopsies but had no diagnosis of cancer. A continuous time, discrete event, mathematical model was used to simulate a population of white women aged 40-69 years, who were at elevated risk for breast cancer because of a history of benign breast biopsy. Women were assessed for clinical risk of breast cancer using the Gail model and for genetic risk using a panel of seven common single nucleotide polymorphisms. We evaluated the cost effectiveness of using genetic risk together with clinical risk, instead of clinical risk alone, to determine eligibility for 5 years of tamoxifen therapy. In addition to breast cancer, the simulation included health states of endometrial cancer, pulmonary embolism, deep-vein thrombosis, stroke, and cataract. Estimates of costs in 2012 US dollars were based on Medicare reimbursement rates reported in the literature and utilities for modeled health states were calculated as an average of utilities reported in the literature. A 50-year time horizon was used to observe lifetime effects including survival benefits. For those women at intermediate risk of developing breast cancer (1.2-1.66 % 5-year risk), the incremental cost-effectiveness ratio for the combined genetic and clinical risk assessment strategy over the clinical risk assessment-only strategy was US$47,000, US$44,000, and US$65,000 per quality-adjusted life-year gained, for women aged 40-49, 50-59, and 60-69 years, respectively (assuming a price of US$945 for genetic testing). Results were sensitive to assumptions about patient adherence, utility of life while taking tamoxifen, and cost of genetic testing. From the US payer's perspective, the combined genetic and clinical risk assessment strategy may be a moderately cost-effective alternative to using clinical risk alone to guide chemoprevention recommendations for women at intermediate risk of developing breast cancer.
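The decision metric behind such comparisons is the incremental cost-effectiveness ratio; a tiny sketch with hypothetical simulated means (not the paper's figures) is shown below.

```python
# Hypothetical per-woman discounted means from a simulated cohort (50-year horizon)
cost_clinical_only = 11_200.0   # USD, clinical (Gail) risk assessment only
qaly_clinical_only = 21.340
cost_combined      = 12_150.0   # USD, clinical + genetic risk assessment
qaly_combined      = 21.360

# ICER = incremental cost / incremental QALYs; compare against a willingness-to-pay threshold
icer = (cost_combined - cost_clinical_only) / (qaly_combined - qaly_clinical_only)
print(f"ICER = ${icer:,.0f} per QALY gained")
```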
Gorrindo, Tristan; Goldfarb, Elizabeth; Birnbaum, Robert J; Chevalier, Lydia; Meller, Benjamin; Alpert, Jonathan; Herman, John; Weiss, Anthony
2013-07-01
Ongoing professional practice evaluation (OPPE) activities consist of a quantitative, competency-based evaluation of clinical performance. Hospitals must design assessments that measure clinical competencies, are scalable, and minimize impact on the clinician's daily routines. A psychiatry department at a large academic medical center designed and implemented an interactive Web-based psychiatric simulation focusing on violence risk assessment as a tool for a departmentwide OPPE. Of 412 invited clinicians in a large psychiatry department, 410 completed an online simulation in April-May 2012. Participants received scheduled e-mail reminders with instructions describing how to access the simulation. Using the Computer Simulation Assessment Tool, participants viewed an introductory video and were then asked to conduct a risk assessment, acting as a clinician in the encounter by selecting actions from a series of drop-down menus. Each action was paired with a corresponding video segment of a clinical encounter with a standardized patient. Participants were scored on the basis of their actions within the simulation (Measure 1) and by their responses to the open-ended questions in which they were asked to integrate the information from the simulation in a summative manner (Measure 2). Of the 410 clinicians, 381 (92.9%) passed Measure 1, 359 (87.6%) passed Measure 2, and 5 (1.2%) failed both measures. Seventy-five (18.3%) participants were referred for focused professional practice evaluation (FPPE) after failing either Measure 1, Measure 2, or both. Overall, Web-based simulation and e-mail engagement tools were a scalable and efficient way to assess a large number of clinicians in OPPE and to identify those who required FPPE.
Risk assessments: Validation, gut feeling and cognitive biases (Plinius Medal Lecture)
NASA Astrophysics Data System (ADS)
Merz, Bruno
2017-04-01
Risk management is ideally based on comprehensive risk assessments quantifying the current risk and its reduction for different mitigation strategies. Given the pivotal role of risk assessments, this contribution discusses the basis for our confidence in risk assessments. Traditional validation, i.e. comparing model simulations with past observations, is often not possible since the assessment typically contains extreme events and their impacts that have not been observed before. In this situation, the assessment is strongly based on assumptions, expert judgement and best guess. This is an unfavorable situation as humans fall prey to cognitive biases, such as 'illusion of certainty', 'overconfidence' or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk assessments. We reflect on the role of gut feeling in risk assessments, illustrate the pitfalls of cognitive biases, and discuss the possibilities for better understanding how confident we can be in the numbers resulting from risk assessments.
USER MANUAL FOR EXPRESS, THE EXAMS-PRZM EXPOSURE SIMULATION SHELL
The Environmental Fate and Effects Division (EFED) of EPA's Office of Pesticide Programs(OPP) uses a suite of ORD simulation models for the exposure analysis portion of regulatory risk assessments. These models (PRZM, EXAMS, AgDisp) are complex, process-based simulation codes tha...
POLICY ISSUES ASSOCIATED WITH USING SIMULATION TO ASSESS ENVIRONMENTAL IMPACTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchitel, Kirsten; Tanana, Heather
This report examines the relationship between simulation-based science and judicial assessments of simulations or models supporting evaluations of environmental harms or risks, considering both how it exists currently and how it might be shaped in the future. This report considers the legal standards relevant to judicial assessments of simulation-based science and provides examples of the judicial application of those legal standards. Next, this report discusses the factors that inform whether there is a correlation between the sophistication of a challenged simulation and judicial support for that simulation. Finally, this report examines legal analysis of the broader issues that must be addressed for simulation-based science to be better understood and utilized in the context of judicial challenge and evaluation.
NASA Astrophysics Data System (ADS)
Jin, G.
2015-12-01
Subsurface storage of carbon dioxide in geological formations is widely regarded as a promising tool for reducing global atmospheric CO2 emissions. Successful geologic storage of sequestered carbon dioxide must prove to be safe by means of risk assessments, including post-injection analysis of injected CO2 plumes. Because fractured reservoirs exhibit a higher degree of heterogeneity, it is imperative to conduct such simulation studies in order to reliably predict the geometric evolution of plumes and to support risk assessment after CO2 injection. The research has addressed the pressure footprint of CO2 plumes through the development of new techniques which combine discrete fracture network and stochastic continuum modeling of multiphase flow in fractured geologic formations. A subsequent permeability tensor map in 3-D, derived from our previously developed method, can accurately describe the heterogeneity of fractured reservoirs. A comprehensive workflow integrating the fracture permeability characterization and multiphase flow modeling has been developed to simulate CO2 plume migration and to perform risk assessments. A simulated fractured reservoir model based on high-priority geological carbon sinks in central Alabama has been employed for a preliminary study. Discrete fracture networks were generated with an NE-oriented regional fracture set and orthogonal NW-fractures. Fracture permeability characterization revealed strong permeability heterogeneity, spanning up to three orders of magnitude. A multiphase flow model composed of supercritical CO2 and saline water was then applied to predict CO2 plume volume, geometry, pressure footprint, and containment during and after injection. Injection simulation reveals significant permeability anisotropy that favors development of northeast-elongate CO2 plumes, which are aligned with systematic fractures. The diffusive spreading front of the CO2 plume shows strong viscous fingering effects. Post-injection simulation indicates significant upward lateral spreading of CO2, resulting in accumulation of CO2 directly under the seal unit because of its buoyancy and strata-bound vertical fractures. Risk assessment shows that lateral movement of CO2 along interconnected fractures requires widespread seals with high integrity to confine the injected CO2.
Bayatian, Majid; Ashrafi, Khosro; Azari, Mansour Rezazadeh; Jafari, Mohammad Javad; Mehrabi, Yadollah
2018-04-01
There has been increasing concern about continuous and sudden releases of volatile organic pollutants from petroleum refineries and the resulting occupational and environmental exposures. Benzene is one of the most prevalent volatile compounds, and it has been addressed by many authors for its potential toxicity in occupational and environmental settings. Due to the complexities of sampling and analysis of benzene in routine and accidental situations, a reliable estimation of the benzene concentration in the outdoor setting of a refinery using computational fluid dynamics (CFD) could be instrumental for risk assessment of occupational exposure. In the present work, a computational fluid dynamic model was applied for exposure risk assessment with consideration of benzene being released continuously from a reforming unit of a refinery. For simulation of benzene dispersion, GAMBIT, FLUENT, and CFD-Post software are used for preprocessing, processing, and post-processing, respectively. Computational fluid dynamic validation was carried out by comparing the computed data with the experimental measurements. Eventually, chronic daily intake and lifetime cancer risk for routine operations through the two seasons of a year are estimated through the simulation model. Root mean square errors are 0.19 and 0.17 for wind speed and concentration, respectively. Estimated lifetime cancer risks for workers are 0.4-3.8 and 0.0096-0.25 per 1000 workers in stable and unstable atmospheric conditions, respectively. The exposure risk is unacceptable for the head of shift work, chief engineer, and general workers on 141 days (38.77%) of the year. The results of this study show that computational fluid dynamics is a useful tool for modeling of benzene exposure in a complex geometry and can be used to estimate lifetime risks of occupational groups in a refinery setting.
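The abstract does not spell out the risk equations, but a conventional chronic-daily-intake and inhalation lifetime-cancer-risk calculation into which CFD-predicted concentrations would feed looks roughly like this; the exposure factors and slope factor are illustrative assumptions, not the study's values.

```python
# Hypothetical worker-exposure inputs; the CFD model would supply the concentration term
conc_mg_m3 = 0.25         # benzene concentration at the work location (mg/m3), e.g. from CFD
inh_rate = 1.3            # inhalation rate during work (m3/h), assumed
hours_per_day = 8
days_per_year = 240       # working days per year, assumed
exp_years = 25            # exposure duration, assumed
body_weight = 70.0        # kg
avg_time_days = 70 * 365  # averaging time for carcinogens (lifetime), days

# Chronic daily intake (mg/kg-day) and inhalation lifetime cancer risk
cdi = (conc_mg_m3 * inh_rate * hours_per_day * days_per_year * exp_years) / (body_weight * avg_time_days)
slope_factor = 2.73e-2    # illustrative inhalation slope factor for benzene, (mg/kg-day)^-1
ilcr = cdi * slope_factor
print(f"CDI  = {cdi:.2e} mg/kg-day")
print(f"ILCR = {ilcr:.2e}")
```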
Assessing climate impacts and risks of ocean albedo modification in the Arctic
NASA Astrophysics Data System (ADS)
Mengis, N.; Martin, T.; Keller, D. P.; Oschlies, A.
2016-05-01
The ice albedo feedback is one of the key factors of accelerated temperature increase in the high northern latitudes under global warming. This study assesses climate impacts and risks of idealized Arctic Ocean albedo modification (AOAM), a proposed climate engineering method, during transient climate change simulations with varying representative concentration pathway (RCP) scenarios. We find no potential for reversing trends in all assessed Arctic climate metrics under increasing atmospheric CO2 concentrations. AOAM only yields an initial offset during the first years after implementation. Nevertheless, sea ice loss can be delayed by 25(60) years in the RCP8.5(RCP4.5) scenario and the delayed thawing of permafrost soils in the AOAM simulations prevents up to 40(32) Pg of carbon from being released by 2100. AOAM initially dampens the decline of the Atlantic Meridional Overturning and delays the onset of open ocean deep convection in the Nordic Seas under the RCP scenarios. Both these processes cause a subsurface warming signal in the AOAM simulations relative to the default RCP simulations with the potential to destabilize Arctic marine gas hydrates. Furthermore, in 2100, the RCP8.5 AOAM simulation diverts more from the 2005-2015 reference state in many climate metrics than the RCP4.5 simulation without AOAM. Considering the demonstrated risks, we conclude that concerning longer time scales, reductions in emissions remain the safest and most effective way to prevent severe changes in the Arctic.
Zolezzi, Monica; Abdallah, Oraib; Kheir, Nadir; Abdelsalam, Abdelsalam Gomaa
2018-04-28
Individuals who suffer major cardiovascular events each year have one or more risk factors. Cardiovascular disease (CVD) risk assessment is an important strategy for the early identification of modifiable risk factors and their management. There is substantial evidence that shifting the focus from treatment to primary prevention reduces the burden of CVD. The aims were to evaluate the preparedness of community pharmacists in Qatar for the provision of CVD risk assessment and management services, and to explore the pharmacists' views on the provision of these services. A cross-sectional study using simulated-client methodology was conducted. Using standardized scenarios, community pharmacists were approached for consultation on two medicines (Aspirin® and Crestor®) used for managing specific CVD risk factors. Pharmacists' competency to assess CVD risk was the primary outcome evaluated. Scores for each outcome were obtained based on the number of predefined statements addressed during the consultation. The mean cumulative score for all the competency outcomes assessed was 11.7 (SD 3.7) out of a possible score of 31. There were no differences for the majority of the competencies tested between the two scenarios used. Significantly more pharmacists exposed to the Aspirin® scenario than to the Crestor® scenario addressed hypertension as one of the risk factors needed to assess CVD risk (22% versus 11%, p = 0.03); whereas significantly more pharmacists in the Crestor® scenario than in the Aspirin® scenario addressed dyslipidemia as one of the risk factors needed to assess CVD risk (30% versus 7%, p = 0.02). Significantly more pharmacists exposed to the Aspirin® scenario provided an explanation about CVD risk than those exposed to the Crestor® scenario (36% versus 8%, p < 0.01). The results suggest that many community pharmacists in Qatar are not displaying competencies that are necessary for the provision of CVD prevention services. Copyright © 2018 Elsevier Inc. All rights reserved.
Comparison of a Virtual Older Driver Assessment with an On-Road Driving Test.
Eramudugolla, Ranmalee; Price, Jasmine; Chopra, Sidhant; Li, Xiaolan; Anstey, Kaarin J
2016-12-01
To design a low-cost simulator-based driving assessment for older adults and to compare its validity with that of an on-road driving assessment and other measures of older driver risk. Cross-sectional observational study. Canberra, Australia. Older adult drivers (N = 47; aged 65-88, mean age 75.2). Error rate on a simulated drive with environment and scoring procedure matched to those of an on-road test. Other measures included participant age, simulator sickness severity, neuropsychological measures, and driver screening measures. Outcome variables included occupational therapist (OT)-rated on-road errors, on-road safety rating, and safety category. Participants' error rate on the simulated drive was significantly correlated with their OT-rated driving safety (correlation coefficient (r) = -0.398, P = .006), even after adjustment for age and simulator sickness (P = .009). The simulator error rate was a significant predictor of categorization as unsafe on the road (P = .02, sensitivity 69.2%, specificity 100%), with 13 (27%) drivers assessed as unsafe. Simulator error was also associated with other older driver safety screening measures such as useful field of view (r = 0.341, P = .02), DriveSafe (r = -0.455, P < .01), and visual motion sensitivity (r = 0.368, P = .01) but was not associated with memory (delayed word recall) or global cognition (Mini-Mental State Examination). Drivers made twice as many errors on the simulated assessment as during the on-road assessment (P < .001), with significant differences in the rate and type of errors between the two mediums. A low-cost simulator-based assessment is valid as a screening instrument for identifying at-risk older drivers but not as an alternative to on-road evaluation when accurate data on competence or pattern of impairment is required for licensing decisions and training programs. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus
2016-01-01
This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher tier risk assessment). Despite differing internal model design and scenarios, results indicated low population sensitivity in all 3 cases unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, which were then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust number of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages to higher tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios. © 2015 SETAC.
Simulation as a learning strategy: supporting undergraduate nursing students with disabilities.
Azzopardi, Toni; Johnson, Amanda; Phillips, Kirrilee; Dickson, Cathy; Hengstberger-Sims, Cecily; Goldsmith, Mary; Allan, Trevor
2014-02-01
To promote simulation as a learning strategy to support undergraduate nursing students with disabilities. Supporting undergraduate nursing students with disabilities has gained further momentum because of amendments to the Disability Discrimination Act in 2009. Providers of higher education must now ensure that proactive steps to prevent discrimination against students with a disability are implemented to assist in course progression. Simulation allows for the impact of a student's disability to be assessed and informs the determination of reasonable adjustments to be implemented. Further suitable adjustments can then be determined in a safe environment and evaluated prior to scheduled placement. Auditing in this manner offers a risk management strategy for all while maintaining the academic integrity of the program. Discursive. Low-, medium- and high-fidelity simulation activities were critically analysed, and their application to support undergraduate nursing students with disabilities was assessed. With advancing technology and new pedagogical approaches, simulation as a learning strategy can play a significant role. In this role, simulation supports undergraduate nursing students with disabilities to meet course requirements, while offering higher education providers an important risk management strategy. The discussion recommends that simulation be used to inform the determination of reasonable adjustments for undergraduate nursing students with disabilities as an effective, contemporary curriculum practice. Adoption of simulation, in this way, will meet three imperatives: comply with current legislative requirements, embrace advances in learning technologies and embed one of the six principles of inclusive curriculum. Achieving these imperatives is likely to increase accessibility for all students and offer students with a disability a supportive learning experience. Simulation provides the capacity to systematically assess, monitor, evaluate and support students with a disability. The students' reasonable adjustments can be determined prior to attending clinical practice to minimise risks and ensure the safety of all. © 2013 Blackwell Publishing Ltd.
Torres, Luisa; Yadav, Om Prakash; Khan, Eakalak
2017-02-01
A holistic risk assessment of surface water (SW) contamination due to lead-210 (Pb-210) in oil produced water (PW) from the Bakken Shale in North Dakota (ND) was conducted. Pb-210 is a relatively long-lived radionuclide and very mobile in water. Because of limited data on Pb-210, a simulation model was developed to determine its concentration based on its parent radium-226 and historical total dissolved solids levels in PW. Scenarios where PW spills could reach SW were analyzed by applying the four steps of the risk assessment process. These scenarios are: (1) storage tank overflow, (2) leakage in equipment, and (3) spills related to trucks used to transport PW. Furthermore, a survey was conducted in ND to quantify the risk perception of PW from different stakeholders. Findings from the study include a low probability of a PW spill reaching SW and simulated concentration of Pb-210 in drinking water higher than the recommended value established by the World Health Organization. Also, after including the results from the risk perception survey, the assessment indicates that the risk of contamination of the three scenarios evaluated is between medium-high to high. Copyright © 2016 Elsevier Ltd. All rights reserved.
Han, Bin; Liu, Yating; You, Yan; Xu, Jia; Zhou, Jian; Zhang, Jiefeng; Niu, Can; Zhang, Nan; He, Fei; Ding, Xiao; Bai, Zhipeng
2016-10-01
Assessment of the health risks resulting from exposure to ambient polycyclic aromatic hydrocarbons (PAHs) is limited by the lack of environmental exposure data among different subpopulations. To assess the cancer risk from exposure to particulate carcinogenic polycyclic aromatic hydrocarbon pollution for the elderly, this study conducted a personal exposure measurement campaign for particulate PAHs in a community of Tianjin, a city in northern China. Personal exposure samples were collected from the elderly in non-heating (August-September 2009) and heating (November-December 2009) periods, and 12 individual PAHs were analyzed for risk estimation. A questionnaire and a time-activity log were also completed for each person. The probabilistic risk assessment model was integrated with Toxic Equivalent Factors (TEFs). Considering that the estimation of the applied dose for a given air pollutant depends on the inhalation rate, inhalation rates from the EPA exposure factors handbook were applied to calculate the carcinogenic risk in this study. Monte Carlo simulation was used as a probabilistic risk assessment model, and risk simulation results indicated that the inhalation-ILCR values for both male and female subjects followed a lognormal distribution with means of 4.81 × 10^-6 and 4.57 × 10^-6, respectively. Furthermore, the 95% probability lung cancer risks were greater than the USEPA acceptable level of 10^-6 for both men and women through the inhalation route, revealing that exposure to PAHs posed an unacceptable potential cancer risk for the elderly in this study. As a result, some measures should be taken to reduce PAH pollution and exposure levels to decrease the cancer risk for the general population, especially for the elderly.
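A sketch of the TEF-weighted exposure and Monte Carlo ILCR calculation such a study typically uses is given below; the concentrations, exposure factors and slope factor are illustrative assumptions, and the TEFs loosely follow the common Nisbet & LaGoy scheme rather than the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative personal-exposure concentrations (ng/m3) and toxic equivalency factors
pahs = {  # (concentration ng/m3, TEF relative to benzo[a]pyrene) -- assumptions
    "benzo[a]pyrene":        (1.2, 1.0),
    "dibenz[a,h]anthracene": (0.3, 1.0),
    "benz[a]anthracene":     (1.8, 0.1),
    "chrysene":              (2.5, 0.01),
}
bap_eq = sum(c * tef for c, tef in pahs.values())   # ng/m3, BaP-equivalent concentration

# Monte Carlo ILCR for the elderly; exposure factors drawn from assumed distributions
n = 50_000
inh_rate = rng.normal(13.0, 2.0, n)     # m3/day
body_weight = rng.normal(62.0, 9.0, n)  # kg
exp_years = 15                          # assumed years of exposure in old age
slope_factor = 3.14                     # illustrative inhalation slope factor for BaP, (mg/kg-day)^-1

cdi = (bap_eq * 1e-6 * inh_rate * exp_years * 365) / (body_weight * 70 * 365)  # mg/kg-day
ilcr = cdi * slope_factor
print(f"Median ILCR = {np.median(ilcr):.2e}; P95 = {np.percentile(ilcr, 95):.2e}")
```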
Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Singh, Jitendra
2017-07-01
Landfilling is a cost-effective method, which makes it a widely used practice around the world, especially in developing countries. However, because of the improper management of landfills, high leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty based on estimations of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long-term risks by using a Monte Carlo simulation framework for selected heavy metals. The Turbhe sanitary landfill of Navi Mumbai, India, which was commissioned in the recent past, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. In this article, an integral approach in the form of a framework has been proposed to quantify the uncertainty that is intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses. © 2016 Society for Risk Analysis.
Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin
Bar-Massada, A.; Radeloff, V.C.; Stewart, S.I.; Hawbaker, T.J.
2009-01-01
The rapid growth of housing in and near the wildland-urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models: Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT) in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions. © 2009 Elsevier B.V.
How Confident can we be in Flood Risk Assessments?
NASA Astrophysics Data System (ADS)
Merz, B.
2017-12-01
Flood risk management should be based on risk analyses quantifying the risk and its reduction for different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and their impacts that have not been observed before. Hence, risk analyses are strongly based on assumptions and expert judgement. This situation opens the door for cognitive biases, such as 'illusion of certainty', 'overconfidence' or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments, and reflects on more rigorous approaches towards their validation.
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
A probabilistic seismic risk assessment procedure for nuclear power plants: (II) Application
Huang, Y.-N.; Whittaker, A.S.; Luco, N.
2011-01-01
This paper presents the procedures and results of intensity- and time-based seismic risk assessments of a sample nuclear power plant (NPP) to demonstrate the risk-assessment methodology proposed in its companion paper. The intensity-based assessments include three sets of sensitivity studies to identify the impact of the following factors on the seismic vulnerability of the sample NPP, namely: (1) the description of fragility curves for primary and secondary components of NPPs, (2) the number of simulations of NPP response required for risk assessment, and (3) the correlation in responses between NPP components. The time-based assessment is performed as a series of intensity-based assessments. The studies illustrate the utility of the response-based fragility curves and the inclusion of the correlation in the responses of NPP components directly in the risk computation. © 2011 Published by Elsevier B.V.
Lee, Jin-Jing; Jang, Cheng-Shin; Liang, Ching-Ping; Liu, Chen-Wuing
2008-09-15
This study spatially analyzed potential carcinogenic risks associated with ingesting arsenic (As) in aquacultural smeltfish (Plecoglossus altivelis) from the Lanyang Plain of northeastern Taiwan. Sequential indicator simulation (SIS) was adopted to reproduce As exposure distributions in groundwater based on their three-dimensional variability. A target cancer risk (TR) associated with ingesting As in aquacultural smeltfish was employed to evaluate the potential risk to human health. Probabilistic risk assessment based on Monte Carlo simulation and SIS was used to properly propagate parameter uncertainty. Safe and hazardous aquacultural regions were mapped to elucidate the safety of groundwater use. The TRs determined from the risks at the 95th percentiles exceed 10⁻⁶, indicating that ingesting smeltfish farmed in the highly As-affected regions represents a potential cancer threat to human health. The 95th percentile of TRs is considered in formulating a strategy for the aquacultural use of groundwater in the preliminary stage.
A multimodal assessment of driving performance in HIV infection.
Marcotte, T D; Wolfson, T; Rosenthal, T J; Heaton, R K; Gonzalez, R; Ellis, R J; Grant, I
2004-10-26
To examine if HIV-seropositive (HIV+) individuals are at risk for impaired driving. Sixty licensed drivers (40 HIV+, 20 HIV-) completed a neuropsychological (NP) test battery and driving assessments. Eleven HIV+ subjects were NP-impaired. Driving-related skills were assessed using 1) two driving simulations (examining accident avoidance and navigational abilities), 2) the Useful Field of View (UFOV) test, and 3) an on-road evaluation. HIV+ NP-impaired subjects had greater difficulty than cognitively intact subjects on all driving measures, whereas the HIV- and HIV+ NP-normal groups performed similarly. On the UFOV, the HIV+ NP-impaired group had worse performance on Visual Processing and Divided Attention tasks but not in overall risk classification. They also had a higher number of simulator accidents (1.3 vs 2.0; p = 0.03), were less efficient at completing the navigation task (3.2 vs 9.2 blocks; p = 0.001), and were more likely to fail the on-road evaluation (6 vs 36%; p = 0.02). Impairment in Executive Functioning was the strongest NP predictor of failing the on-road drive test. NP performance and both simulations independently contributed to a model predicting 48% of the variance in on-road performance. HIV+ NP-impaired individuals are at increased risk for on-road driving impairments, whereas HIV+ individuals with normal cognition are not at a significantly higher risk than HIV- subjects. Executive Functioning is most strongly associated with impaired on-road performance. Cognitive and simulator testing may each provide data in identifying driving-impaired individuals.
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry
2015-01-01
Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform the development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and to enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.
Risk assessment of storm surge disaster based on numerical models and remote sensing
NASA Astrophysics Data System (ADS)
Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu
2018-06-01
Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disaster for coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model and multi-source remote sensing data, proposes methods for evaluating storm surge hazard and vulnerability, and builds a storm surge risk assessment model. Storm surges with different recurrence periods are simulated with the numerical model, and the flooded areas and depths are calculated and used to assess the storm surge hazard; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use, which supports the vulnerability assessment of storm surge disaster. The storm surge risk assessment model is applied to a typical coastal city, and the results show the reliability and validity of the model. The development and application of the storm surge risk assessment model provide a basic reference for city development planning and strengthen disaster prevention and mitigation.
Tarafdar, Abhrajyoti; Sinha, Alok
2018-02-26
The total concentrations of 13 detected polycyclic aromatic hydrocarbons (PAHs) in different traffic soil samples of the Dhanbad heavy mining area, India, were between 8.256 and 12.562 µg/g and were dominated by four-ring PAHs (44%). A diagnostic ratio study revealed that fossil fuel burning and vehicular pollution are the most prominent sources of PAHs in roadside soil, even in a heavy coal mining area. The 90th percentile cancer risks determined by probabilistic health risk assessment (Monte Carlo simulations) for both age groups (children and adults) were above the tolerable limit (>1.00E-06) according to the USEPA. The simulated mean cancer risk was 1.854E-05 for children and 1.823E-05 for adults. Among the exposure pathways, dermal contact was observed to be the major pathway, with an exposure load of 74% for children and 85% for adults. Sensitivity analysis demonstrated that the relative skin adherence factor for soil (AF) is the most influential parameter of the simulation, followed by exposure duration (ED).
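A global sensitivity check of the kind mentioned above (identifying AF and ED as dominant inputs) can be sketched with Spearman rank correlations between Monte Carlo inputs and the simulated risk. The dose model and all distributions below are simplified assumptions for illustration, not the study's exposure model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Simplified dermal-contact inputs (all distributions are illustrative assumptions)
af = rng.lognormal(np.log(0.2), 0.5, N)   # skin adherence factor, mg/cm^2
ed = rng.uniform(6, 30, N)                # exposure duration, years
sa = rng.normal(5700, 500, N)             # exposed skin area, cm^2
bw = rng.normal(60, 8, N)                 # body weight, kg
c = rng.lognormal(np.log(10), 0.4, N)     # soil PAH concentration (BaP-eq), ug/g

risk = c * af * sa * ed / bw              # simplified, unnormalised risk metric

def spearman(x, y):
    """Spearman rank correlation (no tie handling; fine for continuous samples)."""
    rx = x.argsort().argsort()
    ry = y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

for name, x in [("AF", af), ("ED", ed), ("SA", sa), ("BW", bw), ("C", c)]:
    print(f"{name:>2}: rho = {spearman(x, risk):+.2f}")
```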
The US EPA is faced with long lists of chemicals that need to be assessed for hazard, and a gap in evaluating chemical risk is accounting for metabolic activation resulting in increased toxicity. The goals of this project are to develop a capability to predict metabolic maps of x...
2011-01-01
Background: Genetic risk models could potentially be useful in identifying high-risk groups for the prevention of complex diseases. We investigated the performance of this risk stratification strategy by examining epidemiological parameters that impact the predictive ability of risk models. Methods: We assessed sensitivity, specificity, and positive and negative predictive value for all possible risk thresholds that can define high-risk groups and investigated how these measures depend on the frequency of disease in the population, the frequency of the high-risk group, and the discriminative accuracy of the risk model, as assessed by the area under the receiver-operating characteristic curve (AUC). In a simulation study, we modeled genetic risk scores of 50 genes with equal odds ratios and genotype frequencies, and varied the odds ratios and the disease frequency across scenarios. We also performed a simulation of age-related macular degeneration risk prediction based on published odds ratios and frequencies for six genetic risk variants. Results: We show that when the frequency of the high-risk group was lower than the disease frequency, positive predictive value increased with the AUC but sensitivity remained low. When the frequency of the high-risk group was higher than the disease frequency, sensitivity was high but positive predictive value remained low. When both frequencies were equal, both positive predictive value and sensitivity increased with increasing AUC, but higher AUC was needed to maximize both measures. Conclusions: The performance of risk stratification is strongly determined by the frequency of the high-risk group relative to the frequency of disease in the population. The identification of high-risk groups with appreciable combinations of sensitivity and positive predictive value requires higher AUC. PMID:21797996
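The dependence of sensitivity and positive predictive value on the relative sizes of the high-risk group and the disease frequency can be reproduced with a small simulation along the lines described above. The 50-gene score, odds ratio, allele frequency and logistic disease model below are illustrative choices, not the study's exact set-up.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_genes = 200_000, 50
maf, or_per_allele = 0.30, 1.2   # assumed risk-allele frequency and per-allele OR
disease_freq = 0.10
high_risk_frac = 0.10            # frequency of the designated high-risk group

g = rng.binomial(2, maf, size=(n, n_genes))       # risk-allele counts per person
score = g.sum(axis=1) * np.log(or_per_allele)     # additive log-odds genetic score

# Calibrate the logistic intercept by bisection so the simulated population
# disease frequency matches the target.
lo, hi = -20.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if (1.0 / (1.0 + np.exp(-(mid + score)))).mean() > disease_freq:
        hi = mid
    else:
        lo = mid
p_disease = 1.0 / (1.0 + np.exp(-(0.5 * (lo + hi) + score)))
disease = rng.random(n) < p_disease

# Define the high-risk group as the top fraction of the score distribution
high_risk = score >= np.quantile(score, 1.0 - high_risk_frac)
sensitivity = (high_risk & disease).sum() / disease.sum()
ppv = (high_risk & disease).sum() / high_risk.sum()
print(f"sensitivity = {sensitivity:.2f}, PPV = {ppv:.2f}")
```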
Jabor, A; Vlk, T; Boril, P
1996-04-15
We designed a simulation model for assessing the financial risks involved when a new diagnostic test is introduced in the laboratory. The model is based on a neural network consisting of ten neurons and assumes that input entities can be assigned appropriate uncertainty. Simulations are done on a 1-day interval basis. Risk analysis completes the model, and the financial effects are evaluated for a selected time period. The basic output of the simulation consists of total expenses and income during the simulation time, the net present value of the project at the end of the simulation, the total number of control samples during the simulation, the total number of patients evaluated and the total number of kits used.
A Risk-Based Framework for Assessing the Effectiveness of Stratospheric Aerosol Geoengineering
Ferraro, Angus J.; Charlton-Perez, Andrew J.; Highwood, Eleanor J.
2014-01-01
Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth's surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth's surface area, containing 36% of its population and 60% of its gross domestic product. PMID:24533155
As decentralized water reuse continues to gain popularity, risk-based treatment guidance is increasingly sought for the protection of public health. However, efforts to evaluate pathogen risks and log-reduction requirements have been hindered by an incomplete understanding of pat...
NASA Astrophysics Data System (ADS)
Chao, Y.; Cheng, C. T.; Hsiao, Y. H.; Hsu, C. T.; Yeh, K. C.; Liu, P. L.
2017-12-01
On average, 5.3 typhoons hit Taiwan per year in the last decade. Typhoon Morakot in 2009, the most severe of these, caused huge damage in Taiwan, including 677 casualties and roughly NT$110 billion (3.3 billion USD) in economic losses. Some studies have documented that typhoon frequency will decrease but typhoon intensity will increase in the western North Pacific region. High-resolution dynamical models are usually preferred for projecting extreme events, because coarse-resolution models cannot simulate intense extreme events. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily; this research used simulation data from the atmospheric general circulation model of the Meteorological Research Institute (MRI-AGCM). Because dynamical downscaling consumes massive computing power and the number of typhoons in a single model simulation is very limited, using dynamically downscaled data could introduce uncertainty into disaster risk assessment. To address this problem, this research used four sea surface temperatures (SSTs) to increase the number of climate change scenarios under RCP 8.5. In this way, the MRI-AGCMs project 191 extreme typhoons affecting Taiwan (when the typhoon center enters the 300 km sea area around Taiwan) in the late 21st century. SOBEK, a two-dimensional flood simulation model, was used to assess the flood risk under the four SST climate change scenarios in Tainan, Taiwan. The results show that the uncertainty of future flood risk assessment for Tainan, Taiwan in the late 21st century is significantly decreased. The four SSTs could effectively mitigate the problem of limited typhoon numbers in a single model simulation.
Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley
2010-01-01
Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent with the observed extent taken from airborne pictures), the uncertainty of the essential input data set (digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and the monetary transfer functions required for a damage risk analysis show high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
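A stripped-down version of the DEM-uncertainty Monte Carlo described above can be sketched as follows: each realization perturbs a synthetic DEM with an assumed vertical error and re-evaluates the flooded area, so per-cell flooding probabilities emerge from the ensemble. The DEM, the water level and the error magnitude are all invented for illustration and are not the FLOODMAP inputs.

```python
import numpy as np

rng = np.random.default_rng(7)
ny, nx = 100, 100

# Synthetic DEM (m): gentle slope rising away from a river along the left edge
x = np.linspace(0, 1, nx)
dem = 50.0 + 5.0 * x[None, :] + rng.normal(0, 0.2, (ny, nx))

water_level = 51.5   # assumed flood water surface elevation (m)
sigma_dem = 0.5      # assumed DEM vertical error (m)
n_runs = 500

flood_count = np.zeros((ny, nx))
for _ in range(n_runs):
    dem_pert = dem + rng.normal(0, sigma_dem, dem.shape)  # perturbed DEM realization
    flood_count += (dem_pert < water_level)

flood_prob = flood_count / n_runs
print("cells flooded with p > 0.5:", int((flood_prob > 0.5).sum()))
print("uncertain cells (0.05 < p < 0.95):",
      int(((flood_prob > 0.05) & (flood_prob < 0.95)).sum()))
```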
Risk/benefit assessment of delayed action concept for rail inspection
DOT National Transportation Integrated Search
1999-02-01
A Monte Carlo simulation of certain aspects of rail inspection is presented. The simulation is used to investigate alternative practices in railroad rail inspection programs. Results are presented to compare the present practice of immediately repair...
Jessica R. Haas; David E. Calkin; Matthew P. Thompson
2013-01-01
Ongoing human development into fire-prone areas contributes to increasing wildfire risk to human life. It is critically important, therefore, to have the ability to characterize wildfire risk to populated places, and to identify geographic areas with relatively high risk. A fundamental component of wildfire risk analysis is establishing the likelihood of wildfire...
Consumer phase risk assessment for Listeria monocytogenes in deli meats.
Yang, Hong; Mokhtari, Amirhossein; Jaykus, Lee-Ann; Morales, Roberta A; Cates, Sheryl C; Cowen, Peter
2006-02-01
The foodborne disease risk associated with the pathogen Listeria monocytogenes has been the subject of recent efforts in quantitative microbial risk assessment. Building upon one of these efforts undertaken jointly by the U.S. Food and Drug Administration and the U.S. Department of Agriculture (USDA), the purpose of this work was to expand on the consumer phase of the risk assessment to focus on handling practices in the home. One-dimensional Monte Carlo simulation was used to model variability in growth and cross-contamination of L. monocytogenes during food storage and preparation of deli meats. Simulations approximated that 0.3% of the servings were contaminated with >10⁴ CFU/g of L. monocytogenes at the time of consumption. The estimated mean risk associated with the consumption of deli meats for the intermediate-age population was approximately 7 deaths per 10¹¹ servings. Food handling in homes increased the estimated mean mortality by 10⁶-fold. Of all the home food-handling practices modeled, inadequate storage, particularly refrigeration temperatures, provided the greatest contribution to increased risk. The impact of cross-contamination in the home was considerably less. Adherence to USDA Food Safety and Inspection Service recommendations for consumer handling of ready-to-eat foods substantially reduces the risk of listeriosis.
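The storage-temperature effect singled out above can be illustrated with a small Monte Carlo of Listeria growth during home refrigeration. The initial contamination, temperature and time distributions and the growth-rate expression below are rough assumptions for demonstration, not the parameters of this consumer-phase model or of the FDA/USDA assessment.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

log_n0 = rng.normal(-3.0, 1.0, N)                    # initial level, log10 CFU/g (assumed)
temp = np.clip(rng.normal(5.5, 2.0, N), 0.0, 15.0)   # fridge temperature, deg C (assumed)
days = np.clip(rng.lognormal(np.log(4.0), 0.6, N), 0.0, 30.0)  # storage time, days (assumed)

# Simple temperature-dependent growth rate in log10 CFU/g per day (illustrative)
t_min = -1.18                                        # nominal minimum growth temperature
mu = 1.5e-3 * np.maximum(temp - t_min, 0.0) ** 2

log_nt = np.minimum(log_n0 + mu * days, 8.0)         # cap at an assumed maximum density

print(f"servings above 1e4 CFU/g at consumption: {(log_nt > 4.0).mean():.3%}")
print(f"median growth during storage: {np.median(mu * days):.2f} log10")
```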
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Hooper, Russell
2016-11-01
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility.
Mendes, Bruno Melo; Trindade, Bruno Machado; Fonseca, Telma Cristina Ferreira; de Campos, Tarcisio Passos Ribeiro
2017-12-01
The aim of this work was to simulate 6 MV conventional breast 3D conformal radiation therapy (3D-CRT) with physical wedges (50 Gy/25#) in the left breast, calculate the mean absorbed dose in the body organs using robust models and computational tools, and estimate the secondary cancer-incidence risk for the Brazilian population. The VW female phantom was used in the simulations. The planning target volume (PTV) was defined in the left breast. The 6 MV parallel-opposed-fields breast radiotherapy (RT) protocol was simulated with the MCNPX code. The absorbed doses were evaluated in all organs. The secondary cancer-incidence risk induced by radiotherapy was calculated for different age groups according to the BEIR VII methodology. RT quality indexes indicated that the protocol was properly simulated. Significant absorbed dose values in the red bone marrow, RBM (0.8 Gy), and stomach (0.6 Gy) were observed. The contralateral breast presented the highest risk of incidence of a secondary cancer, followed by leukaemia, lung and stomach. The risk of secondary cancer incidence from breast RT for the Brazilian population ranged between 2.2-1.7% and 0.6-0.4%. RBM and stomach, usually not considered as OARs, presented high second cancer incidence risks of 0.5-0.3% and 0.4-0.1%, respectively. This study may be helpful for breast-RT risk/benefit assessment. Advances in knowledge: MCNPX dosimetry was able to provide the scattered radiation and dose for all body organs in conventional breast RT. A relevant risk of up to 2.2% of induced cancer from breast RT was found, considering the whole-thorax organs and Brazilian cancer incidence.
NASA Astrophysics Data System (ADS)
Liu, Luyao; Feng, Minquan
2018-03-01
[Objective] This study quantitatively evaluated the risk probabilities of sudden water pollution accidents under the influence of risk sources, thus providing an important guarantee for risk source identification during water diversion from the Hanjiang River to the Weihe River. [Methods] The research used Bayesian networks to represent the correlation between accidental risk sources. It also adopted the sequential Monte Carlo algorithm to combine water quality simulation with state simulation of risk sources, thereby determining the standard-exceeding probabilities of sudden water pollution accidents. [Results] When the upstream inflow was 138.15 m³/s and the average accident duration was 48 h, the probabilities were 0.0416 and 0.0056, respectively. When the upstream inflow was 55.29 m³/s and the average accident duration was 48 h, the probabilities were 0.0225 and 0.0028, respectively. [Conclusions] The research conducted a risk assessment of sudden water pollution accidents, thereby providing an important guarantee for the smooth implementation, operation, and water quality of the Hanjiang-to-Weihe River Diversion Project.
Probabilistic Risk Assessment for Bone Fracture - Bone Fracture Risk Module (BFxRM)
NASA Technical Reports Server (NTRS)
Licata, Angelo; Myers, Jerry G.; Lewandowski, Beth
2013-01-01
This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM). The overview includes an assessment of strengths and limitations of the BFxRM and proposes a number of discussion questions to the panel regarding future development avenues for this simulation system.
Large-scale derived flood frequency analysis based on continuous simulation
NASA Astrophysics Data System (ADS)
Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno
2016-04-01
There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km²), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input to the catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained with 53 years of observational data from 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial extent of 443,931 km². 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Danube and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes several drawbacks reported for traditional derived flood frequency analysis approaches and is therefore recommended for large-scale flood risk case studies.
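The derived flood frequency step at the end of such a model chain reduces to extracting annual maxima from the long synthetic discharge series and reading off empirical quantiles. The toy discharge generator below stands in for the weather-generator-plus-SWIM chain and uses invented numbers purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
years, days = 10_000, 365

# Toy synthetic daily discharge (m^3/s): seasonal baseflow + lognormal storm runoff
t = np.arange(days)
season = 80 + 40 * np.sin(2 * np.pi * (t - 60) / 365)
q = season[None, :] + rng.lognormal(mean=3.0, sigma=1.0, size=(years, days))

annual_max = q.max(axis=1)
annual_max.sort()

# Empirical return periods from Weibull plotting positions: T = (n + 1) / rank-from-top
n = len(annual_max)
ranks_from_top = np.arange(n, 0, -1)
return_period = (n + 1) / ranks_from_top

for T in (10, 100, 1000):
    qT = np.interp(T, return_period, annual_max)
    print(f"{T:>5}-year flood ~ {qT:7.1f} m^3/s")
```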
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... applications and assessment of site specific, generic, and process-oriented multimedia environmental models as... development and simulation supports interagency interests in risk assessment, uncertainty analyses, management...
Climate change vulnerability for species-Assessing the assessments.
Wheatley, Christopher J; Beale, Colin M; Bradbury, Richard B; Pearce-Higgins, James W; Critchlow, Rob; Thomas, Chris D
2017-09-01
Climate change vulnerability assessments are commonly used to identify species at risk from global climate change, but the wide range of methodologies available makes it difficult for end users, such as conservation practitioners or policymakers, to decide which method to use as a basis for decision-making. In this study, we evaluate whether different assessments consistently assign species to the same risk categories and whether any of the existing methodologies perform well at identifying climate-threatened species. We compare the outputs of 12 climate change vulnerability assessment methodologies, using both real and simulated species, and validate the methods using historic data for British birds and butterflies (i.e. using historical data to assign risks and more recent data for validation). Our results show that the different vulnerability assessment methods are not consistent with one another; different risk categories are assigned for both the real and simulated sets of species. Validation of the different vulnerability assessments suggests that methods incorporating historic trend data into the assessment perform best at predicting distribution trends in subsequent time periods. This study demonstrates that climate change vulnerability assessments should not be used interchangeably due to the poor overall agreement between methods when considering the same species. The results of our validation provide more support for the use of trend-based rather than purely trait-based approaches, although further validation will be required as data become available. © 2017 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
Simulation technology used for risk assessment in a deep exploration project in China
NASA Astrophysics Data System (ADS)
Jiao, J.; Huang, D.; Liu, J.
2013-12-01
Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex systems-engineering effort that crosses multiple disciplines and requires great investment. It is necessary to employ advanced technical means for the verification, appraisal and optimization of geophysical prospecting equipment development. To reduce the risk of application and exploration, efficient management concepts and skills have to be enhanced in order to consolidate management measures and workflows that benefit this ambitious project. Therefore, evidence, prediction, evaluation and related decision strategies have to be taken into account simultaneously to meet practical scientific requirements, technical limits and possible extensions. Simulation is then proposed as a tool for carrying out dynamic tests on actual or imagined systems. In practice, it is necessary to combine simulation techniques with the instruments and equipment to accomplish R&D tasks. In this paper, simulation techniques are introduced into the R&D process of heavy-duty equipment and high-end engineering project technology. Based on the information provided by a drilling group recently, a digital model is constructed by combining geographical data, 3D visualization, database management and virtual reality technologies. This supports an R&D strategy in which data processing, instrument application, expected results and uncertainty, and even the effects of the operational workflow and environment, are simulated systematically or simultaneously in order to obtain an optimal outcome as well as an equipment updating strategy. The simulation technology is able to adjust, verify, appraise and optimize the primary plan in response to changes in the real world or process, which can provide new insight into the equipment to meet requests arising from the application and construction process, facilitated by direct perception and understanding of the installation, debugging and experimental processes of key equipment for deep exploration. Finally, the objective of project cost conservation and risk reduction can be reasonably approached. Risk assessment can be used to quantitatively evaluate the possible degree of impact. During the research and development stage, information from the installation, debugging and simulation demonstration of the experimental process of key instruments and equipment is used to evaluate the fatigue and safety of the devices. This requires a full understanding of the controllable and uncontrollable risk factors during the process, and then adjusting and improving the unsafe risk factors in the risk assessment and prediction. In combination with professional geoscience software to process and interpret the environment and obtain evaluation parameters, simulation modelling comes closer to the exploration target, which requires more detailed evaluations. From combined micro and macro perspectives, safety and risk assessment can be achieved to reduce the risk of equipment development and avoid unnecessary losses along the way.
Coastal Tsunami and Risk Assessment for Eastern Mediterranean Countries
NASA Astrophysics Data System (ADS)
Kentel, E.; Yavuz, C.
2017-12-01
Tsunamis are rarely experienced events that have enormous potential to cause large economic destruction to critical infrastructure and facilities, social devastation due to mass casualties, and adverse environmental effects such as erosion, accumulation and inundation. Over the past two decades in particular, nations have encountered devastating tsunami events. The aim of this study is to investigate risks along the Mediterranean coastline due to probable tsunamis based on simulations using reliable historical data. In order to do this, 50 Critical Regions, CRs, (i.e. city centers, agricultural areas and summer villages) and 43 Critical Infrastructures, CIs, (i.e. airports, ports & marinas and industrial structures) are identified to perform a people-centered risk assessment along the Eastern Mediterranean region covering 7 countries. These countries are Turkey, Syria, Lebanon, Israel, Egypt, Cyprus, and Libya. The bathymetry of the region is given in Figure 1. In this study, NAMI-DANCE is used to carry out the tsunami simulations. The source of a sample tsunami simulation and the maximum wave propagation in the study area for this sample tsunami are given in Figures 2 and 3, respectively. Richter magnitude, focal depth, time of occurrence in a day, and season are considered as the independent parameters of the earthquake. Historical earthquakes are used to generate reliable probability distributions for these parameters. Monte Carlo (MC) simulations are carried out to evaluate overall risks along the coastline. Inundation level, population density, number of passengers or employees, literacy rate, annual income level and human presence are used in the risk estimations. Within each MC simulation and for each grid cell in the study area, the people-centered tsunami risk is calculated for each of the following elements at risk: (i) city centers, (ii) agricultural areas, (iii) summer villages, (iv) ports and marinas, (v) airports, and (vi) industrial structures. Risk levels at each grid cell along the shoreline are calculated based on the factors given above, grouped into low, medium and high risk, and used in generating the risk map. The risk map will be useful in prioritizing areas that require the development of tsunami mitigation measures.
USDA-ARS?s Scientific Manuscript database
The phosphorus (P) Index (PI) is the risk assessment tool approved in the NRCS 590 standard used to target critical source areas and practices to reduce P losses. A revision of the 590 standard, suggested using the Agricultural Policy/Environmental eXtender (APEX) model to assess the risk of nitroge...
NASA Astrophysics Data System (ADS)
Acierto, R. A. E.; Kawasaki, A.
2017-12-01
Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and the potential for climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying such disaster risk under both current and future conditions. Downscaling using regional climate models (RCMs) such as the Weather Research and Forecasting model has been used extensively for assessing such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulating rainfall. Convection-permitting simulations have been used to deal with this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the six-year (2010-2015) wet season from May to September in Myanmar. The investigation primarily utilizes rain gauge observations to compare downscaled heavy rainfall events at 4 km resolution, using ERA-Interim as boundary conditions and a 12 km-4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar in order to contribute to flood hazard and risk assessment.
Ares-I-X Vehicle Preliminary Range Safety Malfunction Turn Analysis
NASA Technical Reports Server (NTRS)
Beaty, James R.; Starr, Brett R.; Gowan, John W., Jr.
2008-01-01
Ares-I-X is the designation given to the flight test version of the Ares-I rocket (also known as the Crew Launch Vehicle - CLV) being developed by NASA. As part of the preliminary flight plan approval process for the test vehicle, a range safety malfunction turn analysis was performed to support the launch area risk assessment and vehicle destruct criteria development processes. Several vehicle failure scenarios were identified which could cause the vehicle trajectory to deviate from its normal flight path, and the effects of these failures were evaluated with an Ares-I-X 6 degrees-of-freedom (6-DOF) digital simulation, using the Program to Optimize Simulated Trajectories Version 2 (POST2) simulation framework. The Ares-I-X simulation analysis provides output files containing vehicle state information, which are used by other risk assessment and vehicle debris trajectory simulation tools to determine the risk to personnel and facilities in the vicinity of the launch area at Kennedy Space Center (KSC), and to develop the vehicle destruct criteria used by the flight test range safety officer. The simulation analysis approach used for this study is described, including descriptions of the failure modes which were considered and the underlying assumptions and ground rules of the study, and preliminary results are presented, determined by analysis of the trajectory deviation of the failure cases, compared with the expected vehicle trajectory.
COMPUTATIONAL TOXICOLOGY: AN APPROACH FOR PRIORITIZING CHEMICAL RISK ASSESSMENTS
Characterizing toxic effects for industrial chemicals carries the challenge of focusing resources on the greatest potential risks for human health and the environment. The union of molecular modeling, bioinformatics and simulation of complex systems with emerging technologies suc...
Innovative neuro-fuzzy system of smart transport infrastructure for road traffic safety
NASA Astrophysics Data System (ADS)
Beinarovica, Anna; Gorobetz, Mikhail; Levchenkov, Anatoly
2017-09-01
The proposed study describes the application of neural networks and fuzzy logic in transport control to improve safety through the evaluation of accident risk by intelligent infrastructure devices. Risk evaluation is based on multiple criteria: danger, changeability, and the influence of changes on risk increase. Neuro-fuzzy algorithms are described and proposed for solving the task. The novelty of the proposed system is supported by a thorough analysis of known studies in the field. The structure of the neuro-fuzzy system for risk evaluation and its mathematical model are described in the paper. A simulation model of the intelligent devices for transport infrastructure is proposed to simulate different situations, assess the risks and propose possible actions for infrastructure or vehicles to minimize the risk of possible accidents.
Network Security Risk Assessment System Based on Attack Graph and Markov Chain
NASA Astrophysics Data System (ADS)
Sun, Fuxiong; Pi, Juntao; Lv, Jin; Cao, Tian
2017-10-01
Network security risk assessment technology can identify network problems and related vulnerabilities in advance, and it has become an important means of addressing network security. Based on attack graphs and Markov chains, this paper provides a Network Security Risk Assessment Model (NSRAM). Based on network penetration tests, NSRAM generates the attack graph using a breadth-first traversal algorithm. Combined with the international CVSS standard, the attack probabilities of atomic nodes are computed, and their attack transition probabilities are then calculated with a Markov chain. NSRAM selects the optimal attack path after comprehensive measurement to assess network security risk. The simulation results show that NSRAM can reflect the actual situation of network security objectively.
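A toy version of the attack-graph-plus-Markov-chain idea can be sketched as follows: per-node exploit probabilities (standing in for CVSS-derived scores) are normalised into transition probabilities, and paths are enumerated to find the most probable attack path. The graph, the scores and the node names are hypothetical, not NSRAM's actual inputs.

```python
import numpy as np

# Toy attack graph: node -> successors, with assumed per-node exploit
# probabilities derived from CVSS-like scores (score / 10); illustrative only.
succ = {"entry": ["web", "vpn"], "web": ["db"], "vpn": ["db"], "db": ["goal"], "goal": []}
p_exploit = {"entry": 1.0, "web": 0.75, "vpn": 0.45, "db": 0.60, "goal": 0.90}

def transition_probs(node):
    """Markov-chain transitions: successor exploit probabilities, normalised."""
    nxt = succ[node]
    if not nxt:
        return {}
    w = np.array([p_exploit[n] for n in nxt], dtype=float)
    return dict(zip(nxt, w / w.sum()))

def enumerate_paths(node, path=("entry",), prob=1.0):
    """Depth-first enumeration of attack paths with cumulative probability."""
    if not succ[node]:
        yield path, prob
        return
    for nxt, p in transition_probs(node).items():
        yield from enumerate_paths(nxt, path + (nxt,), prob * p * p_exploit[nxt])

paths = sorted(enumerate_paths("entry"), key=lambda x: -x[1])
for path, prob in paths:
    print(" -> ".join(path), f"p = {prob:.3f}")
print("most probable attack path:", " -> ".join(paths[0][0]))
```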
Pawar, Rajesh; Bromhal, Grant; Carroll, Susan; ...
2014-12-31
Risk assessment for geologic CO₂ storage including quantification of risks is an area of active investigation. The National Risk Assessment Partnership (NRAP) is a US Department of Energy (US-DOE) effort focused on developing a defensible, science-based methodology and platform for quantifying risk profiles at geologic CO₂ sequestration sites. NRAP has been developing a methodology that centers around development of an integrated assessment model (IAM) using a system modeling approach to quantify risks and risk profiles. The IAM has been used to calculate risk profiles with a few key potential impacts due to potential CO₂ and brine leakage. The simulation results are also used to determine long-term storage security relationships and compare the long-term storage effectiveness to the IPCC storage permanence goal. Additionally, we also demonstrate application of the IAM for uncertainty quantification in order to determine the parameters to which the uncertainty in model results is most sensitive.
Guo, Lei; Li, Zhengyan; Gao, Pei; Hu, Hong; Gibson, Mark
2015-11-01
Bisphenol A (BPA) occurs widely in natural waters and exhibits both traditional and reproductive toxicity to various aquatic species. Water quality criteria (WQC), however, have not been established in China, which hinders ecological risk assessment for the pollutant. This study therefore aims to derive water quality criteria for BPA based on both acute and chronic toxicity endpoints and to assess the ecological risk in surface waters of China. A total of 15 acute toxicity values tested with aquatic species resident in China were found in the published literature and were fitted with the species sensitivity distribution (SSD) model for the derivation of the criterion maximum concentration (CMC). Eighteen chronic toxicity values with traditional endpoints were fitted for the derivation of the traditional criterion continuous concentration (CCC), and 12 chronic toxicity values with reproductive endpoints were fitted for the reproductive CCC. Based on the derived WQC, the ecological risk of BPA in surface waters of China was assessed with the risk quotient (RQ) method. The results showed that the CMC, traditional CCC and reproductive CCC were 1518 μg L⁻¹, 2.19 μg L⁻¹ and 0.86 μg L⁻¹, respectively. The acute risk of BPA was negligible, with RQ values much lower than 0.1. The chronic risk was, however, much higher, with RQ values of 0.01-3.76 and 0.03-9.57 based on the traditional and reproductive CCC, respectively. The chronic RQ values for reproductive endpoints were about three times as high as those for traditional endpoints, indicating that ecological risk assessment based on traditional effects may not guarantee the safety of aquatic biota. Copyright © 2015 Elsevier Ltd. All rights reserved.
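The SSD-based derivation of a CMC can be illustrated with a log-normal fit to acute toxicity data. The toxicity values below are invented, and the factor of 2 applied to the HC5 is one common convention rather than necessarily the procedure used in this study.

```python
import numpy as np

# Hypothetical acute toxicity values (LC50/EC50, ug/L) for resident species --
# illustrative numbers only, not the values compiled in the study.
acute = np.array([1200, 2500, 3400, 4600, 5200, 6800, 7400, 9100,
                  10500, 12800, 15600, 20100, 26400, 33000, 41000], dtype=float)

# Fit a log-normal species sensitivity distribution (SSD)
logx = np.log10(acute)
mu, sigma = logx.mean(), logx.std(ddof=1)

# HC5: concentration protecting 95% of species (5th percentile of the SSD);
# z(0.05) = -1.645 for the standard normal distribution.
hc5 = 10 ** (mu - 1.645 * sigma)

# One common convention applies an assessment factor of 2 to obtain the CMC.
cmc = hc5 / 2.0
print(f"HC5 = {hc5:.0f} ug/L, CMC ~ {cmc:.0f} ug/L")
```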
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
... assessments of site specific, generic, and process-oriented multimedia environmental models as they pertain to human and environmental health risk assessment. Multimedia model development and simulation supports...
The development of a simulation model of primary prevention strategies for coronary heart disease.
Babad, Hannah; Sanderson, Colin; Naidoo, Bhash; White, Ian; Wang, Duolao
2002-11-01
This paper describes the present state of development of a discrete-event micro-simulation model for coronary heart disease prevention. The model is intended to support health policy makers in assessing the impacts on health care resources of different primary prevention strategies. For each person, a set of times to disease events, conditional on the individual's risk factor profile, is sampled from a set of probability distributions that are derived from a new analysis of the Framingham cohort study on coronary heart disease. Methods used to model changes in behavioural and physiological risk factors are discussed and a description of the simulation logic is given. The model incorporates POST (Patient Oriented Simulation Technique) simulation routines.
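The core sampling step of such a discrete-event micro-simulation, drawing an individual's time to a first coronary event from risk-factor-conditional distributions and scheduling it on an event queue, can be sketched as below. The Weibull proportional-hazards form and all coefficients are illustrative placeholders, not the Framingham-derived distributions used by the model.

```python
import heapq
import numpy as np

rng = np.random.default_rng(5)

def sample_time_to_chd(age, sbp, smoker, chol, horizon=30.0):
    """Sample years to a first CHD event from a Weibull proportional-hazards model.
    Coefficients are illustrative placeholders, not Framingham estimates."""
    lp = 0.03 * (age - 50) + 0.015 * (sbp - 120) + 0.5 * smoker + 0.3 * (chol - 5.0)
    scale, shape = 60.0 * np.exp(-lp), 1.5       # assumed baseline hazard
    t = scale * rng.weibull(shape)
    return t if t < horizon else None            # None = no event within the horizon

# Minimal discrete-event loop over a small synthetic cohort
events = []
for pid in range(5):
    profile = dict(age=rng.integers(40, 70), sbp=rng.normal(135, 15),
                   smoker=rng.integers(0, 2), chol=rng.normal(5.5, 1.0))
    t = sample_time_to_chd(**profile)
    if t is not None:
        heapq.heappush(events, (t, pid, "first_CHD_event"))

while events:
    t, pid, kind = heapq.heappop(events)
    print(f"t = {t:5.1f} y: person {pid} -> {kind}")
```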
Research in Modeling and Simulation for Airspace Systems Innovation
NASA Technical Reports Server (NTRS)
Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.
2007-01-01
This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.
Risk assessment of flood disaster and forewarning model at different spatial-temporal scales
NASA Astrophysics Data System (ADS)
Zhao, Jun; Jin, Juliang; Xu, Jinchao; Guo, Qizhong; Hang, Qingfeng; Chen, Yaqian
2018-05-01
Aiming at reducing losses from flood disasters, a risk assessment and forewarning model for flood disaster is studied. The model is built upon risk indices in the flood disaster system, proceeding from the whole structure and its parts at different spatial-temporal scales. On the one hand, this study establishes the long-term forewarning model for the surface area, with three levels of prediction, evaluation, and forewarning. A structure-adaptive back-propagation neural network with peak identification is used to simulate the indices in the prediction sub-model. Set pair analysis is employed to calculate the connection degrees of a single index, the comprehensive index, and the systematic risk through the multivariate connection number, and the comprehensive assessment is made with assessment matrices in the evaluation sub-model. A comparative judgment method is adopted in the forewarning sub-model to classify the warning degree of flood disaster from the comprehensive risk assessment index against forewarning standards, and the long-term local conditions are then used for proposing planning schemes. On the other hand, the study sets up the real-time forewarning model for specific sites, which introduces a real-time Kalman filter correction technique based on a hydrological model with a forewarning index, and the real-time local conditions are then used for presenting an emergency plan. This study takes the Tunxi area, Huangshan City, China, as an example. After establishing and applying the risk assessment and forewarning model for flood disaster at different spatial-temporal scales with actual and simulated data from 1989 to 2008, the forewarning results show that the development trend of flood disaster risk declines on the whole from 2009 to 2013, despite a rise in 2011. At the macroscopic level, project and non-project measures are advanced, while at the microscopic level, the time, place, and method are listed. This suggests that the proposed model is feasible in theory and application, thus offering a way to assess and forewarn of flood disaster risk.
Technology Development Risk Assessment for Space Transportation Systems
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Godsell, Aga M.; Go, Susie
2006-01-01
A new approach for assessing development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach, a single rocket engine development and a three-technology space transportation portfolio.
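The stage-wise Monte Carlo described above can be sketched by treating each discrete development stage as a geometric waiting time and summing across stages. The per-stage advancement probabilities and the deadline are invented values, not the sector-specific statistics used in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

# Assumed per-year probabilities of advancing through each discrete development
# stage (illustrative values, not the paper's sector-specific statistics)
stage_advance_p = [0.6, 0.5, 0.4, 0.35]
deadline_years = 10
n_runs = 100_000

# Each stage's duration is a geometric waiting time (years until first success);
# total development time is the sum over stages, sampled for every Monte Carlo run.
durations = sum(rng.geometric(p, size=n_runs) for p in stage_advance_p)

risk = (durations > deadline_years).mean()
print(f"median development time: {np.median(durations):.0f} years")
print(f"P(not ready within {deadline_years} years) = {risk:.2%}")
```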
Responses of a constructed plant community to simulated glyphosate and dicamba drift
Background/Questions/Methods As part of its regulation of pesticides, the US Environmental Protection Agency must consider environmental risks, including impacts to nontarget plants exposed to pesticide drift. Normally these risk assessments consider impacts to individual spec...
NASA Astrophysics Data System (ADS)
Takarada, S.
2012-12-01
The first Workshop of the Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption records, which only represent a subset of possible future scenarios. Hence, distributions different from the previous deposits are commonly observed due to differences in vent position, volume, eruption rate, wind direction and topography. Therefore, numerical simulations with controlled parameters are needed for more precise volcanic eruption predictions. The next-generation system should enable the visualization of past volcanic eruption datasets, such as distributions, eruption volumes and eruption rates, on maps and diagrams using timeline and GIS technology. Similar volcanic eruption scenarios should be easily searchable from the eruption database. Using the volcano hazard assessment system, prediction of the time and area that would be affected by volcanic eruptions at any location near the volcano should be possible using numerical simulations. The system should estimate volcanic hazard risks by overlaying the distributions of volcanic deposits on major roads, houses and evacuation areas using a GIS-enabled system. Probabilistic volcanic hazard maps for active volcano sites should be made based on numerous numerical simulations. The next-generation real-time hazard assessment system would be implemented with a user-friendly interface, making the risk assessment system easily usable and accessible online.
Matthew P. Thompson; Joe Scott; Paul G. Langowski; Julie W. Gilbertson-Day; Jessica R. Haas; Elise M. Bowne
2013-01-01
Wildfires can cause significant negative impacts to water quality with resultant consequences for the environment and human health and safety, as well as incurring substantial rehabilitation and water treatment costs. In this paper we will illustrate how state-of-the-art wildfire simulation modeling and geospatial risk assessment methods can be brought to bear to...
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
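As a rough illustration of the weighted-sum flavour of MCE that such a GIS typology can accommodate, the following sketch combines normalized raster criteria with assumed weights; the layer names, weights, and grid are invented, not the paper's data.

```python
import numpy as np

# Minimal sketch of a weighted-sum multicriteria evaluation (MCE) over raster
# layers. Each layer is assumed already normalized to 0-1; the weighted sum
# gives a simple composite risk surface for decision-making.
rng = np.random.default_rng(0)
shape = (50, 50)
criteria = {
    "fuel_hazard":   rng.random(shape),   # normalized hazard layer (assumed)
    "slope":         rng.random(shape),   # normalized slope factor (assumed)
    "house_density": rng.random(shape),   # normalized exposure layer (assumed)
}
weights = {"fuel_hazard": 0.5, "slope": 0.2, "house_density": 0.3}  # assumed weights

risk_surface = sum(weights[k] * criteria[k] for k in criteria)
print("cells in highest-risk decile:", int((risk_surface > np.quantile(risk_surface, 0.9)).sum()))
```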
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.
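A hedged sketch of the probabilistic core of such an assessment: sample an uncertain blast overpressure at the capsule and compare it with an uncertain structural capacity to estimate a failure probability. The distributions below are illustrative assumptions, not values from the NASA study.

```python
import numpy as np

# Monte Carlo load-vs-capacity sketch: failure occurs when the sampled
# overpressure exceeds the sampled structural capacity.
rng = np.random.default_rng(1)
n = 100_000

overpressure_psi = rng.lognormal(mean=np.log(5.0), sigma=0.6, size=n)   # assumed load model
capacity_psi     = rng.normal(loc=10.0, scale=2.0, size=n)              # assumed structural capacity

p_failure = float(np.mean(overpressure_psi > capacity_psi))
print(f"estimated failure probability: {p_failure:.4f}")
```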
Reduced order models for assessing CO2 impacts in shallow unconfined aquifers
Keating, Elizabeth H.; Harp, Dylan H.; Dai, Zhenxue; ...
2016-01-28
Risk assessment studies of potential CO2 sequestration projects consider many factors, including the possibility of brine and/or CO2 leakage from the storage reservoir. Detailed multiphase reactive transport simulations have been developed to predict the impact of such leaks on shallow groundwater quality; however, these simulations are computationally expensive and thus difficult to directly embed in a probabilistic risk assessment analysis. Here we present a process for developing computationally fast reduced-order models which emulate key features of the more detailed reactive transport simulations. A large ensemble of simulations that take into account uncertainty in aquifer characteristics and CO2/brine leakage scenarios was performed. Twelve simulation outputs of interest were used to develop response surfaces (RSs) using a MARS (multivariate adaptive regression splines) algorithm (Milborrow, 2015). A key part of this study is to compare different measures of ROM accuracy. We then show that for some computed outputs, MARS performs very well in matching the simulation data. The capability of the RS to predict simulation outputs for parameter combinations not used in RS development was tested using cross-validation. Again, for some outputs, these results were quite good. For other outputs, however, the method performs relatively poorly. Performance was best for predicting the volume of depressed-pH plumes, and was relatively poor for predicting organic and trace metal plume volumes. We believe several factors, including the non-linearity of the problem, complexity of the geochemistry, and granularity in the simulation results, contribute to this varied performance. The reduced order models were developed principally to be used in probabilistic performance analysis where a large range of scenarios are considered and ensemble performance is calculated. We demonstrate that they effectively predict the ensemble behavior. However, the performance of the RSs is much less accurate when used to predict time-varying outputs from a single simulation. If an analysis requires only a small number of scenarios to be investigated, computationally expensive physics-based simulations would likely provide more reliable results. Finally, if the aggregate behavior of a large number of realizations is the focus, as will be the case in probabilistic quantitative risk assessment, the methodology presented here is relatively robust.
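The reduced-order-model workflow (fit a cheap emulator to ensemble outputs, then cross-validate it) can be sketched as follows; a polynomial ridge response surface stands in for MARS, which would require the separate py-earth package, and the synthetic "simulation" data are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Fit a cheap response surface to (parameters -> output) pairs from an assumed
# ensemble, then cross-validate its predictive skill on held-out combinations.
rng = np.random.default_rng(7)
X = rng.uniform(size=(500, 4))    # e.g. permeability, leak rate, depth, porosity (scaled; assumed)
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.5 * X[:, 2] * X[:, 3] + rng.normal(0, 0.05, 500)

rom = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3))
scores = cross_val_score(rom, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(3))
```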
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayland, M.; Casey, R.; Woodsworth, E.
In this article, we present the results of a dietary-based assessment of the risk that selenium may pose to two aquatic bird species, the American Dipper (Cinclus mexicanus) and the Harlequin Duck (Histrionicus histrionicus), on one of the coal mine-affected streams, the Gregg River. The study consisted of (1) a literature-based toxicity assessment, (2) simulation of selenium exposure in the diets and eggs of the two species, and (3) a risk assessment that coupled information on toxicity and exposure. Diet and egg selenium concentrations associated with a 20% hatch failure rate were 6.4 and 17 μg·g⁻¹ dry wt, respectively. Simulated dietary selenium concentrations were about 2.0-2.5 μg·g⁻¹ higher on the Gregg River than on reference streams for both species. When simulated dietary concentrations were considered, hatch failure rates on the Gregg River were predicted to average 12% higher in American Dippers and 8% higher in Harlequin Ducks than at reference streams. Corresponding values were only 3% for both species when predicted egg concentrations were used. Elevated levels of selenium in insects in some of the reference streams were unexpected and raised a question as to whether aquatic birds have evolved a higher tolerance level for dietary selenium in these areas.
Risk Assessment Techniques. A Handbook for Program Management Personnel
1983-07-01
tion; not directly usable without further development. 37. Lieber, R.S., "New Approaches for Quantifying Risk and Determining Sharing Arrangements...must be provided. Prediction intervals around cost estimating relationships (CERs) or Monte Carlo simulations will be used as proper in quantifying ... risk." [emphasis supplied] Para 9.d. "The ISR will address the potential risk in the program office estimate by identifying 'risk' areas and their
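A minimal sketch of the quoted guidance that prediction intervals around cost estimating relationships (CERs) can be generated by Monte Carlo simulation; the CER form and parameter uncertainties below are invented for illustration.

```python
import numpy as np

# Monte Carlo prediction interval around an assumed power-law CER:
# cost = a * weight^b, with uncertain coefficient a and exponent b.
rng = np.random.default_rng(3)
n = 20_000

a = rng.normal(1.8, 0.2, n)        # uncertain CER coefficient (assumed)
b = rng.normal(0.9, 0.05, n)       # uncertain CER exponent (assumed)
weight = 1_200.0                   # cost-driver value for the item being estimated (assumed)

cost = a * weight ** b
lo, mid, hi = np.percentile(cost, [10, 50, 90])
print(f"80% prediction interval: {lo:,.0f} - {hi:,.0f} (median {mid:,.0f})")
```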
Application of an Integrated Assessment Model to the Kevin Dome site, Montana
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Minh; Zhang, Ye; Carey, James William
The objectives of the Integrated Assessment Model are to enable the Fault Swarm algorithm in the National Risk Assessment Partnership, ensure faults are working in the NRAP-IAM tool, calculate hypothetical fault leakage in NRAP-IAM, and compare leakage rates to Eclipse simulations.
Tromp, S O; Rijgersberg, H; Franz, E
2010-10-01
Quantitative microbial risk assessments do not usually account for the planning and ordering mechanisms (logistics) of a food supply chain. These mechanisms and consumer demand determine the storage and delay times of products. The aim of this study was to quantitatively assess the difference between simulating supply chain logistics (MOD) and assuming fixed storage times (FIX) in microbial risk estimation for the supply chain of fresh-cut leafy green vegetables destined for working-canteen salad bars. The results of the FIX model were previously published (E. Franz, S. O. Tromp, H. Rijgersberg, and H. J. van der Fels-Klerx, J. Food Prot. 73:274-285, 2010). Pathogen growth was modeled using stochastic discrete-event simulation of the applied logistics concept. The public health effects were assessed by conducting an exposure assessment and risk characterization. The relative growths of Escherichia coli O157 (17%) and Salmonella enterica (15%) were identical in the MOD and FIX models. In contrast, the relative growth of Listeria monocytogenes was considerably higher in the MOD model (1,156%) than in the FIX model (194%). The probability of L. monocytogenes infection in The Netherlands was higher in the MOD model (5.18 × 10⁻⁸) than in the FIX model (1.23 × 10⁻⁸). The risk of listeriosis-induced fetal mortality in the perinatal population increased from 1.24 × 10⁻⁴ (FIX) to 1.66 × 10⁻⁴ (MOD). Modeling the probabilistic nature of supply chain logistics is of additional value for microbial risk assessments regarding psychrotrophic pathogens in food products for which time and temperature are the postharvest preventive measures in guaranteeing food safety.
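Why a stochastic (MOD) treatment of storage times can predict more growth than a fixed-time (FIX) treatment, even at the same mean storage time, can be seen in a toy calculation; the growth rate and storage-time distribution below are assumptions, not the study's parameters.

```python
import numpy as np

# Exponential growth is convex in time, so variability in storage time raises
# the mean predicted growth relative to a fixed time with the same mean.
rng = np.random.default_rng(11)
mu = 0.01                                   # assumed specific growth rate, log10 CFU per hour
fixed_hours = 72.0                          # FIX: fixed storage time (assumed)
stochastic_hours = rng.gamma(shape=2.0, scale=36.0, size=100_000)  # MOD: same mean, variable (assumed)

growth_fix = 10 ** (mu * fixed_hours)
growth_mod = np.mean(10 ** (mu * stochastic_hours))
print(f"relative growth FIX: {100*(growth_fix-1):.0f}%   MOD: {100*(growth_mod-1):.0f}%")
```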
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Mark J; Efroymson, Rebecca Ann; Hargrove, William Walter
A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the tracked vehicle movement component of the testing program. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased infiltration and/or evaporation associated with disturbances to desert pavement. The simulated exposure of wash vegetation to water loss was quantified using estimates of exposed land area from a digital ortho quarter quad aerial photo and field observations, a 30 × 30 m digital elevation model, the flow accumulation feature of ESRI ArcInfo, and a two-step process in which runoff was estimated from direct precipitation to a land area and from water that flowed from upgradient to a land area. In all simulated scenarios, absolute water loss decreased with distance from the disturbance, downgradient in the washes; however, percentage water loss was greatest in land areas immediately downgradient of a disturbance. Potential effects on growth and survival of wash trees were quantified by using an empirical relationship derived from a local unpublished study of water infiltration rates. The risk characterization concluded that neither risk to wash vegetation growth or survival nor risk to mule deer abundance and reproduction was expected. The risk characterization was negative for both the incremental risk of the test program and the combination of the test and pretest disturbances.
Assessing Climate Change Impacts on Wildfire Exposure in Mediterranean Areas.
Lozano, Olga M; Salis, Michele; Ager, Alan A; Arca, Bachisio; Alcasena, Fermin J; Monteiro, Antonio T; Finney, Mark A; Del Giudice, Liliana; Scoccimarro, Enrico; Spano, Donatella
2017-10-01
We used simulation modeling to assess potential climate change impacts on wildfire exposure in Italy and Corsica (France). Weather data were obtained from a regional climate model for the period 1981-2070 using the IPCC A1B emissions scenario. Wildfire simulations were performed with the minimum travel time fire spread algorithm using predicted fuel moisture, wind speed, and wind direction to simulate expected changes in weather for three climatic periods (1981-2010, 2011-2040, and 2041-2070). Overall, the wildfire simulations showed very slight changes in flame length, while other outputs such as burn probability and fire size increased significantly in the second future period (2041-2070), especially in the southern portion of the study area. The projected changes in fuel moisture could result in a lengthening of the fire season for the entire study area. This work represents the first application in Europe of a methodology based on high resolution (250 m) landscape wildfire modeling to assess potential impacts of climate changes on wildfire exposure at a national scale. The findings can provide information and support in wildfire management planning and fire risk mitigation activities. © 2016 Society for Risk Analysis.
Tremblay, Mathieu; Gallant, François; Lavallière, Martin; Chiasson, Martine; Silvey, Dustin; Behm, David; Albert, Wayne J; Johnson, Michel J
2015-01-01
Young drivers are overrepresented in collisions resulting in fatalities. It is not uncommon for young drivers to socially binge drink and decide to drive a vehicle a few hours after consumption. To better understand the risks that may be associated with this behaviour, the present study examined the effects of a social drinking bout followed by a simulated drive in undergraduate students on the descending limb of their BAC (blood alcohol concentration) curve. Two groups of eight undergraduate students (n = 16) took part in this study. Participants in the alcohol group were assessed before drinking, then at moderate and low BAC, as well as 24 hours post-acute consumption. This group consumed an average of 5.3 ± 1.4 (mean ± SD) drinks in an hour in a social context and then completed a driving assessment and a predicted crash risk assessment. The control group was assessed at the same time points (8 a.m., noon, 3 p.m., and 8 a.m. the next morning) without alcohol intake or a social context. These multiple time points were used to measure any potential learning effects from the assessment tools (i.e. the driving simulator and the useful field of view test (UFOV)). Diminished driving performance at moderate BAC was observed with no increase in predicted crash risk. Moderate correlations between driving variables were observed. No association was found between driving variables and UFOV variables. The control group improved on measures of selective attention after the third assessment. No learning effect was observed from multiple sessions with the driving simulator. Our results show that a moderate BAC, although legal, increases risky driving behaviour. Effects of alcohol expectancy could have been displayed by the experimental group. UFOV measures and predicted crash risk categories were not sensitive enough to predict crash risk for young drivers, even when intoxicated.
Adding the Human Element to Ship Manoeuvring Simulations
NASA Astrophysics Data System (ADS)
Aarsæther, Karl Gunnar; Moan, Torgeir
Time-domain simulation of ship manoeuvring has been utilized in risk analysis to assess the effects of changes to the ship lane and developments in traffic volume on the associated risk. The process of ship manoeuvring in a wider socio-technical context consists of the technical systems, operational procedures, the human operators, and support functions. Automated manoeuvring simulations without human operators in the simulation loop have often been preferred in simulation studies because they require little time to run. Automatic control has represented the human element, with little effort devoted to explaining the relationship between the guidance and control algorithms and the human operator they replace. This paper describes the development and application of a model of the human element for autonomous time-domain manoeuvring simulations. The method is applicable in the time domain, modular, and capable of reproducing observed manoeuvre patterns, but is limited to representing intended behaviour.
Model-based risk assessment and public health analysis to prevent Lyme disease
Sabounchi, Nasim S.; Roome, Amanda; Spathis, Rita; Garruto, Ralph M.
2017-01-01
The number of Lyme disease (LD) cases in the northeastern United States has been dramatically increasing with over 300 000 new cases each year. This is due to numerous factors interacting over time including low public awareness of LD, risk behaviours and clothing choices, ecological and climatic factors, an increase in rodents within ecologically fragmented peri-urban built environments and an increase in tick density and infectivity in such environments. We have used a system dynamics (SD) approach to develop a simulation tool to evaluate the significance of risk factors in replicating historical trends of LD cases, and to investigate the influence of different interventions, such as increasing awareness, controlling clothing risk and reducing mouse populations, in reducing LD risk. The model accurately replicates historical trends of LD cases. Among several interventions tested using the simulation model, increasing public awareness most significantly reduces the number of LD cases. This model provides recommendations for LD prevention, including further educational programmes to raise awareness and control behavioural risk. This model has the potential to be used by the public health community to assess the risk of exposure to LD. PMID:29291075
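A toy stock-and-flow sketch in the spirit of the system dynamics model: awareness suppresses risky exposure, which suppresses new cases. All stocks, rates, and the awareness intervention are invented for illustration.

```python
# Simple Euler-integrated stock-and-flow model: a susceptible stock drains into
# a cumulative-cases stock at a rate moderated by growing public awareness.
dt, years = 0.1, 20
steps = int(years / dt)
susceptible, cases, awareness = 100_000.0, 0.0, 0.2   # assumed initial stocks

for _ in range(steps):
    exposure_rate = 0.002 * (1.0 - awareness)         # awareness suppresses risky exposure (assumed)
    new_cases = exposure_rate * susceptible * dt
    susceptible -= new_cases
    cases += new_cases
    awareness = min(1.0, awareness + 0.01 * dt)       # assumed education-programme effect

print(f"cumulative cases after {years} years: {cases:,.0f}")
```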
Takegawa, Ryosuke; Ohnishi, Mitsuo; Hirose, Tomoya; Hatano, Yayoi; Imada, Yuko; Endo, Yoko; Shimazu, Takeshi
2016-03-01
In cases of transport by rescue helicopter or ambulance of patients having ingested hazardous substances, medical personnel may be at a certain risk of inhaling the substances. However, few reports have addressed such risk of causing secondary casualties. This simulation study aimed to assess the risk of inhalation of hydrogen sulfide and chloropicrin in the cabin of a helicopter or an ambulance transporting a patient who has ingested calcium polysulfide or chloropicrin, which were previously reported to cause secondary casualties. Concentrations of hydrogen sulfide and chloropicrin were assessed on the following assumptions: the patient ingested 100 mL of the causative or original chemical; all chemical substances reacted with the gastric juice or were thoroughly vomited and evaporated uniformly within the cabin space of the helicopter or ambulance; environmental conditions were 20 °C at 1 atmosphere of pressure in a 5 m³ cabin volume in the helicopter and a 13.5 m³ cabin volume in the ambulance. In the case of calcium polysulfide ingestion, which produced hydrogen sulfide, its concentration reached 774 ppm in the helicopter and 287 ppm in the ambulance. For chloropicrin ingestion, the concentrations were 4,824 ppm and 1,787 ppm, respectively. The simulated concentration of hydrogen sulfide was more than 500 ppm in the helicopter, which may lead to respiratory paralysis and death. The simulated concentration of chloropicrin was more than 300 ppm, which carries a risk of death within 10 minutes. Currently, as far as Japanese laws are concerned, there are no restrictions requiring pretransport assessment or setting criteria for transporting patients who might have ingested hazardous substances that could cause secondary casualties when vomited. When patients who might have ingested hazardous chemicals are transported, it is important to recognize the risk of causing secondary casualties by vomiting of the chemicals.
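The chloropicrin cabin concentration can be reproduced approximately from the stated assumptions with ideal-gas arithmetic; the density (~1.66 g/mL) and molar mass (164.4 g/mol) used below are standard handbook values, and the result lands close to the reported 4,824 and 1,787 ppm figures.

```python
# Back-of-the-envelope check of the chloropicrin scenario: 100 mL of liquid
# fully evaporates into a sealed cabin at 20 °C and 1 atm (mirroring the
# abstract's complete-evaporation and uniform-mixing assumptions).
R = 0.082057          # L·atm / (mol·K)
T = 293.15            # K (20 °C)
P = 1.0               # atm

mass_g = 100.0 * 1.66               # 100 mL of chloropicrin (density ~1.66 g/mL)
moles = mass_g / 164.38             # molar mass of chloropicrin, g/mol
gas_volume_l = moles * R * T / P    # volume the vapour would occupy at cabin conditions

for cabin_m3, name in [(5.0, "helicopter"), (13.5, "ambulance")]:
    ppm = gas_volume_l / (cabin_m3 * 1000.0) * 1e6
    print(f"{name}: ~{ppm:,.0f} ppm")
```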
We used quantitative microbial risk assessment (QMRA) to estimate the risk of gastrointestinal (GI) illness associated with swimming in recreational waters containing different concentrations of human-associated fecal qPCR markers from raw sewage– HF183 and HumM2. The volume/volu...
Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS
NASA Astrophysics Data System (ADS)
Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun
2015-12-01
Flood is one of the natural disasters causing the greatest losses worldwide, and flood disaster risk must be assessed so that these losses can be reduced. Practical disaster management work needs dynamic risk results at the building level. A rainstorm flood disaster system is a typical complex system. From the perspective of complex systems theory, flood disaster risk is the result of interactions among hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. A rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM is proposed in this paper, and the internal structures and procedures of the different agents in the proposed method are designed. On the NetLogo platform, the proposed method was implemented to assess changes in building risk during the rainstorm flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicated that the proposed method can dynamically assess building risk over the whole course of a rainstorm flood disaster, providing a new approach for dynamic building risk assessment and flood disaster management.
Assessment of ecologic regression in the study of lung cancer and indoor radon.
Stidley, C A; Samet, J M
1994-02-01
Ecologic regression studies conducted to assess the cancer risk of indoor radon to the general population are subject to methodological limitations, and they have given seemingly contradictory results. The authors use simulations to examine the effects of two major methodological problems that affect these studies: measurement error and misspecification of the risk model. In a simulation study of the effect of measurement error caused by the sampling process used to estimate radon exposure for a geographic unit, both the effect of radon and the standard error of the effect estimate were underestimated, with greater bias for smaller sample sizes. In another simulation study, which addressed the consequences of uncontrolled confounding by cigarette smoking, even small negative correlations between county geometric mean annual radon exposure and the proportion of smokers resulted in negative average estimates of the radon effect. A third study considered consequences of using simple linear ecologic models when the true underlying model relation between lung cancer and radon exposure is nonlinear. These examples quantify potential biases and demonstrate the limitations of estimating risks from ecologic studies of lung cancer and indoor radon.
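The measurement-error mechanism the authors simulate, attenuation of the estimated radon effect when county exposure is estimated from a small sample of homes, can be sketched as follows; the true slope, county count, and error magnitudes are illustrative assumptions.

```python
import numpy as np

# Classical measurement error in the county-level exposure attenuates the
# estimated slope in an ecologic regression; the bias shrinks as more homes
# are sampled per county.
rng = np.random.default_rng(5)
n_counties, true_slope = 200, 1.0

true_exposure = rng.gamma(shape=2.0, scale=1.5, size=n_counties)                 # assumed exposure spread
lung_cancer_rate = 5.0 + true_slope * true_exposure + rng.normal(0, 0.5, n_counties)

for homes_sampled in (5, 25, 100):
    # county mean estimated from a finite sample of homes -> noisy exposure estimate
    measured = true_exposure + rng.normal(0, 2.0 / np.sqrt(homes_sampled), n_counties)
    slope = np.polyfit(measured, lung_cancer_rate, 1)[0]
    print(f"{homes_sampled:>3} homes per county -> estimated slope {slope:.2f} (true {true_slope})")
```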
Program Manager Assessments: Professionalism Personified
2015-08-01
dozens of legacy systems. A few years ago, the idea of modernizing this collection in a “big bang” approach was rejected in favor of a lower-risk and...chain of command. The assessments are simultaneously sent to me, the Service or Component acquisition executive, and the program executive officer...use of actual test results at sub-scale, component testing, modeling, simulation, and field testing were all described in fair detail. Key near
Ares I-X Malfunction Turn Range Safety Analysis
NASA Technical Reports Server (NTRS)
Beaty, J. R.
2011-01-01
Ares I-X was the designation given to the flight test version of the Ares I rocket which was developed by NASA (also known as the Crew Launch Vehicle (CLV) component of the Constellation Program). The Ares I-X flight test vehicle achieved a successful flight test on October 28, 2009, from Pad LC-39B at Kennedy Space Center, Florida (KSC). As part of the flight plan approval for the test vehicle, a range safety malfunction turn analysis was performed to support the risk assessment and vehicle destruct criteria development processes. Several vehicle failure scenarios were identified which could have caused the vehicle trajectory to deviate from its normal flight path. The effects of these failures were evaluated with an Ares I-X 6 degrees-of-freedom (6-DOF) digital simulation, using the Program to Optimize Simulated Trajectories Version II (POST2) simulation tool. The Ares I-X simulation analysis provided output files containing vehicle trajectory state information. These were used by other risk assessment and vehicle debris trajectory simulation tools to determine the risk to personnel and facilities in the vicinity of the launch area at KSC, and to develop the vehicle destruct criteria used by the flight test range safety officer in the event of a flight test anomaly of the vehicle. The simulation analysis approach used for this study is described, including descriptions of the failure modes which were considered and the underlying assumptions and ground rules of the study.
Multi-hazard risk analysis related to hurricanes
NASA Astrophysics Data System (ADS)
Lin, Ning
Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards is combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Clifford Kuofei
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
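A hedged sketch of the probabilistic element described above: sample uncertain inputs, compute a simple steady-state flux, and rank inputs by standardized regression coefficients. The single-route, permeability-coefficient flux model and all distributions are assumptions, far simpler than the three-phase model in the abstract.

```python
import numpy as np

# Monte Carlo sample of an assumed dermal flux model (flux = Kp * C * A),
# followed by a crude linear-regression sensitivity ranking of the inputs.
rng = np.random.default_rng(13)
n = 50_000

log_kp = rng.normal(-2.5, 0.5, n)                  # log10 permeability coefficient, cm/h (assumed)
conc   = rng.lognormal(np.log(10.0), 0.4, n)       # surface concentration, mg/cm^3 (assumed)
area   = rng.normal(100.0, 10.0, n)                # exposed skin area, cm^2 (assumed)

flux = (10 ** log_kp) * conc * area                # mg/h absorbed

# standardized linear regression coefficients as importance measures
X = np.column_stack([log_kp, conc, area])
Xs = (X - X.mean(0)) / X.std(0)
ys = (flux - flux.mean()) / flux.std()
coeffs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["log_kp", "conc", "area"], coeffs):
    print(f"{name:>7}: standardized coefficient {c:+.2f}")
```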
Pang, Shulan; Schwebel, David C.
2016-01-01
Objective Unintentional drowning is the most common cause of childhood death in rural China. Global intervention efforts offer mixed results regarding the efficacy of educational programs. Methods Using a randomized controlled design, we evaluated a testimonial-based intervention to reduce drowning risk among 280 3rd- and 4th-grade rural Chinese children. Children were randomly assigned to view either testimonials on drowning risk (intervention) or dog-bite risk (control). Safety knowledge and perceived vulnerability were measured by self-report questionnaires, and simulated behaviors in and near water were assessed with a culturally appropriate dollhouse task. Results Children in the intervention group showed improved safety knowledge and simulated behaviors, but not perceived vulnerability, compared with controls. Conclusions The testimonial-based intervention's efficacy appears promising, as it improved safety knowledge and simulated risk behaviors with water among rural Chinese children. PMID:26546476
Real-time 3D radiation risk assessment supporting simulation of work in nuclear environments.
Szőke, I; Louka, M N; Bryntesen, T R; Bratteli, J; Edvardsen, S T; RøEitrheim, K K; Bodor, K
2014-06-01
This paper describes the latest developments at the Institute for Energy Technology (IFE) in Norway, in the field of real-time 3D (three-dimensional) radiation risk assessment for the support of work simulation in nuclear environments. 3D computer simulation can greatly facilitate efficient work planning, briefing, and training of workers. It can also support communication within and between work teams, and with advisors, regulators, the media and public, at all the stages of a nuclear installation's lifecycle. Furthermore, it is also a beneficial tool for reviewing current work practices in order to identify possible gaps in procedures, as well as to support the updating of international recommendations, dissemination of experience, and education of the current and future generation of workers. IFE has been involved in research and development into the application of 3D computer simulation and virtual reality (VR) technology to support work in radiological environments in the nuclear sector since the mid-1990s. During this process, two significant software tools have been developed, the VRdose system and the Halden Planner, and a number of publications have been produced to contribute to improving the safety culture in the nuclear industry. This paper describes the radiation risk assessment techniques applied in earlier versions of the VRdose system and the Halden Planner, for visualising radiation fields and calculating dose, and presents new developments towards implementing a flexible and up-to-date dosimetric package in these 3D software tools, based on new developments in the field of radiation protection. The latest versions of these 3D tools are capable of more accurate risk estimation, permit more flexibility via a range of user choices, and are applicable to a wider range of irradiation situations than their predecessors.
A spatial approach to environmental risk assessment of PAH contamination.
Bengtsson, Göran; Törneman, Niklas
2009-01-01
The extent of remediation of contaminated industrial sites depends on spatial heterogeneity of contaminant concentration and spatially explicit risk characterization. We used sequential Gaussian simulation (SGS) and indicator kriging (IK) to describe the spatial distribution of polycyclic aromatic hydrocarbons (PAHs), pH, electric conductivity, particle aggregate distribution, water holding capacity, and total organic carbon, and quantitative relations among them, in a creosote polluted soil in southern Sweden. The geostatistical analyses were combined with risk analyses, in which the total toxic equivalent concentration of the PAH mixture was calculated from the soil concentrations of individual PAHs and compared with ecotoxicological effect concentrations and regulatory threshold values in block sizes of 1.8 x 1.8 m. Most PAHs were spatially autocorrelated and appeared in several hot spots. The risk calculated by SGS was more confined to specific hot spot areas than the risk calculated by IK, and 40-50% of the site had PAH concentrations exceeding the threshold values with a probability of 80% and higher. The toxic equivalent concentration of the PAH mixture was dependent on the spatial distribution of organic carbon, showing the importance of assessing risk by a combination of measurements of PAH and organic carbon concentrations. Essentially, the same risk distribution pattern was maintained when Monte Carlo simulations were used for implementation of risk in larger (5 x 5 m), economically more feasible remediation blocks, but a smaller area became of great concern for remediation when the simulations included PAH partitioning to two separate sources, creosote and natural, of organic matter, rather than one general.
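The toxic-equivalent concentration used in this kind of risk calculation is a weighted sum of individual PAH concentrations; the sketch below uses benzo[a]pyrene as the reference compound, with TEFs and soil concentrations that are illustrative placeholders rather than the site's values.

```python
# Toxic equivalent (TEQ) calculation: multiply each PAH concentration by its
# toxic equivalency factor relative to benzo[a]pyrene and sum. TEFs and soil
# concentrations below are illustrative, broadly in line with common literature schemes.
tef = {"benzo[a]pyrene": 1.0, "benzo[a]anthracene": 0.1, "benzo[b]fluoranthene": 0.1,
       "chrysene": 0.01, "fluoranthene": 0.001, "phenanthrene": 0.001, "naphthalene": 0.001}

soil_mg_per_kg = {"benzo[a]pyrene": 2.1, "benzo[a]anthracene": 3.5, "chrysene": 4.0,
                  "fluoranthene": 12.0, "phenanthrene": 8.0, "naphthalene": 1.5}

teq = sum(conc * tef[pah] for pah, conc in soil_mg_per_kg.items())
print(f"toxic equivalent concentration: {teq:.2f} mg TEQ/kg")
```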
Progress in virtual reality simulators for surgical training and certification.
de Visser, Hans; Watson, Marcus O; Salvado, Olivier; Passenger, Joshua D
2011-02-21
There is increasing evidence that educating trainee surgeons by simulation is preferable to traditional operating-room training methods with actual patients. Apart from reducing costs and risks to patients, training by simulation can provide some unique benefits, such as greater control over the training procedure and more easily defined metrics for assessing proficiency. Virtual reality (VR) simulators are now playing an increasing role in surgical training. However, currently available VR simulators lack the fidelity to teach trainees past the novice-to-intermediate skills level. Recent technological developments in other industries using simulation, such as the games and entertainment and aviation industries, suggest that the next generation of VR simulators should be suitable for training, maintenance and certification of advanced surgical skills. To be effective as an advanced surgical training and assessment tool, VR simulation needs to provide adequate and relevant levels of physical realism, case complexity and performance assessment. Proper validation of VR simulators and an increased appreciation of their value by the medical profession are crucial for them to be accepted into surgical training curricula.
Henderson, Steven; Woods-Fry, Heather; Collin, Charles A; Gagnon, Sylvain; Voloaca, Misha; Grant, John; Rosenthal, Ted; Allen, Wade
2015-05-01
Our research group has previously demonstrated that the peripheral motion contrast threshold (PMCT) test predicts older drivers' self-reported accident risk, as well as simulated driving performance. However, the PMCT is too lengthy to be part of a battery of tests to assess fitness to drive. Therefore, we have developed a new version of this test, which takes under two minutes to administer. We assessed the motion contrast thresholds of 24 younger drivers (19-32) and 25 older drivers (65-83) with both the PMCT-10min and the PMCT-2min test and investigated whether thresholds were associated with measures of simulated driving performance. Younger participants had significantly lower motion contrast thresholds than older participants, and there were no significant correlations between younger participants' thresholds and any measures of driving performance. The PMCT-10min and PMCT-2min thresholds of older drivers predicted simulated crash risk, as well as the minimum distance of approach to all hazards. This suggests that our tests of motion processing can help predict the risk of collision or near collision in older drivers. Thresholds were also correlated with the total lane deviation time, suggesting a deficiency in processing of peripheral flow and delayed detection of adjacent cars. The PMCT-2min is an improved version of a previously validated test, and it has the potential to help assess older drivers' fitness to drive. Copyright © 2015 Elsevier Ltd. All rights reserved.
MASTODON: A geosciences simulation tool built using the open-source framework MOOSE
NASA Astrophysics Data System (ADS)
Slaughter, A.
2017-12-01
The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state-of-the-art in line with other scientific research efforts.
Obstetric simulation as a risk control strategy: course design and evaluation.
Gardner, Roxane; Walzer, Toni B; Simon, Robert; Raemer, Daniel B
2008-01-01
Patient safety initiatives aimed at reducing medical errors and adverse events are being implemented in Obstetrics. The Controlled Risk Insurance Company (CRICO), Risk Management Foundation (RMF) of the Harvard Medical Institutions pursued simulation as an anesthesia risk control strategy. Encouraged by their success, CRICO/RMF promoted simulation-based team training as a risk control strategy for obstetrical providers. We describe the development, implementation, and evaluation of an obstetric simulation-based team training course grounded in crisis resource management (CRM) principles. We pursued systematic design of course development, implementation, and evaluation in 3 phases, including a 1-year or more posttraining follow-up with self-assessment questionnaires. The course was highly rated overall by participants immediately after the course and 1-year or more after the course. Most survey responders reported having experienced a critical clinical event since the course and that various aspects of their teamwork had significantly or somewhat improved as a result of the course. Most (86%) reported CRM principles as useful for obstetric faculty and most (59%) recommended repeating the simulation course every 2 years. A simulation-based team-training course for obstetric clinicians was developed and is a central component of CRICO/RMF's obstetric risk management incentive program that provides a 10% reduction in annual obstetrical malpractice premiums. The course was highly regarded immediately and 1 year or more after completing the course. Most survey responders reported improved teamwork and communication in managing a critical obstetric event in the interval since taking the course. Simulation-based CRM training can serve as a strategy for mitigating adverse perinatal events.
Engineering Risk Assessment of Space Thruster Challenge Problem
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie
2014-01-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
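The Monte Carlo convergence behaviour discussed above can be illustrated with a stand-in failure model; the two-thruster lifetime logic and all parameters below are invented, not the challenge-problem model.

```python
import numpy as np

# Estimate a loss-of-mission (LOM) probability from increasing numbers of
# Monte Carlo trials and watch the estimate and its standard error settle.
rng = np.random.default_rng(21)

def one_trial() -> bool:
    # hypothetical mission: two redundant thrusters with exponentially
    # distributed lifetimes must jointly cover a 2-year burn (all values assumed)
    lifetimes = rng.exponential(scale=8.0, size=2)
    return bool(np.sum(lifetimes) < 2.0)

for n in (1_000, 10_000, 100_000):
    failures = sum(one_trial() for _ in range(n))
    p = failures / n
    se = (p * (1 - p) / n) ** 0.5
    print(f"n={n:>6}: LOM ~ {p:.4f} +/- {se:.4f}")
```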
SIMULATING URBAN AIR TOXICS OVER CONTINENTAL AND URBAN SCALES
The US EPA is evaluating a version of the CMAQ model to support risk assessment for the exposure to Hazardous Air Pollutants (HAPs). The model uses a variant of the CB4 chemical mechanism to simulate ambient concentrations of twenty HAPs that exist primarily as gaseous compounds...
Andrade, Cristiane Ps; Souza, Cláudio J; Camerini, Eduardo Sn; Alves, Isabela S; Vital, Hélio C; Healy, Matthew Jf; Ramos De Andrade, Edson
2018-06-01
A radiological dispersive device (RDD) spreads radioactive material, complicates the treatment of physical injuries, raises cancer risk, and induces disproportionate fear. Simulating such an event enables more effective and efficient utilization of the triage and treatment resources of staff, facilities, and space. Fast simulation can give detail on events in progress or future events. The resources for triage and treatment of contaminated trauma victims can differ from those needed for pure-exposure individuals, while discouraging the "worried well" from presenting in the crisis phase through media announcements would relieve pressure on hospital facilities. The proposed methodology integrates capabilities from different platforms in a convergent way composed of three phases: (a) scenario simulation, (b) data generation, and (c) risk assessment for triage focused on follow-up epidemiological assessment. Simulations typically indicate that most of the affected population does not require immediate medical assistance. Medical triage for the few severely injured and the radiological triage to diminish the contamination with radioactivity will always be the priority. For this study, however, higher priorities should be given to individuals from radiological "warm" and "hot" zones as required by risk criteria. The proposed methodology could thus help to (a) filter and reduce the number of individuals to be attended, (b) optimize the prioritization of medical care, (c) reduce or prepare for future costs, (d) effectively locate the operational triage site to avoid possible contamination of the main facility, and (e) provide the scientific data needed to develop an adequate approach to risk and its proper communication.
Assessment of risk due to the use of carbon fiber composites in commercial and general aviation
NASA Technical Reports Server (NTRS)
Fiksel, J.; Rosenfield, D.; Kalelkar, A.
1980-01-01
The development of a national risk profile for the total annual aircraft losses due to carbon fiber composite (CFC) usage through 1993 is discussed. The profile was developed using separate simulation methods for commercial and general aviation aircraft. A Monte Carlo method which was used to assess the risk in commercial aircraft is described. The method projects the potential usage of CFC through 1993, investigates the incidence of commercial aircraft fires, models the potential release and dispersion of carbon fibers from a fire, and estimates potential economic losses due to CFC damaging electronic equipment. The simulation model for the general aviation aircraft is described. The model emphasizes variations in facility locations and release conditions, estimates distribution of CFC released in general aviation aircraft accidents, and tabulates the failure probabilities and aggregate economic losses in the accidents.
River flood risk in Jakarta under scenarios of future change
NASA Astrophysics Data System (ADS)
Budiyono, Yus; Aerts, Jeroen C. J. H.; Tollenaar, Daniel; Ward, Philip J.
2016-03-01
Given the increasing impacts of flooding in Jakarta, methods for assessing current and future flood risk are required. In this paper, we use the Damagescanner-Jakarta risk model to project changes in future river flood risk under scenarios of climate change, land subsidence, and land use change. Damagescanner-Jakarta is a simple flood risk model that estimates flood risk in terms of annual expected damage, based on input maps of flood hazard, exposure, and vulnerability. We estimate baseline flood risk at USD 186 million p.a. Combining all future scenarios, we simulate a median increase in risk of +180 % by 2030. The single driver with the largest contribution to that increase is land subsidence (+126 %). We simulated the impacts of climate change by combining two scenarios of sea level rise with simulations of changes in 1-day extreme precipitation totals from five global climate models (GCMs) forced by the four Representative Concentration Pathways (RCPs). The results are highly uncertain; the median change in risk due to climate change alone by 2030 is a decrease of 46 %, but we simulate an increase in risk under 12 of the 40 GCM-RCP-sea level rise combinations. Hence, we developed probabilistic risk scenarios to account for this uncertainty. If land use change by 2030 takes place according to the official Jakarta Spatial Plan 2030, risk could be reduced by 12 %. However, if land use change in the future continues at the same rate as the last 30 years, large increases in flood risk will take place. Finally, we discuss the relevance of the results for flood risk management in Jakarta.
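A minimal sketch of the hazard-exposure-vulnerability calculation that underlies a Damagescanner-type estimate of annual expected damage; the grids, depth-damage curve, and return periods are invented placeholders, not the Jakarta inputs.

```python
import numpy as np

# For each return period, overlay an inundation-depth grid with an exposure grid
# through a depth-damage curve, then integrate damage over annual exceedance
# probability (AEP) to obtain expected annual damage.
rng = np.random.default_rng(8)
shape = (200, 200)
exposure_value = rng.uniform(0.0, 500.0, shape)          # USD of exposed value per cell (assumed)

def damage_fraction(depth_m):
    return np.clip(depth_m / 3.0, 0.0, 1.0)              # simple linear depth-damage curve (assumed)

return_periods = np.array([2.0, 10.0, 50.0, 100.0])      # years (assumed)
damages = np.array([
    float((damage_fraction(rng.gamma(2.0, 0.2 * np.log(rp + 1.0), shape)) * exposure_value).sum())
    for rp in return_periods                              # toy hazard: rarer events flood deeper
])

aep = 1.0 / return_periods
ead = float(np.sum(0.5 * (damages[:-1] + damages[1:]) * (aep[:-1] - aep[1:])))  # trapezoidal integration
print(f"expected annual damage: {ead:,.0f} USD")
```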
SIMulation of Medication Error induced by Clinical Trial drug labeling: the SIMME-CT study.
Dollinger, Cecile; Schwiertz, Vérane; Sarfati, Laura; Gourc-Berthod, Chloé; Guédat, Marie-Gabrielle; Alloux, Céline; Vantard, Nicolas; Gauthier, Noémie; He, Sophie; Kiouris, Elena; Caffin, Anne-Gaelle; Bernard, Delphine; Ranchon, Florence; Rioufol, Catherine
2016-06-01
To assess the impact of investigational drug labels on the risk of medication error in drug dispensing, a simulation-based learning program focusing on investigational drug dispensing was conducted. The study was undertaken in an Investigational Drugs Dispensing Unit of a University Hospital of Lyon, France. Sixty-three pharmacy workers (pharmacists, residents, technicians or students) were enrolled. Ten risk factors were selected concerning label information or the risk of confusion with another clinical trial. Each risk factor was scored independently out of 5: the higher the score, the greater the risk of error. From 400 labels analyzed, two groups were selected for the dispensing simulation: 27 labels with high risk (score ≥3) and 27 with low risk (score ≤2). Each question in the learning program was displayed as a simulated clinical trial prescription. Medication error was defined as at least one erroneous answer (i.e. error in drug dispensing). For each question, response times were collected. High-risk investigational drug labels correlated with medication error and slower response time. Error rates were 5.5-fold higher for the high-risk series, a significant difference. Error frequency was not significantly affected by occupational category or experience in clinical trials. SIMME-CT is the first simulation-based learning tool to focus on investigational drug labels as a risk factor for medication error. SIMME-CT was also used as a training tool for staff involved in clinical research, to develop medication error risk awareness and to validate competence in continuing medical education. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
An overview of the fire and fuels extension to the forest vegetation simulator
Sarah J. Beukema; Elizabeth D. Reinhardt; Werner A. Kurz; Nicholas L. Crookston
2000-01-01
The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) has been developed to assess the risk, behavior, and impact of fire in forest ecosystems. This extension to the widely-used stand-dynamics model FVS simulates the dynamics of snags and surface fuels as they are affected by stand management (of trees or fuels), live tree growth and mortality,...
Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand
NASA Astrophysics Data System (ADS)
Kaiser, G.; Kortenhaus, A.
2009-04-01
The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, water & power supply etc. caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as vulnerability analysis of the socio-economic and the ecological system, in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation simulation is an inevitable tool for risk analysis, risk management and evacuation planning. While numerous investigations have been made to model tsunami wave generation and propagation in the Indian Ocean, there is still a gap in determining detailed inundation patterns, i.e. water depth and flow dynamics. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation strongly depends on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and available almost worldwide. However, to model tsunami-induced inundation for risk analysis and management purposes, the accuracy of these data is not sufficient: the processes in the near-shore zone cannot be modelled accurately enough and the spatial resolution of the topography is too coarse. Moreover, the SRTM data provide a digital surface model which includes vegetation and buildings in the surface description. To improve the data basis, additional bathymetric data were used in the near-shore zone of the Phang Nga and Phuket coastlines, and various remote sensing techniques as well as additional GPS measurements were applied to derive a high-resolution topography from satellite and airborne data. Land use classifications and filter methods were developed to correct the digital surface models to digital elevation models. Simulations were then performed with a non-linear shallow water model to model the 2004 Asian Tsunami and to simulate possible future ones. Results of water elevation near the coast were compared with field measurements and observations, and the influence of the resolution of the topography on inundation patterns such as water depth, velocity, dispersion and duration of the flood was analysed. The inundation simulation provides detailed hazard maps and is considered a reliable basis for risk assessment and risk zone mapping. Results are regarded as vital for the estimation of tsunami-induced damage and for evacuation planning. Results of the aforementioned simulations will be discussed during the conference.
Differences in the numerical results obtained using topographic data of different scales and modified by different post-processing techniques will be analysed and explained. Further use of the results with respect to tsunami risk analysis and management will also be demonstrated.
Integration of PKPD relationships into benefit–risk analysis
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-01-01
Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398
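A hedged sketch of the weighted-score form of MCDA that the review discusses; the criteria, weights, effect values, and the naive linear value function are invented, and in a model-informed workflow the effect estimates would come from PKPD modelling and simulation rather than being typed in by hand.

```python
# Weighted-sum MCDA benefit-risk score: favourable and unfavourable effects are
# mapped onto a common 0-1 value scale, weighted, and summed per option.
criteria = {          # weight, and whether a higher raw value is better (all assumed)
    "response_rate":      {"weight": 0.45, "higher_is_better": True},
    "serious_adverse_ev": {"weight": 0.35, "higher_is_better": False},
    "dropout_rate":       {"weight": 0.20, "higher_is_better": False},
}
options = {           # hypothetical effect estimates on a 0-1 scale
    "new_drug":   {"response_rate": 0.62, "serious_adverse_ev": 0.08, "dropout_rate": 0.12},
    "comparator": {"response_rate": 0.48, "serious_adverse_ev": 0.05, "dropout_rate": 0.10},
}

def value(raw, higher_is_better):
    return raw if higher_is_better else 1.0 - raw   # naive linear value function (assumed)

for name, effects in options.items():
    score = sum(c["weight"] * value(effects[k], c["higher_is_better"]) for k, c in criteria.items())
    print(f"{name:>10}: weighted benefit-risk score {score:.3f}")
```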
NASA Astrophysics Data System (ADS)
Huang, Xian; Betha, Raghu; Tan, Li Yun; Balasubramanian, Rajasekhar
2016-01-01
Smoke-haze episodes, caused by uncontrolled peat and forest fires, occur almost every year in the South-East Asian region with increased concentrations of PM2.5 (airborne particulate matter (PM) with diameter ≤ 2.5 μm). Particulate-bound trace elements (TrElems), especially carcinogenic and toxic elements, were measured during smoke-haze as well as non-haze periods in 2014, as they are considered to be indicators of potential health effects. The bioaccessibilities of 13 TrElems were investigated using two types of simulated lung fluids (SLFs), Gamble's solution and artificial lysosomal fluid (ALF), instead of the commonly used leaching agent (water). The dissolution kinetics was also examined for these TrElems. Many TrElems showed higher solubility in SLFs than in water, and were more soluble in ALF than in Gamble's solution. Cu, Mn and Cd were observed to be the most soluble trace elements in ALF, while in Gamble's solution the most soluble trace elements were Cu, Mn and Zn. The dissolution rates were highly variable among the elements. Health risk assessment was conducted based on the measured concentrations of TrElems and their corresponding toxicities for three possible scenarios involving interactions between carcinogenic and toxic TrElems and SLFs, using the United States Environmental Protection Agency (USEPA) human health risk assessment model. The cumulative cancer risks exceeded the acceptable level (1 in a million, i.e. 1 × 10^-6). However, the estimated hazard quotients (HQ) indicated no significant chronic toxic health effects. The risk assessment results revealed that assessing the bioaccessibility of particulate-bound TrElems using water as the leaching agent may underestimate the health risk.
Risk assessment of lambda-cyhalothrin on aquatic organisms in paddy field in China.
Gu, Bao G; Wang, Hui M; Chen, William L; Cai, Dao J; Shan, Zheng J
2007-06-01
This study was carried out to assess the risk of lambda-cyhalothrin, as used in paddy fields, to aquatic organisms, and to provide assistance in the ecological risk management of lambda-cyhalothrin. The acute toxicities of five individual formulations of lambda-cyhalothrin to four aquatic species were investigated in the laboratory, as well as in a simulated paddy field-pond ecosystem, and the results indicated that lambda-cyhalothrin is highly toxic to fish, and even more so to shrimp. Toxicities to each aquatic organism differed among formulations. Lambda-cyhalothrin degraded rapidly in the environment, with half-lives of the different formulations of 0.23-0.53 days in paddy field water, 0.38-0.63 days in pond water, and 0.96-7.35 days in paddy field soil. The water overflow from the paddy field following a simulated rainstorm 12 h after application of lambda-cyhalothrin did not cause injury to fish, clam or crab, but was severely hazardous to shrimp. Additionally, no injury to shrimp was found when the simulated overflow occurred 4 days after application. These results suggest that the environmental risk of lambda-cyhalothrin to aquatic organisms can be reduced by (1) developing a relatively safe formulation such as a suspension concentrate, and/or (2) controlling the drainage time of the paddy field.
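The reported half-lives imply roughly first-order dissipation; assuming that, the fraction of residue left at a given time after application can be back-calculated as in the sketch below. The 0.5-day half-life is a hypothetical value within the reported paddy-water range, not a measured figure.

```python
import math

def fraction_remaining(t_days, half_life_days):
    """First-order decay: C(t)/C0 = exp(-k t), with k = ln 2 / t_half."""
    k = math.log(2) / half_life_days
    return math.exp(-k * t_days)

# Hypothetical half-life of 0.5 days (within the 0.23-0.53 d range in paddy water)
for t in (0.5, 1, 2, 4):          # days after application
    print(f"day {t}: {fraction_remaining(t, 0.5):.3%} of the initial residue left")
```

Under this assumption only a fraction of a percent of the residue remains after 4 days, which is consistent with the observation that overflow at that time no longer harmed shrimp.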
NASA Astrophysics Data System (ADS)
Schiavon, Marco; Redivo, Martina; Antonacci, Gianluca; Rada, Elena Cristina; Ragazzi, Marco; Zardi, Dino; Giovannini, Lorenzo
2015-11-01
Simulations of emission and dispersion of nitrogen oxides (NOx) are performed in an urban area of Verona (Italy), characterized by street canyons and typical sources of urban pollutants. Two dominant source categories are considered: road traffic and, as an element of novelty, domestic heaters. In addition, to assess the impact of urban air pollution on human health and, in particular, the cancer risk, simulations of emission and dispersion of benzene are carried out. Emissions from road traffic are estimated by the COPERT 4 algorithm, whilst NOx emission factors from domestic heaters are retrieved by means of criteria provided in the technical literature. Maps of the annual mean concentrations of NOx and benzene are then calculated using the AUSTAL2000 dispersion model, considering both scenarios representing the current situation and scenarios simulating the introduction of environmental strategies for air pollution mitigation. The simulations highlight potentially critical situations of human exposure that may not be detected by the conventional network of air quality monitoring stations. The proposed methodology provides support for air quality policies, such as planning targeted measurement campaigns, re-locating monitoring stations and adopting measures in favour of better air quality in urban planning. In particular, the estimation of the induced cancer risk is an important starting point for conducting zoning analyses and detecting the areas where the population is most directly exposed to potential health risks.
Lombardo, Andrea; Franco, Antonio; Pivato, Alberto; Barausse, Alberto
2015-03-01
Conventional approaches to estimating protective ecotoxicological thresholds of chemicals, i.e. predicted no-effect concentrations (PNEC), for an entire ecosystem are based on the use of assessment factors to extrapolate from single-species toxicity data derived in the laboratory to community-level effects on ecosystems. Aquatic food web models may be a useful tool to improve the ecological realism of chemical risk assessment because they enable a more insightful evaluation of the fate and effects of chemicals in dynamic trophic networks. A case study was developed in AQUATOX to simulate the effects of the anionic surfactant linear alkylbenzene sulfonate and the antimicrobial triclosan on a lowland riverine ecosystem. The model was built for a section of the River Thames (UK), for which detailed ecological surveys were available, allowing for a quantification of energy flows through the whole ecosystem. A control scenario was successfully calibrated for a simulation period of one year, and tested for stability over six years. Then, the model ecosystem was perturbed with varying inputs of the two chemicals. Simulations showed that both chemicals rapidly approach steady state, with internal concentrations in line with the input bioconcentration factors throughout the year. At realistic environmental concentrations, both chemicals have insignificant effects on biomass trends. At hypothetical higher concentrations, direct and indirect effects of chemicals on the ecosystem dynamics emerged from the simulations. Indirect effects due to competition for food sources and predation can lead to responses in biomass density of the same magnitude as those caused by direct toxicity. Indirect effects can either exacerbate or compensate for direct toxicity. Uncertainties in key model assumptions are high, as the validation of perturbed simulations remains extremely challenging. Nevertheless, the study is a step towards the development of realistic ecological scenarios and their potential use in prospective risk assessment of down-the-drain chemicals. Copyright © 2014 Elsevier B.V. All rights reserved.
Influence of safety measures on the risks of transporting dangerous goods through road tunnels.
Saccomanno, Frank; Haastrup, Palle
2002-12-01
Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model recently was developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decisionmakers. Nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented that involve the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.
A counterfactual p-value approach for benefit-risk assessment in clinical trials.
Zeng, Donglin; Chen, Ming-Hui; Ibrahim, Joseph G; Wei, Rachel; Ding, Beiying; Ke, Chunlei; Jiang, Qi
2015-01-01
Clinical trials generally allow various efficacy and safety outcomes to be collected for health interventions. Benefit-risk assessment is an important issue when evaluating a new drug. Currently, there is a lack of standardized and validated benefit-risk assessment approaches in drug development due to various challenges. To quantify benefits and risks, we propose a counterfactual p-value (CP) approach. Our approach considers a spectrum of weights for weighting benefit-risk values and computes the extreme probabilities of observing the weighted benefit-risk value in one treatment group as if patients were treated in the other treatment group. The proposed approach is applicable to single benefit and single risk outcome as well as multiple benefit and risk outcomes assessment. In addition, the prior information in the weight schemes relevant to the importance of outcomes can be incorporated in the approach. The proposed CPs plot is intuitive with a visualized weight pattern. The average area under CP and preferred probability over time are used for overall treatment comparison and a bootstrap approach is applied for statistical inference. We assess the proposed approach using simulated data with multiple efficacy and safety endpoints and compare its performance with a stochastic multi-criteria acceptability analysis approach.
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
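The paper's Bayesian Monte Carlo procedure is not reproduced in the abstract; the sketch below only illustrates the general idea of attaching a probability of failure to a candidate design space, using an invented response model, invented parameter ranges and an invented acceptance limit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical critical process parameters varied across a candidate design space
compression_force = rng.uniform(10, 20, n)      # kN
lubricant_frac    = rng.uniform(0.5, 1.5, n)    # % w/w

# Hypothetical response model for a CQA (dissolution at 30 min, %) plus batch noise
dissolution = 95 - 0.8 * (compression_force - 10) - 3.0 * (lubricant_frac - 0.5) \
              + rng.normal(0, 2.0, n)

failure = dissolution < 80.0                    # hypothetical acceptance limit
print(f"P(failure) over the whole candidate space: {failure.mean():.4f}")

# Conditional failure probability helps locate the risky corners of the space
corner = (compression_force > 18) & (lubricant_frac > 1.3)
print(f"P(failure | high force & high lubricant): {failure[corner].mean():.4f}")
```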
Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...
Risk assessment of tropical cyclone rainfall flooding in the Delaware River Basin
NASA Astrophysics Data System (ADS)
Lu, P.; Lin, N.; Smith, J. A.; Emanuel, K.
2016-12-01
Rainfall-induced inland flooding is a leading cause of death, injury, and property damage from tropical cyclones (TCs). In the context of climate change, it has been shown that extreme precipitation from TCs is likely to increase during the 21st century. Assessing the long-term risk of inland flooding associated with landfalling TCs is therefore an important task. Standard risk assessment techniques, which are based on observations from rain gauges and stream gauges, are not broadly applicable to TC induced flooding, since TCs are rare, extreme events with very limited historical observations at any specific location. Also, rain gauges and stream gauges can hardly capture the complex spatial variation of TC rainfall and flooding. Furthermore, the utility of historically based assessments is compromised by climate change. Regional dynamical downscaling models can resolve many features of TC precipitation. In terms of risk assessment, however, it is computationally demanding to run such models to obtain long-term climatology of TC induced flooding. Here we apply a computationally efficient climatological-hydrological method to assess the risk of inland flooding associated with landfalling TCs. It includes: 1) a deterministic TC climatology modeling method to generate large numbers of synthetic TCs with physically correlated characteristics (i.e., track, intensity, size) under observed and projected climates; 2) a simple physics-based tropical cyclone rainfall model which is able to simulate rainfall fields associated with each synthetic storm; 3) a hydrologic modeling system that takes in rainfall fields to simulate flood peaks over an entire drainage basin. We will present results of this method applied to the Delaware River Basin in the mid-Atlantic US.
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factors in attaining a high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
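The original failure-process simulation is not described in detail here; the sketch below is a much-simplified Monte Carlo analogue that assumes independent, constant (exponential) thruster failure rates and asks how often enough thrusters survive the total burn time. All numbers are hypothetical.

```python
import random

def mission_success_prob(n_thrusters=8, n_required=6, burn_hours=8000,
                         failure_rate_per_hour=2e-5, n_sim=20_000, seed=1):
    """Monte Carlo estimate of the probability that enough thrusters survive
    the total burn time, assuming independent exponential failure times."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_sim):
        surviving = sum(rng.expovariate(failure_rate_per_hour) > burn_hours
                        for _ in range(n_thrusters))
        successes += surviving >= n_required
    return successes / n_sim

print("estimated mission reliability:", mission_success_prob())
```

Because the survival probability falls exponentially with the component failure rate, the estimate is far more sensitive to that rate than to modest changes in burn time, which echoes the report's conclusion that component failure rates limit mission reliability.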
Safety impacts of red light cameras at signalized intersections based on cellular automata models.
Chai, C; Wong, Y D; Lum, K M
2015-01-01
This study applies a simulation technique to evaluate the hypothesis that red light cameras (RLCs) exert important effects on accident risks. Conflict occurrences are generated by simulation and compared at intersections with and without RLCs to assess the impact of RLCs on several conflict types under various traffic conditions. Conflict occurrences are generated through simulating vehicular interactions based on an improved cellular automata (CA) model. The CA model is calibrated and validated against field observations at approaches with and without RLCs. Simulation experiments are conducted for RLC and non-RLC intersections with different geometric layouts and traffic demands to generate conflict occurrences, which are analyzed to evaluate the hypothesis that RLCs exert important effects on road safety. The comparison of simulated conflict occurrences shows favorable safety impacts of RLCs on crossing conflicts and unfavorable impacts for rear-end conflicts during red/amber phases. Corroborative results are found from broad analysis of accident occurrence. RLCs are found to have a mixed effect on accident risk at signalized intersections: crossing collisions are reduced, whereas rear-end collisions may increase. The specially developed CA model is found to be a feasible safety assessment tool.
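The improved CA model itself is not given in the abstract; for readers unfamiliar with the approach, the sketch below implements the classic Nagel-Schreckenberg cellular automaton for single-lane circular traffic, the usual starting point for such simulations, with hypothetical parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

n_cells, n_cars, v_max, p_slow, steps = 200, 40, 5, 0.3, 500

pos = np.sort(rng.choice(n_cells, n_cars, replace=False))   # car positions on a ring
vel = np.zeros(n_cars, dtype=int)                            # car speeds (cells/step)

for _ in range(steps):
    # gap to the car ahead; cyclic order is preserved because cars cannot overtake
    gap = (np.roll(pos, -1) - pos - 1) % n_cells
    vel = np.minimum(vel + 1, v_max)                         # acceleration
    vel = np.minimum(vel, gap)                               # braking to avoid collision
    vel[(rng.random(n_cars) < p_slow) & (vel > 0)] -= 1      # random slowdown
    pos = (pos + vel) % n_cells                               # movement

print("mean speed after %d steps: %.2f cells/step" % (steps, vel.mean()))
```

Signal phases, stop lines and conflict detection would be layered on top of such a lattice model in an intersection-level safety study.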
Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier
2016-02-01
Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may lead to an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
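The fitted logistic model is not reported in the abstract, so the coefficients below are invented; the sketch only shows how a time-temperature dependent probability of consumption can be used inside a QMRA Monte Carlo loop to discount extreme storage scenarios.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical consumer-phase storage scenarios
temp_C = rng.normal(7, 3, n).clip(0, 25)        # storage temperature, deg C
time_d = rng.exponential(2.0, n).clip(0, 14)    # storage time, days

# Hypothetical logistic model: probability that the serving is still consumed
# decreases with time-temperature abuse (coefficients invented for illustration)
logit = 4.0 - 0.25 * temp_C - 0.6 * time_d
p_consumed = 1 / (1 + np.exp(-logit))
consumed = rng.random(n) < p_consumed

# Hypothetical growth model: log10 increase rises with time and temperature
log_increase = 0.05 * time_d * temp_C
extreme = log_increase > 4                      # "extreme" growth scenarios

print("extreme scenarios, all simulated servings:  %.2f%%" % (100 * extreme.mean()))
print("extreme scenarios, servings actually eaten: %.2f%%"
      % (100 * extreme[consumed].mean()))
```

Conditioning on consumption removes a disproportionate share of the long, warm storage scenarios, which is the mechanism by which the modified models produce lower risk estimates.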
Using climate model simulations to assess the current climate risk to maize production
NASA Astrophysics Data System (ADS)
Kent, Chris; Pope, Edward; Thompson, Vikki; Lewis, Kirsty; Scaife, Adam A.; Dunstone, Nick
2017-05-01
The relationship between the climate and agricultural production is of considerable importance to global food security. However, there has been relatively little exploration of climate-variability related yield shocks. The short observational yield record does not adequately sample natural inter-annual variability thereby limiting the accuracy of probability assessments. Focusing on the United States and China, we present an innovative use of initialised ensemble climate simulations and a new agro-climatic indicator, to calculate the risk of severe water stress. Combined, these regions provide 60% of the world’s maize, and therefore, are crucial to global food security. To probe a greater range of inter-annual variability, the indicator is applied to 1400 simulations of the present day climate. The probability of severe water stress in the major maize producing regions is quantified, and in many regions an increased risk is found compared to calculations from observed historical data. Analysis suggests that the present day climate is also capable of producing unprecedented severe water stress conditions. Therefore, adaptation plans and policies based solely on observed events from the recent past may considerably under-estimate the true risk of climate-related maize shocks. The probability of a major impact event occurring simultaneously across both regions—a multi-breadbasket failure—is estimated to be up to 6% per decade and arises from a physically plausible climate state. This novel approach highlights the significance of climate impacts on crop production shocks and provides a platform for considerably improving food security assessments, in the present day or under a changing climate, as well as development of new risk based climate services.
The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...
Fish endpoints measured in early life stage toxicity tests are often used as representative of larval amphibian sensitivity in Ecological Risk Assessment (ERA). This application potentially overlooks the impact of developmental delays on amphibian metamorphosis, and thereby red...
Wilson, M. J.; Frickel, Scott; Nguyen, Daniel; Bui, Tap; Echsner, Stephen; Simon, Bridget R.; Howard, Jessi L.; Miller, Kent; Wickliffe, Jeffrey K.
2014-01-01
Background: The Deepwater Horizon oil spill of 2010 prompted concern about health risks among seafood consumers exposed to polycyclic aromatic hydrocarbons (PAHs) via consumption of contaminated seafood. Objective: The objective of this study was to conduct population-specific probabilistic health risk assessments based on consumption of locally harvested white shrimp (Litopenaeus setiferus) among Vietnamese Americans in southeast Louisiana. Methods: We conducted a survey of Vietnamese Americans in southeast Louisiana to evaluate shrimp consumption, preparation methods, and body weight among shrimp consumers in the disaster-impacted region. We also collected and chemically analyzed locally harvested white shrimp for 81 individual PAHs. We combined the PAH levels (with accepted reference doses) found in the shrimp with the survey data to conduct Monte Carlo simulations for probabilistic noncancer health risk assessments. We also conducted probabilistic cancer risk assessments using relative potency factors (RPFs) to estimate cancer risks from the intake of PAHs from white shrimp. Results: Monte Carlo simulations were used to generate hazard quotient distributions for noncancer health risks, reported as mean ± SD, for naphthalene (1.8 × 10^-4 ± 3.3 × 10^-4), fluorene (2.4 × 10^-5 ± 3.3 × 10^-5), anthracene (3.9 × 10^-6 ± 5.4 × 10^-6), pyrene (3.2 × 10^-5 ± 4.3 × 10^-5), and fluoranthene (1.8 × 10^-4 ± 3.3 × 10^-4). A cancer risk distribution, based on RPF-adjusted PAH intake, was also generated (2.4 × 10^-7 ± 3.9 × 10^-7). Conclusions: The risk assessment results show no acute health risks or excess cancer risk associated with consumption of shrimp containing the levels of PAHs detected in our study, even among frequent shrimp consumers. Citation: Wilson MJ, Frickel S, Nguyen D, Bui T, Echsner S, Simon BR, Howard JL, Miller K, Wickliffe JK. 2015. A targeted health risk assessment following the Deepwater Horizon Oil Spill: polycyclic aromatic hydrocarbon exposure in Vietnamese-American shrimp consumers. Environ Health Perspect 123:152–159; http://dx.doi.org/10.1289/ehp.1408684 PMID:25333566
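The concentration, intake-rate and body-weight distributions below are hypothetical stand-ins rather than the study's data, but the structure follows the standard chronic-daily-intake equations behind hazard quotients and RPF-adjusted cancer risk.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical inputs (not the study's survey or analytical data)
conc_ug_g  = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)   # PAH in shrimp, ug/g
intake_g_d = rng.lognormal(mean=np.log(20.0), sigma=0.6, size=n)  # shrimp eaten, g/day
body_wt_kg = rng.normal(70, 12, n).clip(40, 120)

# Simplified chronic daily intake (mg/kg-day): C * IR / BW, converting ug to mg
cdi = conc_ug_g * 1e-3 * intake_g_d / body_wt_kg

rfd = 0.02        # hypothetical oral reference dose, mg/kg-day
csf = 7.3e-3      # hypothetical RPF-adjusted cancer slope factor, (mg/kg-day)^-1

hq   = cdi / rfd                  # non-cancer hazard quotient
risk = cdi * csf                  # simplified lifetime excess cancer risk

print("HQ   mean ± SD: %.2e ± %.2e" % (hq.mean(), hq.std()))
print("risk mean ± SD: %.2e ± %.2e" % (risk.mean(), risk.std()))
print("fraction with risk > 1e-6: %.4f" % (risk > 1e-6).mean())
```

A full assessment would also fold in exposure frequency, exposure duration and averaging time, which are omitted here for brevity.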
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
Di Gianfilippo, Martina; Verginelli, Iason; Costa, Giulia; Spagnuolo, Riccardo; Gavasci, Renato; Lombardi, Francesco
2018-01-01
In this work we present an integrated risk-based approach that can be used to evaluate the recycling potential of an alkaline waste material such as incineration bottom ash (BA) as unbound material for road sub-base construction. This approach, which is aimed at assessing potential risks to the groundwater resource (in terms of drinking water quality) and human health associated to the leaching of contaminants from the BA, couples the results of leaching tests for the estimation of source concentrations with the fate and transport models usually adopted in risk assessment procedures. The effects of weathering and of the type of leaching test employed to evaluate eluate concentrations were assessed by carrying out different simulations using the results of laboratory leaching tests. Specifically, pH-dependence and column percolation leaching tests were performed on freshly collected and 1-year naturally weathered BA samples produced from a grate-fired incineration plant treating Refuse Derived Fuel (RDF). To evaluate a broad span of possible scenario conditions, a Monte Carlo analysis was performed running 5000 simulations, randomly varying the input parameters within the ranges expected in the field. In nearly all the simulated conditions, the concentrations of contaminants in the groundwater for the specific type of BA tested in this work were well below EU and WHO drinking water quality criteria. Nevertheless, some caution should be paid in the case of the establishment of acidic conditions in the field since in this case the concentration of some elements (i.e. Al, Pb and Zn) is expected to exceed threshold values. In terms of risks to human health, for the considered utilization scenario the probability of exceeding the acceptable reference dose for water ingestion was usually less than 1% (except for Cr and Pb for which the probability was lower than 3.5% and 7%, respectively). Copyright © 2017 Elsevier Ltd. All rights reserved.
Silvey, Dustin; Behm, David; Albert, Wayne J.
2015-01-01
Young drivers are overrepresented in collisions resulting in fatalities. It is not uncommon for young drivers to socially binge drink and decide to drive a vehicle a few hours after consumption. To better understand the risks that may be associated with this behaviour, the present study examined the effects of a social drinking bout followed by a simulated drive in undergraduate students on the descending limb of their BAC (blood alcohol concentration) curve. Two groups of eight undergraduate students (n = 16) took part in this study. Participants in the alcohol group were assessed before drinking, then at moderate and low BAC, as well as 24 hours post-acute consumption. This group consumed an average of 5.3 ± 1.4 (mean ± SD) drinks in an hour in a social context and then completed a driving assessment and a predicted crash risk assessment. The control group was assessed at the same time points without alcohol intake or social context: at 8 a.m., noon, 3 p.m. and 8 a.m. the next morning. These multiple time points were used to measure any potential learning effects from the assessment tools (i.e. driving simulator and useful field of view test (UFOV)). Diminished driving performance at moderate BAC was observed, with no increase in predicted crash risk. Moderate correlations between driving variables were observed. No association was found between driving variables and UFOV variables. The control group improved measures of selective attention after the third assessment. No learning effect was observed from multiple sessions with the driving simulator. Our results show that a moderate BAC, although legal, increases risky behaviour. Effects of alcohol expectancy could have been displayed by the experimental group. UFOV measures and predicted crash risk categories were not sensitive enough to predict crash risk for young drivers, even when intoxicated. PMID:25723618
Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S
2018-04-01
The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete events microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++), and more efficient algorithms, along with hardware advances, have increased program efficiency permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.
Shih, Hsiu-Ching; Crawford-Brown, Douglas; Ma, Hwong-wen
2015-03-15
Assessment of the ability of climate policies to produce desired improvements in public health through co-benefits of air pollution reduction can consume considerable resources in both time and research funds. These resources increase significantly as the spatial resolution of models increases. In addition, the level of spatial detail available in macroeconomic models at the heart of climate policy assessments is much lower than that available in traditional human health risk modeling. It is therefore important to determine whether increasing spatial resolution considerably affects risk-based decisions; which kinds of decisions might be affected; and under what conditions they will be affected. Human health risk co-benefits from carbon emissions reductions that bring about concurrent reductions in particulate matter (PM10) emissions are therefore examined here at four levels of spatial resolution (Uniform Nation, Uniform Region, Uniform County/city, Health Risk Assessment) in a case study of Taiwan as one of the geographic regions of a global macroeconomic model, with results that are representative of small, industrialized nations within that global model. A metric of human health risk mortality (YOLL, years of life lost in life expectancy) is compared under assessments ranging from a "uniform simulation", in which there is no spatial resolution of changes in ambient air concentration under a policy, to a "highly spatially resolved simulation" (called here Health Risk Assessment). PM10 is chosen in this study as the indicator of air pollution for which risks are assessed due to its significance as a co-benefit of carbon emissions reductions within climate mitigation policy. For the policy examined, the four estimates of mortality in the entirety of Taiwan are 747 YOLL, 834 YOLL, 984 YOLL and 916 YOLL under Uniform Taiwan, Uniform Region, Uniform County and Health Risk Assessment, respectively; or differences of 18%, 9% and 7% if the HRA methodology is taken as the baseline. While these differences are small compared to uncertainties in health risk assessment more generally, the ranks of different regions and of emissions categories as the focus of regulatory efforts estimated at these four levels of spatial resolution are quite different. The results suggest that issues of risk equity within a nation might be missed at the lower levels of spatial resolution, suggesting that low-resolution models are suited to calculating national cost-benefit ratios but not as well suited to assessing co-benefits of climate policies reflecting intersubject variability in risk, or to identifying sub-national regions and emissions sectors on which to focus attention (although even here, the errors introduced by low spatial resolution are generally less than 40%). Copyright © 2014 Elsevier Ltd. All rights reserved.
Cancer risk coefficient for patient undergoing kyphoplasty surgery using Monte Carlo method
NASA Astrophysics Data System (ADS)
Santos, Felipe A.; Santos, William S.; Galeano, Diego C.; Cavalcante, Fernanda R.; Silva, Ademir X.; Souza, Susana O.; Júnior, Albérico B. Carvalho
2017-11-01
Kyphoplasty surgery is widely used for pain relief in patients with vertebral compression fracture (VCF). For this surgery, an X-ray emitter that provides real-time imaging is employed to guide the medical instruments and the surgical cement used to fill and strengthen the vertebra. Equivalent and effective doses related to such high-temporal-resolution equipment have been studied to assess the damage and, more recently, the cancer risk. For this study, a virtual scenario was prepared using the MCNPX code and a pair of UF family simulators. Two projections, with seven tube voltages each, were simulated. The organs in the abdominal region were those with the highest cancer risk because they receive the primary beam. The risk of lethal cancer is on average 20% higher in the AP projection than in the LL projection. This study aims at estimating the risk of cancer in organs and the risk of lethal cancer for patients undergoing kyphoplasty surgery.
Topping, Christopher John; Kjaer, Lene Jung; Hommen, Udo; Høye, Toke Thomas; Preuss, Thomas G; Sibly, Richard M; van Vliet, Peter
2014-07-01
Current European Union regulatory risk assessment allows application of pesticides provided that recovery of nontarget arthropods in-crop occurs within a year. Despite the long-established theory of source-sink dynamics, risk assessment ignores depletion of surrounding populations and typical field trials are restricted to plot-scale experiments. In the present study, the authors used agent-based modeling of 2 contrasting invertebrates, a spider and a beetle, to assess how the area of pesticide application and environmental half-life affect the assessment of recovery at the plot scale and impact the population at the landscape scale. Small-scale plot experiments were simulated for pesticides with different application rates and environmental half-lives. The same pesticides were then evaluated at the landscape scale (10 km × 10 km) assuming continuous year-on-year usage. The authors' results show that recovery time estimated from plot experiments is a poor indicator of long-term population impact at the landscape level and that the spatial scale of pesticide application strongly determines population-level impact. This raises serious doubts as to the utility of plot-recovery experiments in pesticide regulatory risk assessment for population-level protection. Predictions from the model are supported by empirical evidence from a series of studies carried out in the decade starting in 1988. The issues raised then can now be addressed using simulation. Prediction of impacts at landscape scales should be more widely used in assessing the risks posed by environmental stressors. © 2014 SETAC.
Risk assessment of watershed erosion at Naesung Stream, South Korea.
Ji, Un; Velleux, Mark; Julien, Pierre Y; Hwang, Manha
2014-04-01
A three-tiered approach was used to assess erosion risks within the Nakdong River Basin in South Korea and included: (1) a screening based on topography and land use; (2) a lumped parameter analysis using RUSLE; and (3) a detailed analysis using TREX, a fully distributed watershed model. These tiers span a range of spatial and temporal scales, with each tier providing increasing detail and resolution. The first two tiers were applied to the entire Nakdong River Basin and the Naesung Stream watershed was identified as having the highest soil erosion risk and potential for sedimentation problems. For the third tier, the TREX watershed model simulated runoff, channel flow, soil erosion, and stream sediment transport in the Naesung Stream watershed at very high resolution. TREX was calibrated for surface flows and sediment transport, and was used to simulate conditions for a large design storm. Highly erosive areas were identified along ridgelines in several headwater areas, with the northeast area of Songriwon having a particularly high erosion potential. Design storm simulations also indicated that sediment deposition of up to 55 cm could occur. Copyright © 2014 Elsevier Ltd. All rights reserved.
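For the lumped-parameter tier, RUSLE is a simple product of standard factors; a minimal sketch with hypothetical factor values is shown below.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """RUSLE: A = R * K * LS * C * P, average annual soil loss per unit area,
    valid when the factors are supplied in mutually consistent units."""
    return R * K * LS * C * P

# Hypothetical factor values for a single subwatershed cell
A = rusle_soil_loss(R=4500,    # rainfall-runoff erosivity
                    K=0.030,   # soil erodibility
                    LS=1.8,    # slope length and steepness
                    C=0.12,    # cover management
                    P=1.0)     # support practice (none)
print(f"estimated soil loss: {A:.1f} t/ha/yr")
```

Applying such a calculation cell by cell over a basin is what allows a lumped screening tier to flag watersheds, like Naesung Stream here, for the more detailed distributed (TREX) analysis.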
The Food Quality Protection Act (FQPA) demands that exposure of infants and children to pesticide residues from non-dietary sources be included in EPA's aggregate risk assessment. Ideally, the informed assessment would aggregate exposures from all reasonable sources, primarily ...
Roquelaure, Yves; Fouquet, Natacha; Chazelle, Emilie; Descatha, Alexis; Evanoff, Bradley; Bodin, Julie; Petit, Audrey
2018-04-02
Carpal tunnel syndrome (CTS) is the most common nerve entrapment neuropathy in the working-age population. The reduction of CTS incidence in the workforce is a priority for policy makers due to the human, social and economic costs. To assess the theoretical impact of workplace-based primary interventions designed to reduce exposure to personal and/or work-related risk factors for CTS. Surgical CTS were assessed using regional hospital discharge records for persons aged 20-59 in 2009. Using work-related attributable fractions (AFEs), we estimated the number of work-related CTS (WR-CTS) in high-risk jobs. We simulated three theoretical scenarios of workplace-based primary prevention for jobs at risk: a mono-component work-centered intervention reducing the incidence of WR-CTS arbitrarily by 10% (10%-WI), and multicomponent global interventions reducing the incidence of all surgical CTS by 5% and 10% by targeting personal and work risk factors. A limited proportion of CTS were work-related in the region's population. WR-CTS were concentrated in nine jobs at high risk of CTS, amounting to 1603 [1137-2212] CTS, of which 906 [450-1522] were WR-CTS. The 10%-WI, 5%-GI and 10%-GI hypothetically prevented 90 [46-153], 81 [58-111] and 159 [114-223] CTS, respectively. The 10%-GI had the greatest impact regardless of the job. The impact of the 10%-WI interventions was high only in jobs at highest risk and AFEs (e.g. food industry jobs). The 10%-WI and 5%-GI had a similar impact for moderate-risk jobs (e.g. healthcare jobs). The impact of simulated workplace-based interventions suggests that prevention efforts to reduce exposure to work-related risk factors should focus on high-risk jobs. Reducing CTS rates will also require integrated strategies to reduce personal risk factors, particularly in jobs with low levels of work-related risk of CTS.
Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H
2014-04-01
There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations for which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10(6) Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
Shen, Jiabin; Pang, Shulan; Schwebel, David C
2016-06-01
Unintentional drowning is the most common cause of childhood death in rural China. Global intervention efforts offer mixed results regarding the efficacy of educational programs. Using a randomized controlled design, we evaluated a testimonial-based intervention to reduce drowning risk among 280 3rd- and 4th-grade rural Chinese children. Children were randomly assigned to view either testimonials on drowning risk (intervention) or dog-bite risk (control). Safety knowledge and perceived vulnerability were measured by self-report questionnaires, and simulated behaviors in and near water were assessed with a culturally appropriate dollhouse task. Children in the intervention group showed improved safety knowledge and simulated behaviors, but not perceived vulnerability, compared with controls. The testimonial-based intervention's efficacy appears promising, as it improved safety knowledge and simulated risk behaviors with water among rural Chinese children. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H
2007-05-01
The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.
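The exact dose-response parameterization of the FDA assessment is not restated in the abstract; the sketch below uses the common approximate beta-Poisson form with hypothetical parameters, preceded by a growth step capped at a maximal population density, to show how retail concentration, storage growth and dose-response combine.

```python
import numpy as np

def beta_poisson(dose_cfu, alpha, beta):
    """Approximate beta-Poisson dose-response: P(ill) = 1 - (1 + dose/beta)^(-alpha)."""
    return 1.0 - (1.0 + dose_cfu / beta) ** (-alpha)

def grow_to_mpd(log_conc, rate_log_per_day, days, mpd_log):
    """Exponential growth in log10 units, capped at the maximal population density."""
    return np.minimum(log_conc + rate_log_per_day * days, mpd_log)

# Hypothetical retail concentrations and one hypothetical storage scenario
log_retail = np.array([-1.4, 0.0, 2.0])     # log10 CFU/g (0.04, 1 and 100 CFU/g)
log_home   = grow_to_mpd(log_retail, rate_log_per_day=0.5, days=7, mpd_log=8.0)

serving_g = 50.0
dose = 10 ** log_home * serving_g           # CFU per serving at consumption

# Hypothetical beta-Poisson parameters for a susceptible subpopulation
p_ill = beta_poisson(dose, alpha=0.25, beta=1e10)
for c0, p in zip(log_retail, p_ill):
    print(f"retail {c0:+.1f} log CFU/g -> P(illness per serving) = {p:.2e}")
```

Even concentrations below the retail detection limit can contribute appreciable risk once storage growth toward the maximal population density is accounted for, which is the pattern the sensitivity analysis above highlights.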
Increased cognitive load leads to impaired mobility decisions in seniors at risk for falls.
Nagamatsu, Lindsay S; Voss, Michelle; Neider, Mark B; Gaspar, John G; Handy, Todd C; Kramer, Arthur F; Liu-Ambrose, Teresa Y L
2011-06-01
Successful mobility requires appropriate decision-making. Seniors with reduced executive functioning, such as senior fallers, may be prone to poor mobility judgments, especially under dual-task conditions. We classified participants as "At-Risk" and "Not-At-Risk" for falls using a validated physiological falls-risk assessment. Dual-task performance was assessed in a virtual reality environment where participants crossed a simulated street by walking on a manual treadmill while listening to music or conversing on a phone. Those "At-Risk" experienced more collisions with oncoming cars and had longer crossing times in the Phone condition compared to controls. We conclude that poor mobility judgments during a dual task lead to unsafe mobility for those at risk for falls. (c) 2011 APA, all rights reserved.
System-Level Reuse of Space Systems Simulations
NASA Technical Reports Server (NTRS)
Hazen, Michael R.; Williams, Joseph C.
2004-01-01
One of the best ways to enhance space systems simulation fidelity is to leverage off of (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is in a different coding language or other barriers arise that make one want to just start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. Case studies include (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency- provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss simulation reuse candidates based on an apparent lack of suitability. A more careful examination based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates revealed significant reuse potential. Specific steps used to conduct the detailed assessments are discussed. The steps include the following: 1) Identifying reuse candidates; 2) Requirements compatibility assessment; 3) Maturity assessment; 4) Life-cycle cost determination; and 5) Risk assessment. Observations and conclusions are presented related to the real cost of system-level simulation component reuse. Finally, lessons learned that relate to maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable for use in the development of space systems simulations in the future.
NASA Astrophysics Data System (ADS)
Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang
2010-05-01
CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification for modeling CO2 injection, and the consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
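The full probabilistic collocation machinery is beyond a short example; under strong simplifying assumptions, the sketch below conveys the core idea: fit a second-order polynomial response surface to a handful of runs of an (entirely invented) expensive model, then propagate parameter uncertainty through the cheap surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(log_perm, inj_rate):
    """Stand-in for a CO2 leakage simulator (purely hypothetical response)."""
    return np.exp(0.8 * log_perm) * inj_rate ** 1.5 / (1 + inj_rate)

# Collocation-style design: a small set of runs of the "expensive" model
log_perm_pts = np.linspace(-1.0, 1.0, 5)
inj_pts      = np.linspace(0.5, 2.0, 5)
X1, X2 = np.meshgrid(log_perm_pts, inj_pts)
x1, x2 = X1.ravel(), X2.ravel()
y = expensive_model(x1, x2)

# Second-order polynomial basis in the two parameters, fitted by least squares
basis = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)

def surrogate(a, b):
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2]) @ coef

# Cheap Monte Carlo on the surrogate: uncertain permeability, fixed design choice
a = rng.normal(0.0, 0.3, 100_000)     # uncertain log-permeability
b = np.full_like(a, 1.2)              # chosen injection rate (design variable)
leakage = surrogate(a, b)
print("mean leakage %.3f, 95th percentile %.3f"
      % (leakage.mean(), np.percentile(leakage, 95)))
```

Because both the uncertain parameter and the design variable enter the same polynomial surface, the design question (here, the injection rate) can be optimized while the uncertainty is carried along, which is the integrative-response-surface idea described above.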
Fan, Jianqing; Liao, Yuan; Shi, Xiaofeng
2014-01-01
The risk of a large portfolio is often estimated by substituting a good estimator of the volatility matrix. However, the accuracy of such a risk estimator is largely unknown. We study factor-based risk estimators with a large number of assets, and introduce a high-confidence level upper bound (H-CLUB) to assess the estimation. The H-CLUB is constructed using the confidence interval of risk estimators with either known or unknown factors. We derive the limiting distribution of the estimated risks in high dimensionality. We find that when the dimension is large, the factor-based risk estimators have the same asymptotic variance no matter whether the factors are known or not, which is slightly smaller than that of the sample covariance-based estimator. Numerically, H-CLUB outperforms the traditional crude bounds, and provides an insightful risk assessment. In addition, our simulated results quantify the relative error in the risk estimation, which is usually negligible using 3-month daily data. PMID:26195851
Hawken, Steven; Kwong, Jeffrey C; Deeks, Shelley L; Crowcroft, Natasha S; McGeer, Allison J; Ducharme, Robin; Campitelli, Michael A; Coyle, Doug; Wilson, Kumanan
2015-02-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, the excess GBS risk for influenza vaccination versus no vaccination was -0.36/1 million vaccinations (95% credible interval -1.22 to 0.28) and -0.42/1 million vaccinations (95% credible interval -3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g. influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients.
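The simulation inputs are only summarized qualitatively in the abstract; the arithmetic behind an excess-risk estimate of this kind is sketched below with hypothetical rates.

```python
# Excess GBS risk per million vaccinations (all rates hypothetical):
# risk added by the vaccine minus influenza-attributable GBS cases averted.
vaccine_gbs_risk   = 1.0      # attributable GBS cases per million vaccinations
influenza_attack   = 0.06     # seasonal influenza attack rate in the unvaccinated
vaccine_efficacy   = 0.6      # proportion of influenza cases prevented
gbs_risk_given_flu = 40.0     # GBS cases per million influenza infections

averted = influenza_attack * vaccine_efficacy * gbs_risk_given_flu
excess  = vaccine_gbs_risk - averted
print(f"excess GBS risk: {excess:+.2f} cases per million vaccinations")
```

With these invented inputs the averted influenza-related cases slightly outweigh the vaccine-attributable ones, giving a small negative excess risk of the same order as the published point estimates.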
Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B
2003-05-25
A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30 times by introducing a 2 log reduction of the number of Campylobacter on the chicken carcasses. To obtain a similar reduction of the incidence, the flock prevalence should be reduced approximately 30 times or the kitchen hygiene improved approximately 30 times. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will only have a minor influence on the risk. Finally, the simulations showed that people in the age of 18-29 years had the highest risk of developing campylobacteriosis.
2017-06-01
Chemical Transformation Simulator (CTS) was developed by the U.S. Environmental Protection Agency to provide physicochemical properties of complex...
Gedamke, Jason; Gales, Nick; Frydman, Sascha
2011-01-01
The potential for seismic airgun "shots" to cause acoustic trauma in marine mammals is poorly understood. There are just two empirical measurements of temporary threshold shift (TTS) onset levels from airgun-like sounds in odontocetes. Considering these limited data, a model was developed examining the impact of individual variability and uncertainty on risk assessment of baleen whale TTS from seismic surveys. In each of 100 simulations, 10,000 "whales" are assigned TTS onset levels accounting for inter-individual variation, uncertainty over the population's mean, and uncertainty over the weighting of odontocete data to obtain baleen whale onset levels. Randomly distributed whales are exposed to one seismic survey passage, with the cumulative exposure level calculated. In the base scenario, 29% of whales (5th/95th percentiles of 10%/62%) that approached to within 1-1.2 km were exposed to levels sufficient for TTS onset. By comparison, no whales are at risk outside 0.6 km when uncertainty and variability are not considered. Potentially "exposure altering" parameters (movement, avoidance, surfacing, and effective quiet) were also simulated. Until more research refines model inputs, the results suggest a reasonable likelihood that whales at a kilometer or more from seismic surveys could potentially be susceptible to TTS and demonstrate the large impact that uncertainty and variability can have on risk assessment.
Risk assessment of logistics outsourcing based on BP neural network
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Tian, Zi-you
The purpose of this article is to evaluate the risk of enterprise logistics outsourcing. To this end, the paper first analysed the main risks existing in logistics outsourcing and set up a risk evaluation index system for logistics outsourcing; it then applied a BP neural network to logistics outsourcing risk evaluation and used MATLAB for the simulation. The results showed that the network error is small and the method is highly practicable, so enterprises can use it to evaluate the risks of logistics outsourcing.
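The sketch below shows a minimal back-propagation (BP) network of the kind described, trained on synthetic risk-index data with NumPy rather than MATLAB; the indicator set, architecture and training data are assumptions for illustration only.

```python
# Minimal back-propagation neural network sketch: mapping a logistics-outsourcing
# risk-index vector to a risk score. Data, weights and architecture are illustrative.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(200, 6))            # 6 normalized risk indicators per case
y = (X @ np.array([0.3, 0.2, 0.15, 0.15, 0.1, 0.1]))[:, None]  # synthetic target score

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; train with plain gradient descent on mean squared error
W1, b1 = rng.normal(scale=0.5, size=(6, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
lr = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                    # forward pass
    out = sigmoid(h @ W2 + b2)
    err = out - y                               # backward pass (MSE gradient)
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

print("training MSE:", float(np.mean((out - y) ** 2)))
```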
Simulating spatially and temporally related fire weather
Isaac C. Grenfell; Mark Finney; Matt Jolly
2010-01-01
Use of fire behavior models has assumed an increasingly important role for managers of wildfire incidents to make strategic decisions. For fire risk assessments and danger rating at very large spatial scales, these models depend on fire weather variables or fire danger indices. Here, we describe a method to simulate fire weather at a national scale that captures the...
Simulation Technology for Skills Training and Competency Assessment in Medical Education
Obeso, Vivian T.; Issenberg, S. Barry
2007-01-01
Medical education during the past decade has witnessed a significant increase in the use of simulation technology for teaching and assessment. Contributing factors include: changes in health care delivery and academic environments that limit patient availability as educational opportunities; worldwide attention focused on the problem of medical errors and the need to improve patient safety; and the paradigm shift to outcomes-based education with its requirements for assessment and demonstration of competence. The use of simulators addresses many of these issues: they can be readily available at any time and can reproduce a wide variety of clinical conditions on demand. In lieu of the customary (and arguably unethical) system, whereby novices carry out the practice required to master various techniques—including invasive procedures—on real patients, simulation-based education allows trainees to hone their skills in a risk-free environment. Evaluators can also use simulators for reliable assessments of competence in multiple domains. For those readers less familiar with medical simulators, this article aims to provide a brief overview of these educational innovations and their uses; for decision makers in medical education, we hope to broaden awareness of the significant potential of these new technologies for improving physician training and assessment, with a resultant positive impact on patient safety and health care outcomes. PMID:18095044
Airoldi, Edoardo M.; Bai, Xue; Malin, Bradley A.
2011-01-01
We live in an increasingly mobile world, which leads to the duplication of information across domains. Though organizations attempt to obscure the identities of their constituents when sharing information for worthwhile purposes, such as basic research, the uncoordinated nature of such environments can lead to privacy vulnerabilities. For instance, disparate healthcare providers can collect information on the same patient. Federal policy requires that such providers share “de-identified” sensitive data, such as biomedical (e.g., clinical and genomic) records. But at the same time, such providers can share identified information, devoid of sensitive biomedical data, for administrative functions. On a provider-by-provider basis, the biomedical and identified records appear unrelated; however, links can be established when multiple providers’ databases are studied jointly. The problem, known as trail disclosure, is a generalized phenomenon and occurs because an individual’s location access pattern can be matched across the shared databases. Due to technical and legal constraints, it is often difficult to coordinate between providers, and thus it is critical to assess the disclosure risk in distributed environments so that we can develop techniques to mitigate such risks. Research on privacy protection has so far focused on developing technologies to suppress or encrypt identifiers associated with sensitive information. There is a growing body of work on the formal assessment of the disclosure risk of database entries in publicly shared databases, but less attention has been paid to the distributed setting. In this research, we review the trail disclosure problem in several domains with known vulnerabilities and show that disclosure risk is influenced by the distribution of how people visit service providers. Based on empirical evidence, we propose an entropy metric for assessing such risk in shared databases prior to their release. This metric assesses risk by leveraging the statistical characteristics of a visit distribution, as opposed to person-level data. It is computationally efficient and superior to existing risk assessment methods, which rely on ad hoc assessments that are often computationally expensive and unreliable. We evaluate our approach on a range of location access patterns in simulated environments. Our results demonstrate that the approach is effective at estimating trail disclosure risks and that the amount of self-information contained in a distributed system is one of the main driving factors. PMID:21647242
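The sketch below shows one plausible entropy-style calculation over a provider visit distribution; the exact metric in the paper may be formulated differently, so treat this as an illustration of the idea rather than the published formula.

```python
# Sketch of an entropy-style measure over a visit distribution: given the share of visits
# each provider receives, Shannon entropy summarizes how evenly patients are spread across
# providers; a lower value suggests more distinctive, and hence more re-identifiable, trails.
import numpy as np

def visit_entropy(visit_counts):
    """Shannon entropy (bits) of a provider visit distribution."""
    p = np.asarray(visit_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

print(visit_entropy([100, 100, 100, 100]))   # even spread -> maximal entropy (2.0 bits)
print(visit_entropy([370, 10, 10, 10]))      # concentrated visits -> lower entropy
```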
NASA Astrophysics Data System (ADS)
Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.
2018-02-01
Estimating the efficiency and sustainability of geological subsurface utilization, i.e., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach, considering the coupled processes involved, among them the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Due to the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. In this way, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric
2012-08-01
Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM)-a methodological approach in which computer-generated artificial societies simulate human sexual networks-to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.
Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard
2013-01-01
Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and further authorisation of plant protection products, may also be applied to assess the health risk for operators when faulty sprayers are used. The aim was to evaluate, by means of computer modelling, the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e., the toxicological end-point on which the risk assessment was based, was the no observed adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions such as pesticide concentration in the plant protection product and type of the sprayed crop, as well as the number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed by individuals. Data used for computer modelling were obtained from simulated, pesticide-substitute treatments using variously damaged knapsack sprayers. These substitute preparations consisted of markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated according to operator dosimetry exposure under 'field' conditions for low-level, medium and high target field crops. The exposure modelling in the high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage or whether individual protective measures had been adopted. For low-level and medium field crops, exceedances ranged between 40% and 80% of cases. The computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to planning the quality checks and maintenance systems for this equipment.
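A highly simplified sketch of the exposure-versus-AOEL comparison is shown below; the absorption factor, body weight, AOEL and exposure values are placeholder numbers, not the UK POEM defaults used in the study.

```python
# Simplified sketch of an operator exposure-vs-AOEL check. All numerical inputs are
# hypothetical placeholders chosen for illustration.
def operator_exposure_ratio(dermal_mg, inhal_mg, dermal_absorption=0.1,
                            body_weight_kg=60.0, aoel_mg_per_kg=0.02):
    """Systemic exposure (mg/kg bw/day) as a fraction of the AOEL; >1 means exceedance."""
    systemic = (dermal_mg * dermal_absorption + inhal_mg) / body_weight_kg
    return systemic / aoel_mg_per_kg

# Hypothetical comparison: intact sprayer vs leaking sprayer in a high target crop
print(operator_exposure_ratio(dermal_mg=5.0, inhal_mg=0.05))    # ~0.46 of the AOEL
print(operator_exposure_ratio(dermal_mg=20.0, inhal_mg=0.2))    # ~1.8 -> AOEL exceeded
```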
Human-simulation-based learning to prevent medication error: A systematic review.
Sarfati, Laura; Ranchon, Florence; Vantard, Nicolas; Schwiertz, Vérane; Larbre, Virginie; Parat, Stéphanie; Faudel, Amélie; Rioufol, Catherine
2018-01-31
In the past 2 decades, there has been an increasing interest in simulation-based learning programs to prevent medication error (ME). To improve knowledge, skills, and attitudes in prescribers, nurses, and pharmaceutical staff, these methods enable training without directly involving patients. However, best practices for simulation for healthcare providers are as yet undefined. By analysing the current state of experience in the field, the present review aims to assess whether human simulation in healthcare helps to reduce ME. A systematic review was conducted on Medline from 2000 to June 2015, associating the terms "Patient Simulation," "Medication Errors," and "Simulation Healthcare." Reports of technology-based simulation were excluded, to focus exclusively on human simulation in nontechnical skills learning. Twenty-one studies assessing simulation-based learning programs were selected, focusing on pharmacy, medicine or nursing students, or concerning programs aimed at reducing administration or preparation errors, managing crises, or learning communication skills for healthcare professionals. The studies varied in design, methodology, and assessment criteria. Few demonstrated that simulation was more effective than didactic learning in reducing ME. This review highlights a lack of long-term assessment and real-life extrapolation, with limited scenarios and participant samples. These various experiences, however, help in identifying the key elements required for an effective human simulation-based learning program for ME prevention: ie, scenario design, debriefing, and perception assessment. The performance of these programs depends on their ability to reflect reality and on professional guidance. Properly regulated simulation is a good way to train staff in events that happen only exceptionally, as well as in standard daily activities. By integrating human factors, simulation seems to be effective in preventing iatrogenic risk related to ME, if the program is well designed. © 2018 John Wiley & Sons, Ltd.
A radiation-free mixed-reality training environment and assessment concept for C-arm-based surgery.
Stefan, Philipp; Habert, Séverine; Winkler, Alexander; Lazarovici, Marc; Fürmetz, Julian; Eck, Ulrich; Navab, Nassir
2018-06-25
The discrepancy of continuously decreasing opportunities for clinical training and assessment and the increasing complexity of interventions in surgery has led to the development of different training and assessment options like anatomical models, computer-based simulators or cadaver trainings. However, trainees, following training, assessment and ultimately performing patient treatment, still face a steep learning curve. To address this problem for C-arm-based surgery, we introduce a realistic radiation-free simulation system that combines patient-based 3D printed anatomy and simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study with six surgical experts performing a facet joint injection on the simulator. In a technical evaluation, we show that our system simulates X-ray images accurately with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation, the usefulness of the system for assessment and strong agreement with the usefulness of such a mixed-reality system for training of novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gather preliminary evidence for validity. The proposed mixed-reality simulation system facilitates a transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment and to reduce the risk for errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessments scores indicative of surgical proficiency with sufficient evidence for validity.
Accurately quantifying human exposures and doses of various populations to environmental pollutants is critical for the Agency to assess and manage human health risks. For example, the Food Quality Protection Act of 1996 (FQPA) requires EPA to consider aggregate human exposure ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pentz, David L.; Stoll, Ralph H.; Greeves, John T.
2012-07-01
The PRISM (Prioritization Risk Integration Simulation Model) computer model was developed to support the Department of Energy's Office of Environmental Management (DOE-EM) in its mission to clean up the environmental legacy from the Nation's nuclear weapons materials production complex. PRISM provides a comprehensive, fully integrated planning tool that can tie together DOE-EM's projects. It is designed to help DOE managers develop sound, risk-informed business practices and defend program decisions, and it provides a better ability to understand and manage programmatic risks. The underlying concept for PRISM is that DOE-EM 'owns' a portfolio of environmental legacy obligations (ELOs), and that its mission is to transform the ELOs from their current conditions to acceptable conditions in the most effective way possible. There are many types of ELOs: contaminated soils and groundwater plumes, disused facilities awaiting D and D, and various types of wastes waiting for processing or disposal. For a given suite of planned activities, PRISM simulates the outcomes as they play out over time, allowing for all key identified uncertainties and risk factors. Each contaminated building, land area and waste stream is tracked from cradle to grave, and all of the linkages affecting different waste streams are captured. The progression of the activities is fully dynamic, reflecting DOE-EM's prioritization approaches, precedence requirements, available funding, and the consequences of risks and uncertainties. The top level of PRISM is the end-user interface that allows rapid evaluation of alternative scenarios and viewing the results in a variety of useful ways. PRISM is a fully probabilistic model, allowing the user to specify uncertainties in input data (such as the magnitude of an existing groundwater plume, or the total cost to complete a planned activity) as well as specific risk events that might occur. PRISM is based on the GoldSim software that is widely used for risk and performance assessment calculations. PRISM can be run in a deterministic mode, which quickly provides an estimate of the most likely results of a given plan. Alternatively, the model can be run probabilistically in a Monte Carlo mode, exploring the risks and uncertainties in the system and producing probability distributions for the different performance measures. The PRISM model demonstrates how EM can evaluate a portfolio of ELOs and transform them from their current conditions to acceptable conditions using different strategic approaches. This scope of work for the PRISM process and the development of a dynamic simulation model are a logical extension of the GoldSim simulation software used by OCRWM to assess the long-term performance for the Yucca Mountain Project and by NNSA to assess project risk at its sites. Systems integration modeling will promote better understanding of all project risks, technical and nontechnical, and more defensible decision-making for complex projects with significant uncertainties. It can provide effective visual communication and rapid adaptation during interactions with stakeholders (Administration, Congress, State, Local, and NGO). It will also allow rapid assessment of alternative management approaches. (authors)
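The toy sketch below conveys the probabilistic portfolio idea (uncertain cost and duration per environmental legacy obligation, propagated by Monte Carlo); it is not the GoldSim-based PRISM model, and the ELO list and distributions are invented for illustration.

```python
# Conceptual Monte Carlo sketch of a portfolio of environmental legacy obligations (ELOs):
# each ELO has an uncertain cost and duration, and sampling yields distributions of total
# cost and completion time. All entries and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
elos = [  # (name, most-likely cost $M, cost spread, most-likely duration yr, duration spread)
    ("groundwater plume", 120, 0.30, 12, 0.25),
    ("facility D&D",       80, 0.20,  6, 0.20),
    ("waste processing",  200, 0.40, 15, 0.35),
]

n = 20_000
total_cost = np.zeros(n)
finish_year = np.zeros(n)
for _, cost, cost_cv, dur, dur_cv in elos:
    total_cost += rng.lognormal(np.log(cost), cost_cv, n)
    # assume obligations are worked in parallel: completion is governed by the slowest
    finish_year = np.maximum(finish_year, rng.lognormal(np.log(dur), dur_cv, n))

print("median / 90th-pct cost ($M):", np.percentile(total_cost, [50, 90]).round(1))
print("median / 90th-pct completion (yr):", np.percentile(finish_year, [50, 90]).round(1))
```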
Stanton, Neville A; Harvey, Catherine
2017-02-01
Risk assessments in Sociotechnical Systems (STS) tend to be based on error taxonomies, yet the term 'human error' does not sit easily with STS theories and concepts. A new break-link approach was proposed as an alternative risk assessment paradigm to reveal the effect of information communication failures between agents and tasks on the entire STS. A case study of the training of a Royal Navy crew detecting a low flying Hawk (simulating a sea-skimming missile) is presented using EAST to model the Hawk-Frigate STS in terms of social, information and task networks. By breaking 19 social links and 12 task links, 137 potential risks were identified. Discoveries included revealing the effect of risk moving around the system; reducing the risks to the Hawk increased the risks to the Frigate. Future research should examine the effects of compounded information communication failures on STS performance. Practitioner Summary: The paper presents a step-by-step walk-through of EAST to show how it can be used for risk assessment in sociotechnical systems. The 'broken-links' method takes a systemic, rather than taxonomic, approach to identify information communication failures in social and task networks.
Assessing the Ability of a VR-Based Assembly Task Simulation to Evaluate Physical Risk Factors.
Pontonnier, Charles; Samani, Afshin; Badawi, Marwan; Madeleine, Pascal; Dumont, Georges
2014-05-01
Nowadays the process of workstation design tends to include assessment steps in a virtual environment (VE) to evaluate the ergonomic features. These approaches are cost-effective and convenient since working directly on the digital mock-up in a VE is preferable to constructing a real physical mock-up in a real environment (RE). This study aimed at understanding the ability of a VR-based assembly tasks simulator to evaluate physical risk factors in ergonomics. Sixteen subjects performed simplified assembly tasks in RE and VE. Motion of the upper body and five muscle electromyographic activities were recorded to compute normalized and averaged objective indicators of discomfort, that is, rapid upper limb assessment score, averaged muscle activations, and total task time. Rated perceived exertion (RPE) and a questionnaire were used as subjective indicators of discomfort. The timing regime and complexity of the assembly tasks were investigated as within-subject factors. The results revealed significant differences between measured indicators in RE and VE. While objective measures indicated lower activity and exposure in VE, the subjects experienced more discomfort than in RE. Fairly good correlation levels were found between RE and VE for six of the objective indicators. This study clearly demonstrates that ergonomic studies of assembly tasks using VR are still challenging. Indeed, objective and subjective measurements of discomfort that are usually used in ergonomics to minimize the risks of work-related musculoskeletal disorders development exhibit opposite trends in RE and VE. Nevertheless, the high level of correlation found during this study indicates that the VR-based simulator can be used for such assessments.
Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin
2015-02-01
When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
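The toy simulation below contrasts the two modeling choices for a single inactivation-then-growth pathway; all parameters are illustrative, but it tends to reproduce the qualitative finding that the concentration-based approach yields a higher mean risk than the count-based one.

```python
# Toy comparison: propagate contamination through inactivation then growth, once on a
# continuous concentration and once on integer bacterial numbers. Values are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
serving_g = 25.0
log_reduction, log_growth = 4.0, 3.0              # drastic inactivation then regrowth

# Concentration-based: treat CFU/g as a continuous quantity throughout
conc0 = 10 ** rng.normal(0.0, 0.5, n)             # initial CFU/g
dose_conc = conc0 * 10 ** (log_growth - log_reduction) * serving_g

# Number-based: discretize to integer CFU per serving; inactivation kills cells binomially
count0 = rng.poisson(conc0 * serving_g)
survivors = rng.binomial(count0, 10 ** (-log_reduction))
dose_count = survivors * 10 ** log_growth         # each surviving cell grows by the same factor

risk_per_cfu = 1e-3                                # toy exponential dose-response
print("mean risk, concentration model:", (1 - np.exp(-risk_per_cfu * dose_conc)).mean())
print("mean risk, number model:       ", (1 - np.exp(-risk_per_cfu * dose_count)).mean())
```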
NASA Astrophysics Data System (ADS)
Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta
2014-05-01
Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where this problem is exacerbated by the lack of appropriate groundwater resources management and with serious potential impacts from projected climate changes. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.
NASA Astrophysics Data System (ADS)
Salis, M.; Ager, A.; Arca, B.; Finney, M.; Bacciu, V. M.; Spano, D.; Duce, P.
2012-12-01
Spatial and temporal patterns of fire spread and behavior are dependent on interactions among climate, topography, vegetation and fire suppression efforts (Pyne et al. 1996; Viegas 2006; Falk et al. 2007). Humans also play a key role in determining frequency and spatial distribution of ignitions (Bar Massada et al, 2011), and thus influence fire regimes as well. The growing incidence of catastrophic wildfires has led to substantial losses for important ecological and human values within many areas of the Mediterranean basin (Moreno et al. 1998; Mouillot et al. 2005; Viegas et al. 2006a; Riaño et al. 2007). The growing fire risk issue has led to many new programs and policies of fuel management and risk mitigation by environmental and fire agencies. However, risk-based methodologies to help identify areas characterized by high potential losses and prioritize fuel management have been lacking for the region. Formal risk assessment requires the joint consideration of likelihood, intensity, and susceptibility, the product of which estimates the chance of a specific loss (Brillinger 2003; Society of Risk Analysis, 2006). Quantifying fire risk therefore requires estimates of a) the probability of a specific location burning at a specific intensity, and b) the resulting change in financial or ecological value (Finney 2005; Scott 2006). When large fires are the primary cause of damage, the application of this risk formulation requires modeling fire spread to capture landscape properties that affect burn probability. Recently, the incorporation of large fire spread into risk assessment systems has become feasible with the development of high performance fire simulation systems (Finney et al. 2011) that permit the simulation of hundreds of thousands of fires to generate fine scale maps of burn probability, flame length, and fire size, while considering the combined effects of weather, fuels, and topography (Finney 2002; Andrews et al. 2007; Ager and Finney 2009; Finney et al. 2009; Salis et al. 2012 accepted). In this work, we employed wildfire simulation methods to quantify wildfire exposure to human and ecological values for the island of Sardinia, Italy. The work was focused on the risk and exposure posed by large fires (e.g. 100 - 10,000 ha), and considered historical weather, ignition patterns and fuels. We simulated 100,000 fires using burn periods that replicated the historical size distribution on the Island, and an ignition probability grid derived from historic ignition data. We then examined spatial variation in three exposure components (burn probability, flame length, fire size) among important human and ecological values. The results allowed us to contrast exposure among and within the various features examined, and highlighted the importance of human factors in shaping wildfire exposure in Sardinia. The work represents the first application of burn probability modeling in the Mediterranean region, and sets the stage for expanded work in the region to quantify risk from large fires.
County-Level Climate Uncertainty for Risk Assessments: Volume 1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.
This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous-areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.
Determining procedures for simulation-based training in radiology: a nationwide needs assessment.
Nayahangan, Leizl Joy; Nielsen, Kristina Rue; Albrecht-Beste, Elisabeth; Bachmann Nielsen, Michael; Paltved, Charlotte; Lindorff-Larsen, Karen Gilboe; Nielsen, Bjørn Ulrik; Konge, Lars
2018-06-01
New training modalities such as simulation are widely accepted in radiology; however, development of effective simulation-based training programs is challenging. They are often unstructured and based on convenience or coincidence. The study objective was to perform a nationwide needs assessment to identify and prioritize technical procedures that should be included in a simulation-based curriculum. A needs assessment using the Delphi method was completed among 91 key leaders in radiology. Round 1 identified technical procedures that radiologists should learn. Round 2 explored frequency of procedure, number of radiologists performing the procedure, risk and/or discomfort for patients, and feasibility for simulation. Round 3 was elimination and prioritization of procedures. Response rates were 67 %, 70 % and 66 %, respectively. In Round 1, 22 technical procedures were included. Round 2 resulted in pre-prioritization of procedures. In round 3, 13 procedures were included in the final prioritized list. The three highly prioritized procedures were ultrasound-guided (US) histological biopsy and fine-needle aspiration, US-guided needle puncture and catheter drainage, and basic abdominal ultrasound. A needs assessment identified and prioritized 13 technical procedures to include in a simulation-based curriculum. The list may be used as guide for development of training programs. • Simulation-based training can supplement training on patients in radiology. • Development of simulation-based training should follow a structured approach. • The CAMES Needs Assessment Formula explores needs for simulation training. • A national Delphi study identified and prioritized procedures suitable for simulation training. • The prioritized list serves as guide for development of courses in radiology.
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Development of optimization-based probabilistic earthquake scenarios for the city of Tehran
NASA Astrophysics Data System (ADS)
Zolfaghari, M. R.; Peyghaleh, E.
2016-01-01
This paper presents the methodology, and a practical example, for applying an optimization process to select earthquake scenarios that best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events, consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking into account the probabilistic seismic hazard for Tehran city as the main constraints. The sensitivity of the selected scenarios to the user-specified site/return period error-weight is also assessed. The methodology can substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet it requires far less computational power. The authors have used this approach for risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
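As a simplified stand-in for the scenario-reduction step (a non-negative least-squares fit rather than the paper's mixed-integer linear program), the sketch below selects and weights a small candidate set so that it approximately reproduces a full-catalogue hazard curve; the catalogue and intensity values are synthetic.

```python
# Simplified scenario-reduction sketch: pick a small candidate subset of synthetic events
# and fit non-negative weights so the reduced set reproduces the full-catalogue hazard
# (exceedance) curve. This is a stand-in for the paper's MILP, not its formulation.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)
years, n_events = 10_000, 84_000
site_intensity = rng.lognormal(mean=-2.0, sigma=0.9, size=n_events)   # e.g., PGA (g)
thresholds = np.logspace(-2, 0, 20)

# Full-catalogue hazard curve: annual rate of exceeding each intensity threshold
full_rate = (site_intensity[None, :] > thresholds[:, None]).sum(axis=1) / years

# Candidate reduced set: a coarse sample spanning the upper part of the intensity range
candidates = np.quantile(site_intensity, np.linspace(0.5, 0.9999, 60))
A = (candidates[None, :] > thresholds[:, None]).astype(float)          # exceedance matrix
weights, _ = nnls(A, full_rate)                                         # per-scenario annual rates

kept = weights > 0
print(f"{kept.sum()} scenarios retained out of {len(candidates)} candidates")
print("max abs hazard-curve error:", np.abs(A @ weights - full_rate).max())
```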
Assessing the potential of economic instruments for managing drought risk at river basin scale
NASA Astrophysics Data System (ADS)
Pulido-Velazquez, M.; Lopez-Nicolas, A.; Macian-Sorribes, H.
2015-12-01
Economic instruments work as incentives to adapt individual decisions to collectively agreed goals. Different types of economic instruments have been applied to manage water resources, such as water-related taxes and charges (water pricing, environmental taxes, etc.), subsidies, markets or voluntary agreements. Hydroeconomic models (HEM) provide useful insight into optimal strategies for coping with droughts by simultaneously analysing the engineering, hydrology and economics of water resources management. We use HEMs to evaluate the potential of economic instruments for managing drought risk at river basin scale, considering three criteria for assessing drought risk: reliability, resilience and vulnerability. HEMs allow water scarcity costs to be calculated as the economic losses due to water deliveries below the target demands, which can be used as a vulnerability descriptor of drought risk. Two generic hydroeconomic DSS tools, SIMGAMS and OPTIGAMS (both programmed in GAMS), have been developed to evaluate water scarcity cost at river basin scale based on simulation and optimization approaches. The simulation tool SIMGAMS allocates water according to the system priorities and operating rules, and evaluates the scarcity costs using economic demand functions. The optimization tool allocates water resources to maximize net benefits (minimizing total water scarcity cost plus the operating cost of water use). SIMGAMS allows simulation of incentive water pricing policies based on water availability in the system (scarcity pricing), while OPTIGAMS is used to simulate the effect of ideal water markets by economic optimization. These tools have been applied to the Jucar river system (Spain), a highly regulated system with a high share of water use for crop irrigation (greater than 80%), where water scarcity, irregular hydrology and groundwater overdraft cause droughts to have significant economic, social and environmental consequences. An econometric model was first used to explain the variation of the production value of irrigated agriculture during droughts, assessing revenue responses to varying crop prices and water availability. Hydroeconomic approaches were then used to show the potential of economic instruments in setting incentives for more efficient management of water resources systems.
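A hedged illustration of how a scarcity cost can be derived from an economic demand function (this is not the SIMGAMS/OPTIGAMS code): the sketch integrates an assumed linear demand curve between the actual drought delivery and the target demand; all prices and volumes are placeholders.

```python
# Illustrative sketch: scarcity cost of one demand node as the foregone benefit
# when deliveries fall below the target, obtained by integrating an assumed
# linear economic demand function between actual and target deliveries.
import numpy as np

def scarcity_cost(delivery, target, p_max, p_min):
    """Foregone benefit (EUR) of delivering `delivery` instead of `target` (hm3),
    under a linear demand curve from p_max (first unit) down to p_min (marginal
    value at the full target), both in EUR per hm3."""
    q = np.linspace(delivery, target, 200)
    marginal_value = p_max + (p_min - p_max) * q / target
    # trapezoidal integration of the marginal value over the curtailed volume
    return np.sum(0.5 * (marginal_value[1:] + marginal_value[:-1]) * np.diff(q))

# Hypothetical irrigation node: target 100 hm3, drought-year delivery 65 hm3
cost = scarcity_cost(delivery=65.0, target=100.0, p_max=0.30e6, p_min=0.05e6)
print(f"scarcity cost: {cost/1e6:.2f} million EUR")
```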
A Corrosion Risk Assessment Model for Underground Piping
NASA Technical Reports Server (NTRS)
Datta, Koushik; Fraser, Douglas R.
2009-01-01
The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.
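A minimal sketch of what such a combined Monte Carlo evaluation might look like, assuming invented distributions for corrosion rate, wrap condition and required wall thickness (this is not the ARC model itself):

```python
# Illustrative Monte Carlo combination of a corrosion model, a wrap-protection
# factor, and a stress requirement for one pipe segment. All distributions and
# parameter values are assumptions for the sketch.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

wall0 = 6.0                                      # initial wall thickness, mm
age = 40.0                                       # years in service
corr_rate = rng.lognormal(np.log(0.03), 0.6, n)  # corrosion rate, mm/yr (assumed)
wrap_ok = rng.random(n) < 0.7                    # wrap intact with assumed probability 0.7
effective_rate = np.where(wrap_ok, 0.2 * corr_rate, corr_rate)

wall_remaining = wall0 - effective_rate * age
wall_required = rng.normal(2.5, 0.3, n)          # mm needed to carry pressure stress (assumed)

p_fail = np.mean(wall_remaining < wall_required)
print(f"simulated probability of failure: {p_fail:.4f}")
```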
NASA Technical Reports Server (NTRS)
Pocinki, L. S.; Kaplan, L. D.; Cornell, M. E.; Greenstone, R.
1979-01-01
A model was developed to generate quantitative estimates of the risk associated with the release of graphite fibers during fires involving commercial aircraft constructed with graphite fiber composite materials. The model was used to estimate the risk associated with accidents at several U.S. airports. These results were then combined to provide an estimate of the total risk to the nation.
Morita, M
2011-01-01
Global climate change is expected to affect future rainfall patterns. These changes should be taken into account when assessing future flooding risks. This study presents a method for quantifying the increase in flood risk caused by global climate change for use in urban flood risk management. Flood risk in this context is defined as the product of flood damage potential and the probability of its occurrence. The study uses a geographic information system-based flood damage prediction model to calculate the flood damage caused by design storms with different return periods. Estimation of the monetary damages these storms produce and their return periods are precursors to flood risk calculations. The design storms are developed from modified intensity-duration-frequency relationships generated by simulations of global climate change scenarios (e.g. CGCM2A2). The risk assessment method is applied to the Kanda River basin in Tokyo, Japan. The assessment provides insights not only into the flood risk cost increase due to global warming but also into the impact that increase may have on flood control infrastructure planning.
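The risk definition above (damage potential times probability of occurrence) can be illustrated with a short expected-annual-damage calculation over design-storm return periods; the damage figures below are hypothetical, not results from the Kanda River study.

```python
# Illustrative sketch: expected annual flood damage from design-storm damages at
# several return periods, integrated over annual exceedance probability.
import numpy as np

return_periods = np.array([5, 10, 30, 50, 100, 200])       # years
damages = np.array([0.2, 0.9, 2.4, 3.8, 6.1, 8.5]) * 1e9   # yen, hypothetical model output

aep = 1.0 / return_periods                  # annual exceedance probabilities
order = np.argsort(aep)                     # integrate from low to high probability
d, p = damages[order], aep[order]

# trapezoidal integration of damage over exceedance probability
expected_annual_damage = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
print(f"expected annual damage: {expected_annual_damage/1e9:.2f} billion yen")
```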
Simulation based planning of surgical interventions in pediatric cardiology
NASA Astrophysics Data System (ADS)
Marsden, Alison L.
2013-10-01
Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.
Land Cover as a Framework For Assessing the Risk of Water Pollution
James D. Wickham; Kurt H. Riitters; Robert V. O'Neill; Kenneth H. Reckhow; Timothy G. Wade; K. Bruce Jones
2000-01-01
A survey of numerous field studies shows that nitrogen and phosphorous export coefficients are significantly different across forest, agriculture, and urban land-cover types. We used simulations to estimate the land-cover composition at which there was a significant risk of nutrient loads representative of watersheds without forest cover. The results suggest that at...
Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A
2017-11-06
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
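A hedged sketch of the probabilistic dose comparison described above, with invented exposure distributions and an assumed NOEL and margin-of-exposure target in place of the AHED/PHED data:

```python
# Illustrative Monte Carlo sketch: sample a handler's absorbed daily dose and
# report the fraction of simulated handlers exceeding a level of concern.
# Every distribution and threshold below is an assumption, not agency data.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

unit_exposure = rng.lognormal(np.log(0.05), 1.0, n)   # mg exposure per kg a.i. handled (assumed)
amount_handled = rng.triangular(20, 60, 120, n)        # kg a.i. handled per day (assumed)
dermal_absorption = rng.uniform(0.05, 0.15, n)         # fraction absorbed (assumed)
body_weight = rng.normal(80, 12, n).clip(45, None)     # kg

dose = unit_exposure * amount_handled * dermal_absorption / body_weight  # mg/kg/day

noel = 0.25                                  # hypothetical NOEL, mg/kg/day
margin_of_exposure = noel / dose
frac_of_concern = np.mean(margin_of_exposure < 100)    # assumed MOE target of 100
print(f"fraction of simulated handlers above the level of concern: {frac_of_concern:.1%}")
```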
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster claiming lives and imposing significant economic losses on human societies worldwide. Japan, with annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high-resolution Digital Terrain Models to estimate different time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate an a priori estimate of maximum soil moisture capacity, an important parameter of the PDM. Once the model is calibrated, its performance is examined for Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 which affected Kyushu. In both cases, quantitative measures show that simulated streamflow is in good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the whole of Japan, which provides a basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
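A minimal sketch of a Temperature-Index (degree-day) snowmelt component of the kind mentioned above, with illustrative parameter values rather than the calibrated ones:

```python
# Degree-day snowmelt sketch: daily melt proportional to air temperature above a
# threshold, limited by the available snowpack. Parameters are illustrative.
import numpy as np

def temperature_index_melt(temp_c, precip_mm, ddf=3.0, t_melt=0.0, t_snow=1.0):
    """Return daily melt (mm) and snow water equivalent (mm) series.
    ddf: degree-day factor (mm per degC per day); t_melt: melt threshold (degC);
    t_snow: temperature below which precipitation accumulates as snow (degC)."""
    swe, melt_out, swe_out = 0.0, [], []
    for t, p in zip(temp_c, precip_mm):
        if t < t_snow:
            swe += p                                    # accumulate snowfall
        melt = min(swe, max(0.0, ddf * (t - t_melt)))   # degree-day melt, capped by pack
        swe -= melt
        melt_out.append(melt)
        swe_out.append(swe)
    return np.array(melt_out), np.array(swe_out)

# Toy winter-to-spring forcing: 30 cold days, then a 30-day warming ramp
temps = np.concatenate([np.full(30, -4.0), np.linspace(-2, 8, 30)])
precip = np.full(60, 5.0)
melt, swe = temperature_index_melt(temps, precip)
print(f"peak SWE: {swe.max():.0f} mm, total melt: {melt.sum():.0f} mm")
```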
Moisture Durability Assessment of Selected Well-insulated Wall Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallin, Simon B.; Boudreaux, Philip R.; Kehrer, Manfred
2015-12-01
This report presents the results from studying the hygrothermal performance of two well-insulated wall assemblies, both complying with and exceeding international building codes (IECC 2015, IRC 2015). The hygrothermal performance of walls is affected by a large number of influential parameters (e.g., outdoor and indoor climates, workmanship, material properties). This study was based on a probabilistic risk assessment in which a number of these influential parameters were simulated with their natural variability. The purpose of this approach was to generate simulation results based on laboratory chamber measurements that represent a variety of performances and thus better mimic realistic conditions. In total, laboratory measurements and 6,000 simulations were completed for five different US climate zones. A mold growth indicator (MGI) was used to estimate the risk of mold, which can potentially cause moisture durability problems in the selected wall assemblies. Analyzing the possible impact of mold on the indoor climate was not part of this study. The following conclusions can be reached from analyzing the simulation results. In a hot-humid climate, a higher R-value increases the importance of airtightness because interior wall materials are at lower temperatures. In a cold climate, indoor humidity levels increase with increased airtightness. Air leakage must be considered in a hygrothermal risk assessment, since air efficiently brings moisture into buildings from either the interior or exterior environment. The sensitivity analysis of this study identifies mitigation strategies. Again, it is important to remark that the MGI is an indicator of mold, not an indicator of indoor air quality, and that mold is the most conservative indicator for moisture durability issues.
Neves, Natália Rust; Oliva, Marco Antonio; da Cruz Centeno, Danilo; Costa, Alan Carlos; Ribas, Rogério Ferreira; Pereira, Eduardo Gusmão
2009-06-01
The Brazilian sandy coastal plain named restinga is frequently subjected to particulate and gaseous emissions from iron ore factories. These gases may come into contact with atmospheric moisture and produce acid rain. The effects of the acid rain on vegetation, combined with iron excess in the soil, can lead to the disappearance of sensitive species and a decrease in restinga biodiversity. The effects of iron ore dust deposition and simulated acid rain on photosynthesis and on antioxidant enzymes were investigated in Eugenia uniflora, a representative shrub species of the restinga. This study aimed to determine the possible utility of this species in environmental risk assessment. After the application of iron ore dust as iron solid particulate matter (SPM(Fe)) and simulated acid rain (pH 3.1), the 18-month-old plants displayed brown spots and necrosis, typical symptoms of iron toxicity and injuries caused by acid rain, respectively. The acidity of the rain intensified leaf iron accumulation, which reached phytotoxic levels, mainly in plants exposed to iron ore dust. These plants showed the lowest values for net photosynthesis, stomatal conductance, transpiration, chlorophyll a content and electron transport rate through photosystem II (PSII). Catalase and superoxide dismutase activities were decreased by simulated acid rain. Peroxidase activity and membrane injury increased following exposure to acid rain and simultaneous SPM(Fe) application. Eugenia uniflora exhibited impaired photosynthetic and antioxidative metabolism in response to combined iron and acid rain stresses. This species could become a valuable tool in environmental risk assessment in restinga areas near iron ore pelletizing factories. Non-invasive evaluations of visual injuries, photosynthesis and chlorophyll a fluorescence, as well as invasive biochemical analyses, could be used as markers.
Speciation and bioaccessibility of mercury in adobe bricks and dirt floors in Huancavelica, Peru.
Hagan, Nicole; Robins, Nicholas; Gonzales, Ruben Dario Espinoza; Hsu-Kim, Heileen
2015-04-01
Huancavelica, Peru, a historic cinnabar refining site, is one of the most mercury (Hg)-contaminated urban areas in the world. Exposure is amplified because residents build their adobe brick homes from contaminated soil. The objectives of this study were to compare two Hg-leaching procedures, and their application as risk-assessment screening tools in Hg-contaminated adobe brick homes in Huancavelica. The purpose was to evaluate potential health implications, particularly for children, after ingestion of Hg-contaminated particles. Hg was measured in adobe brick and dirt floor samples from 60 households by total Hg extraction, simulated gastric fluid (GF) extraction, and sequential selective extraction (SSE), which provides more detailed data but is resource-intensive. Most of the Hg present in samples was relatively insoluble, although in some households soluble Hg species were present at concentrations that may be of concern after ingestion. A strong correlation was identified between results from simulated GF extraction of adobe bricks and dirt floors and the more soluble fractions of Hg from SSE. Simulated GF extraction data were combined with ingestion and body mass characteristics for small children to compare potential risk of ingestion of Hg-contaminated soil with current health standards. Simulated GF extraction can be used as a risk assessment screening tool for effective allocation of time and resources to households that have measurable concentrations of bioaccessible Hg. Combining simulated GF extraction data with health standards enables intervention strategies targeted at households with the greatest potential health threat from ingestion of Hg-contaminated particles.
Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.
Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao
2012-08-01
Recent radiotherapy technologies including carbon-ion radiotherapy can improve the dose concentration in the target volume, thereby reducing not only side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, secondary cancer risk in the low-dose region is considered to be non-negligible, especially for younger patients. To achieve a dose estimation of the whole body of each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role because the treatment planning system can provide dose distribution only in/near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using the Monte Carlo code to estimate the dose and quality factor in the body and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor by using the Q(L)-L relationship based on the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. It was shown that our calculation model can estimate the measured value within a factor of 2, a discrepancy that includes not only the uncertainty of the calculation method but also uncertainties in the assumptions of the geometrical modeling and the PHITS code. We also showed the differences in the doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method. These results indicated that it is essentially important to include the dose from secondary neutrons in the assessment of the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method with a Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and an epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by the measurement.
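A small sketch of the Q(L)-L weighting referred to above, applying the ICRP 60 quality factor to absorbed dose binned by LET; the binned doses are made-up placeholders, not PHITS output.

```python
# Dose equivalent from absorbed dose weighted with the ICRP 60 quality factor Q(L).
# The LET-binned absorbed doses below are illustrative placeholders.
import numpy as np

def q_icrp60(let):
    """ICRP 60 quality factor as a function of unrestricted LET (keV/um)."""
    let = np.asarray(let, dtype=float)
    return np.where(let < 10.0, 1.0,
           np.where(let <= 100.0, 0.32 * let - 2.2, 300.0 / np.sqrt(let)))

# Hypothetical absorbed dose (Gy) scored in LET bins (keV/um) at one phantom location
let_bins = np.array([0.5, 5.0, 20.0, 60.0, 150.0, 400.0])
dose_gy  = np.array([2e-3, 1e-3, 4e-4, 1e-4, 5e-5, 1e-5])

dose_equivalent_sv = np.sum(q_icrp60(let_bins) * dose_gy)
mean_quality_factor = dose_equivalent_sv / dose_gy.sum()
print(f"H = {dose_equivalent_sv*1e3:.2f} mSv, dose-averaged Q = {mean_quality_factor:.2f}")
```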
Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi
2015-01-01
Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level with less than half at the district level. There is also a high degree of financial risk borne by district hospitals under the current fund-holding arrangement. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk currently borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
Monte Carlo Simulation of Spacecraft Particle Detectors to Assess the True Human Risk
NASA Technical Reports Server (NTRS)
O'Neill, Patrick M.
2002-01-01
Particle detectors (DOSTEL, CPDS, and TEPC) measure the energy deposition spectrum inside Earth-orbiting manned spacecraft (shuttle, space station). These instruments attempt to emulate the deposition of energy in human tissue to evaluate the health risk. However, the measurements are often difficult to relate to tissue equivalent because nuclear fragmentation (internuclear cascade/evaporation), energy-loss straggling, heavy ions, spacecraft shielding, detector geometry/orientation, and coincidence thresholds significantly affect the measured spectrum. We have developed a high-fidelity Monte Carlo model addressing each of these effects that significantly improves interpretation of these instruments and the resulting assessment of radiation risk to humans.
NASA Astrophysics Data System (ADS)
Doroszkiewicz, Joanna; Romanowicz, Renata
2016-04-01
Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and the shortcomings of the data. An important factor that has a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and, finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be very computer-time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
Cognitive deficits are associated with poorer simulated driving in older adults with heart failure
2013-01-01
Background Cognitive impairment is prevalent in older adults with heart failure (HF) and associated with reduced functional independence. HF patients appear at risk for reduced driving ability, as past work in other medical samples has shown cognitive dysfunction to be an important contributor to driving performance. The current study examined whether cognitive dysfunction was independently associated with reduced driving simulation performance in a sample of HF patients. Methods 18 persons with HF (mean age 67.72 years; SD = 8.56) completed an echocardiogram and a brief neuropsychological test battery assessing global cognitive function, attention/executive function, memory and motor function. All participants then completed the Kent Multidimensional Assessment Driving Simulation (K-MADS), a driving simulator scenario with good psychometric properties. Results The sample exhibited an average Mini Mental State Examination (MMSE) score of 27.83 (SD = 2.09). Independent-samples t-tests showed that HF patients performed worse than healthy adults on the driving simulation scenario. Finally, partial correlations showed that worse attention/executive and motor function were independently associated with poorer driving simulation performance across several indices reflective of driving ability (i.e., centerline crossings, number of collisions, % of time over the speed limit, among others). Conclusion The current findings showed that reduced cognitive function was associated with poor simulated driving performance in older adults with HF. If replicated using behind-the-wheel testing, HF patients may be at elevated risk for unsafe driving, and routine driving evaluations in this population may be warranted. PMID:24499466
Models, Tools, and Databases for Land and Waste Management Research
These publicly available resources can be used for such tasks as simulating biodegradation or remediation of contaminants such as hydrocarbons, measuring sediment accumulation at superfund sites, or assessing toxicity and risk.
Estimating the concordance probability in a survival analysis with a discrete number of risk groups.
Heller, Glenn; Mo, Qianxing
2016-04-01
A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
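For orientation, a plain Harrell-type c-index for discrete (tied) risk groups can be sketched as below; this is not the authors' modified or inverse-probability-weighted estimators, and the toy data are invented.

```python
# Simple concordance (c-index) sketch for discrete risk groups: usable pairs are
# those where the shorter observed time is an event; tied risk scores count 1/2.
import numpy as np

def c_index_discrete(time, event, risk_group):
    """time: observed times; event: 1 if the event was observed; risk_group:
    ordinal scores where a higher group is assumed to mean higher risk."""
    time, event, risk = map(np.asarray, (time, event, risk_group))
    concordant = comparable = 0.0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # pair is usable if subject i fails strictly earlier with an observed event
            if event[i] == 1 and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy data with three risk groups
t = [12, 30, 9, 25, 40, 7, 18, 33]
d = [1, 0, 1, 1, 0, 1, 1, 0]
g = [3, 1, 3, 2, 1, 3, 2, 2]
print(f"c-index: {c_index_discrete(t, d, g):.3f}")
```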
Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei
2011-12-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of risk estimation. The risk analysis indicated that, the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for adults and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly accepted. However, specific regions, such as Yellow river of Lanzhou reach and Qiantang river should be paid more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on carcinogenic risk of PAHs in source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
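A hedged sketch of the kind of Monte Carlo calculation described above, using the standard chronic-daily-intake formulation with illustrative distributions and an assumed BaP oral slope factor:

```python
# Illustrative Monte Carlo incremental lifetime cancer risk (ILCR) from ingestion
# of water containing BaP-equivalent PAHs. Distribution parameters are invented;
# the slope factor 7.3 (mg/kg-day)^-1 is a commonly cited BaP value used as an assumption.
import numpy as np

rng = np.random.default_rng(7)
n = 500_000

c_bapeq = rng.lognormal(np.log(2e-5), 0.9, n)        # mg/L BaP-equivalent (assumed)
intake  = rng.normal(1.8, 0.4, n).clip(0.5, None)    # L/day drinking water (assumed)
ef, ed  = 365.0, 30.0                                 # exposure frequency (d/yr), duration (yr)
bw      = rng.normal(62.0, 10.0, n).clip(35, None)   # body weight, kg (assumed)
at_days = 70.0 * 365.0                                # averaging time for carcinogens
sf_oral = 7.3                                         # (mg/kg-day)^-1, assumed slope factor

cdi  = c_bapeq * intake * ef * ed / (bw * at_days)    # chronic daily intake, mg/kg-day
ilcr = cdi * sf_oral

print(f"P(ILCR > 1.00E-05) = {np.mean(ilcr > 1e-5):.2%}")
```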
Nuclear Power Plant Cyber Security Discrete Dynamic Event Tree Analysis (LDRD 17-0958) FY17 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wheeler, Timothy A.; Denman, Matthew R.; Williams, R. A.
Instrumentation and control of nuclear power is transforming from analog to modern digital assets. These control systems perform key safety and security functions. This transformation is occurring in new plant designs as well as in the existing fleet of plants as the operation of those plants is extended to 60 years. This transformation introduces new and unknown issues involving both digital-asset-induced safety issues and security issues. Traditional nuclear power risk assessment tools and cyber security assessment methods have not been modified or developed to address the unique nature of cyber failure modes and of cyber security threat vulnerabilities. This Lab-Directed Research and Development project has developed a dynamic cyber-risk informed tool to facilitate the analysis of unique cyber failure modes and the time sequencing of cyber faults, both malicious and non-malicious, and impose those cyber exploits and cyber faults onto a nuclear power plant accident sequence simulator code to assess how cyber exploits and cyber faults could interact with a plant's digital instrumentation and control (DI&C) system and defeat or circumvent a plant's cyber security controls. This was achieved by coupling an existing Sandia National Laboratories nuclear accident dynamic simulator code with a cyber emulytics code to demonstrate real-time simulation of cyber exploits and their impact on automatic DI&C responses. Studying such potential time-sequenced cyber-attacks and their risks (i.e., the associated impact and the associated degree of difficulty to achieve the attack vector) on accident management establishes a technical risk-informed framework for developing effective cyber security controls for nuclear power.
NASA Astrophysics Data System (ADS)
Moya, J. L.; Skocypec, R. D.; Thomas, R. K.
1993-09-01
Over the past 40 years, Sandia National Laboratories (SNL) has been actively engaged in research to improve the ability to accurately predict the response of engineered systems to abnormal thermal and structural environments. These engineered systems contain very hazardous materials. Assessing the degree of safety/risk afforded the public and environment by these engineered systems, therefore, is of utmost importance. The ability to accurately predict the response of these systems to accidents (to abnormal environments) is required to assess the degree of safety. Before the effect of the abnormal environment on these systems can be determined, it is necessary to ascertain the nature of the environment. Ascertaining the nature of the environment, in turn, requires the ability to physically characterize and numerically simulate the abnormal environment. Historically, SNL has demonstrated the level of safety provided by these engineered systems by either of two approaches: a purely regulatory approach, or a probabilistic risk assessment (PRA). This paper will address the latter of the two approaches.
Advanced risk assessment of the effects of graphite fibers on electronic and electric equipment
NASA Technical Reports Server (NTRS)
Pocinki, L.; Cornell, M.; Kaplan, L.
1980-01-01
The risk associated with accidents involving aircraft with carbon fiber composite structural components is assessed. The individual fiber segments cause electrical and electronic equipment to fail under certain operating conditions. A Monte Carlo simulation model was used to compute the risk. Aircraft accidents with fire, release of carbon fiber material, entrainment of carbon fibers in a smoke plume, transport of fibers downwind, transfer of some fibers into the interior of buildings, failures of electrical and electronic equipment, and the economic impact of failures are discussed. Risk profiles were prepared for individual airports and the nation. The vulnerability of electrical transmission equipment to carbon fiber incursion and the total costs of aircraft accidents are also investigated.
Mitchell, Dominic; Guertin, Jason R; Dubois, Anick; Dubé, Marie-Pierre; Tardif, Jean-Claude; Iliza, Ange Christelle; Fanton-Aita, Fiorella; Matteau, Alexis; LeLorier, Jacques
2018-04-01
Statin (HMG-CoA reductase inhibitor) therapy is the mainstay dyslipidemia treatment and reduces the risk of a cardiovascular (CV) event (CVE) by up to 35%. However, adherence to statin therapy is poor. One reason patients discontinue statin therapy is musculoskeletal pain and the associated risk of rhabdomyolysis. Research is ongoing to develop a pharmacogenomics (PGx) test for statin-induced myopathy as an alternative to the current diagnosis method, which relies on creatine kinase levels. The potential economic value of a PGx test for statin-induced myopathy is unknown. We developed a lifetime discrete event simulation (DES) model for patients 65 years of age initiating a statin after a first CVE consisting of either an acute myocardial infarction (AMI) or a stroke. The model evaluates the potential economic value of a hypothetical PGx test for diagnosing statin-induced myopathy. We have assessed the model over the spectrum of test sensitivity and specificity parameters. Our model showed that a strategy with a perfect PGx test had an incremental cost-utility ratio of 4273 Canadian dollars ($Can) per quality-adjusted life year (QALY). The probabilistic sensitivity analysis shows that when the payer willingness-to-pay per QALY reaches $Can12,000, the PGx strategy is favored in 90% of the model simulations. We found that a strategy favoring patients staying on statin therapy is cost effective even if patients maintained on statin are at risk of rhabdomyolysis. Our results are explained by the fact that statins are highly effective in reducing the CV risk in patients at high CV risk, and this benefit largely outweighs the risk of rhabdomyolysis.
Ma, Irene W Y; Brindle, Mary E; Ronksley, Paul E; Lorenzetti, Diane L; Sauve, Reg S; Ghali, William A
2011-09-01
Central venous catheterization (CVC) is increasingly taught by simulation. The authors reviewed the literature on the effects of simulation training in CVC on learner and clinical outcomes. The authors searched computerized databases (1950 to May 2010), reference lists, and considered studies with a control group (without simulation education intervention). Two independent assessors reviewed the retrieved citations. Independent data abstraction was performed on study design, study quality score, learner characteristics, sample size, components of interventional curriculum, outcomes assessed, and method of assessment. Learner outcomes included performance measures on simulators, knowledge, and confidence. Patient outcomes included number of needle passes, arterial puncture, pneumothorax, and catheter-related infections. Twenty studies were identified. Simulation-based education was associated with significant improvements in learner outcomes: performance on simulators (standardized mean difference [SMD] 0.60 [95% CI 0.45 to 0.76]), knowledge (SMD 0.60 [95% CI 0.35 to 0.84]), and confidence (SMD 0.41 [95% CI 0.30 to 0.53] for studies with single-group pretest and posttest design; SMD 0.52 (95% CI 0.23 to 0.81) for studies with nonrandomized, two-group design). Furthermore, simulation-based education was associated with improved patient outcomes, including fewer needle passes (SMD -0.58 [95% CI -0.95 to -0.20]), and pneumothorax (relative risk 0.62 [95% CI 0.40 to 0.97]), for studies with nonrandomized, two-group design. However, simulation-based training was not associated with a significant reduction in risk of either arterial puncture or catheter-related infections. Despite some limitations in the literature reviewed, evidence suggests that simulation-based education for CVC provides benefits in learner and select clinical outcomes.
Ceacareanu, Alice C; Brown, Geoffrey W; Moussa, Hoda A; Wintrob, Zachary A P
2018-01-01
We aimed to estimate the metformin-associated lactic acidosis (MALA) risk by assessing retrospectively the renal clearance variability and applying a pharmacokinetic (PK) model of metformin clearance in a population diagnosed with acute myeloid leukemia (AML) and diabetes mellitus (DM). All adults with preexisting DM and newly diagnosed AML at Roswell Park Cancer Institute were reviewed (January 2003-December 2010, n = 78). Creatinine clearance (CrCl) and total body weight distributions were used in a two-compartment PK model adapted for multiple dosing and modified to account for actual intra- and inter-individual variability. Based on this renal function variability evidence, 1000 PK profiles were simulated for multiple metformin regimens with the resultant PK profiles being assessed for safe CrCl thresholds. Metformin 500 mg up to three times daily was safe for all simulated profiles with CrCl ≥25 mL/min. Furthermore, the estimated overall MALA risk was below 10%, remaining under 5% for 500 mg given once daily. CrCl ≥65.25 mL/min was safe for administration in any of the tested regimens (500 mg or 850 mg up to three times daily or 1000 mg up to twice daily). PK simulation-guided prescribing can maximize metformin's beneficial effects on cancer outcomes while minimizing MALA risk.
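As a simplified stand-in for the two-compartment model used in the study, the sketch below superposes first-order oral doses in a one-compartment model with clearance scaled to creatinine clearance; every PK parameter here is an assumption for illustration only.

```python
# Illustrative multiple-dose PK sketch (one-compartment, first-order absorption).
# All parameters (F, V, ka, the CrCl-to-clearance scaling) are assumed values.
import numpy as np

def metformin_conc(t_hr, dose_mg, tau_hr, n_doses, crcl_ml_min,
                   f=0.55, v_l=250.0, ka=0.9):
    """Plasma concentration (mg/L) from superposition of first-order oral doses.
    Total clearance is assumed proportional to creatinine clearance."""
    crcl_l_hr = crcl_ml_min * 60.0 / 1000.0      # convert mL/min to L/hr
    cl_l_hr = 4.0 * crcl_l_hr                    # assumed proportionality factor
    ke = cl_l_hr / v_l                           # elimination rate constant, 1/hr
    c = np.zeros_like(t_hr, dtype=float)
    for k in range(n_doses):
        dt = t_hr - k * tau_hr
        mask = dt > 0
        c[mask] += (f * dose_mg * ka / (v_l * (ka - ke))) * (
            np.exp(-ke * dt[mask]) - np.exp(-ka * dt[mask]))
    return c

t = np.linspace(0.0, 7 * 24.0, 1000)             # one week of hourly-ish points
c = metformin_conc(t, dose_mg=500, tau_hr=24, n_doses=7, crcl_ml_min=25)
print(f"peak concentration at CrCl 25 mL/min: {c.max():.2f} mg/L")
```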
Values of Flood Hazard Mapping for Disaster Risk Assessment and Communication
NASA Astrophysics Data System (ADS)
Sayama, T.; Takara, K. T.
2015-12-01
Flood plains provide tremendous benefits for human settlements. Since olden days people have lived with floods and attempted to control them when necessary. Modern engineering works such as embankments have enabled people to live even in flood-prone areas, and over time population and economic assets have become concentrated in these areas. In developing countries, too, rapid land-use change alters exposure and vulnerability to floods and consequently increases disaster risk. Flood hazard mapping is an essential step for any countermeasures. It has various objectives, including raising the awareness of residents, finding effective evacuation routes and estimating potential damages through flood risk mapping. Depending on the objectives and data availability, there are also many possible approaches to hazard mapping, including simulation-based, community-based and remote sensing-based approaches. In addition to traditional paper-based hazard maps, Information and Communication Technology (ICT) enables more interactive hazard mapping, such as movable hazard maps that demonstrate scenario simulations for risk communication and real-time hazard mapping for effective disaster response and safe evacuation. This presentation first summarizes recent advances in flood hazard mapping, focusing on Japanese experiences and other examples from Asian countries. It then introduces a flood simulation tool suitable for hazard mapping at the river basin scale even in data-limited regions. In the past few years, the tool has been used by local officers responsible for disaster management in Asian countries. Through the training activities of hazard mapping and risk assessment, we conduct comparative analysis to identify the similarity and uniqueness of estimated economic damages depending on topographic and land-use conditions.
Assessing balance through the use of a low-cost head-mounted display in older adults: a pilot study
Saldana, Santiago J; Marsh, Anthony P; Rejeski, W Jack; Haberl, Jack K; Wu, Peggy; Rosenthal, Scott; Ip, Edward H
2017-01-01
Introduction As the population ages, the prevention of falls is an increasingly important public health problem. Balance assessment forms an important component of fall-prevention programs for older adults. The recent development of cost-effective and highly responsive virtual reality (VR) systems means new methods of balance assessment are feasible in a clinical setting. This proof-of-concept study made use of the submillimeter tracking built into modern VR head-mounted displays (VRHMDs) to assess balance through the use of visual–vestibular conflict. The objective of this study was to evaluate the validity, acceptability, and reliability of using a VRHMD to assess balance in older adults. Materials and methods Validity was assessed by comparing measurements from the VRHMD to measurements of postural sway from a force plate. Acceptability was assessed through the use of the Simulator Sickness Questionnaire pre- and postexposure to assess possible side effects of the visual–vestibular conflict. Reliability was assessed by measuring correlations between repeated measurements 1 week apart. Variables of possible importance that were found to be reliable (r≥0.9) between tests separated by a week were then tested for differences compared to a control group. Assessment was performed as a cross-sectional single-site community center-based study in 13 older adults (≥65 years old, 80.2±7.3 years old, 77% female, five at risk of falls, eight controls). The VR balance assessment consisted of four modules: a baseline module, a reaction module, a balance module, and a seated assessment. Results There was a significant difference in the rate at which participants with a risk of falls changed their tilt in the anteroposterior direction compared to the control group. Participants with a risk of falls changed their tilt in the anteroposterior direction at 0.7°/second vs 0.4°/second for those without a history of falls. No significant differences were found between pre/postassessment for oculomotor score or total Simulator Sickness Questionnaire score. Both the force plate and the head-mounted display balance-assessment system were able to detect differences between conditions meant to mask visual and proprioceptive information. Conclusion This VRHMD is both affordable and portable, causes minimal simulator sickness, and produces repeatable results that can be used to assess balance in older adults. PMID:28883717
Sustainable and Smart City Planning Using Spatial Data in Wallonia
NASA Astrophysics Data System (ADS)
Stephenne, N.; Beaumont, B.; Hallot, E.; Wolff, E.; Poelmans, L.; Baltus, C.
2016-09-01
Simulating population distribution and land use changes in space and time offers opportunities for smart city planning. It provides policy makers with a holistic and dynamic vision of a fast-changing urban environment. The impacts of policies, such as environmental and health risks or mobility issues, can be assessed and policies adapted accordingly. In this paper, we posit that "smart" city developments should be sustainable, dynamic and participative. This paper addresses these three smart objectives in the context of urban risk assessment in Wallonia, Belgium. The sustainable, dynamic and participative solution includes (i) land cover and land use mapping using remote sensing and GIS, (ii) population density mapping using dasymetric mapping, (iii) predictive modelling of land use changes and population dynamics and (iv) risk assessment. The comprehensive and long-term vision of the territory should help to draw up sustainable spatial planning policies, to adapt remote sensing acquisition, to update GIS data and to refine risk assessment from the regional to the city scale.
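A minimal sketch of the dasymetric mapping step mentioned in (ii), redistributing a zone's census population to raster cells according to assumed land-cover weights (the grid and weights are invented for the example):

```python
# Dasymetric population mapping sketch: a census zone's population is spread over
# its raster cells in proportion to land-cover weights (residential cells attract
# most of the population). All codes, weights and totals are illustrative.
import numpy as np

# Land-cover codes on a toy 4x4 grid belonging to one census zone
# 1 = dense residential, 2 = discontinuous residential, 3 = industry, 4 = forest
land_cover = np.array([[1, 1, 2, 4],
                       [1, 2, 2, 4],
                       [3, 3, 2, 4],
                       [3, 4, 4, 4]])
weights_by_class = {1: 1.0, 2: 0.4, 3: 0.05, 4: 0.0}   # relative population density

zone_population = 1200
w = np.vectorize(weights_by_class.get)(land_cover).astype(float)
population_grid = zone_population * w / w.sum()

print(population_grid.round(1))
print("check total:", population_grid.sum())
```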
Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez
2014-01-01
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
Development of Improved Caprock Integrity and Risk Assessment Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruno, Michael
GeoMechanics Technologies has completed a geomechanical caprock integrity analysis and risk assessment study funded through the US Department of Energy. The project included: a detailed review of historical caprock integrity problems experienced in the natural gas storage industry; a theoretical description and documentation of caprock integrity issues; advanced coupled transport flow modelling and geomechanical simulation of three large-scale potential geologic sequestration sites to estimate geomechanical effects from CO₂ injection; development of a quantitative risk and decision analysis tool to assess caprock integrity risks; and, ultimately, the development of recommendations and guidelines for caprock characterization and CO₂ injection operating practices. Historical data from gas storage operations and CO₂ sequestration projects suggest that leakage and containment incident risks are on the order of 10⁻¹ to 10⁻², which is higher than some previous studies have suggested for CO₂. Geomechanical analysis, as described herein, can be applied to quantify risks and to provide operating guidelines to reduce risks. The risk assessment tool developed for this project has been applied to five areas: the Wilmington Graben offshore Southern California, Kevin Dome in Montana, the Louden Field in Illinois, the Sleipner CO₂ sequestration operation in the North Sea, and the In Salah CO₂ sequestration operation in North Africa. Of these five, the Wilmington Graben area represents the highest relative risk while the Kevin Dome area represents the lowest relative risk.
Bertocci, Gina; Souza, Aaron L; Szobota, Stephanie
2003-01-01
Many wheelchair users must travel in motor vehicles while seated in their wheelchairs. The safety features of seat assemblies are key to motor vehicle occupant crash protection. Seating system properties such as strength, stiffness, and energy absorbance have been shown to have significant influence on risk of submarining. This study investigated the effects of wheelchair seat stiffness and energy absorption properties on occupant risk of submarining during a frontal motor vehicle 20 g/30 mph impact using a validated computer crash simulation model. The results indicate that wheelchair-seating stiffness and energy absorption characteristics influence occupant kinematics associated with the risk of submarining. Softer seat surfaces and relatively high energy absorption/permanent deformation were found to produce pelvis excursion trajectories associated with increased submarining risk. Findings also suggest that the current American National Standards Institute/Rehabilitation Engineering and Assistive Technology Society of North America (ANSI/RESNA) WC-19 seating integrity may not adequately assess submarining risk.
A 2D simulation model for urban flood management
NASA Astrophysics Data System (ADS)
Price, Roland; van der Wielen, Jonathan; Velickov, Slavco; Galvao, Diogo
2014-05-01
The European Floods Directive, which came into force on 26 November 2007, requires member states to assess all their water courses and coastlines for risk of flooding, to map flood extents and assets and humans at risk, and to take adequate and coordinated measures to reduce the flood risk in consultation with the public. Flood Risk Management Plans are to be in place by 2015. There are a number of reasons for the promotion of this Directive, not least because there has been much urban and other infrastructural development in flood plains, which puts many at risk of flooding along with vital societal assets. In addition there is growing awareness that the changing climate appears to be inducing more frequent extremes of rainfall, with a consequent increase in the frequency of flooding. Thirdly, the growing urban populations in Europe, and especially in the developing countries, mean that more people are being put at risk from a greater frequency of urban flooding in particular. There are urgent needs therefore to assess flood risk accurately and consistently, to reduce this risk where it is important to do so or where the benefit is greater than the damage cost, to improve flood forecasting and warning, to provide where necessary (and possible) flood insurance cover, and to involve all stakeholders in decision making affecting flood protection and flood risk management plans. Key data for assessing risk are water levels achieved or forecasted during a flood. Such levels should of course be monitored, but they also need to be predicted, whether for design or simulation. A 2D simulation model (PriceXD) solving the shallow water wave equations is presented specifically for determining flood risk, assessing flood defense schemes and generating flood forecasts and warnings.
The simulation model is required to have a number of important properties:
- Solve the full shallow water wave equations using a range of possible solutions;
- Automatically adjust the time step and keep it as large as possible while maintaining the stability of the flow calculations (see the sketch after this entry);
- Operate on a square grid at any resolution while retaining at least some details of the ground topography of the basic grid, the storage, and the form roughness and conveyance of the ground surface;
- Account for the overall average ground slope for particular coarse cells;
- Have the facility to refine the grid locally;
- Have the facility to treat ponds or lakes as single, irregular cells;
- Permit prescribed inflows and arbitrary outflows across the boundaries of the model domain or internally, and sources and sinks at any interior cell;
- Simulate runoff for spatial rainfall while permitting infiltration;
- Use ground surface cover and soil type indices to determine surface roughness, interception and infiltration parameters;
- Present results at the basic cell level;
- Have the facility to begin a model run with monitored soil moisture data;
- Have the facility to hot-start a simulation using dumped data from a previous simulation;
- Operate with graphics cards for parallel processing;
- Have the facility to link directly to urban drainage simulation software such as SWMM through an Open Modelling Interface;
- Be linked to the Netherlands national rainfall database for continuous simulation of rainfall-runoff for particular polders and urban areas;
- Make the engine available as Open Source together with benchmark datasets.
PriceXD forms a key modelling component of an integrated urban water management system consisting of an on-line database and a number of complementary modelling systems for urban hydrology, groundwater, potable water distribution, wastewater and stormwater drainage (separate and combined sewerage), wastewater treatment, and surface channel networks. This will be a 'plug and play' system. By linking the models together, confidence in the accuracy of the above-ground damage and construction costs is comparable to that of the below-ground costs. What is more, PriceXD can be used to examine additional physical phenomena such as the interaction between flood flows and flows to and from inlets distributed along the pipes of the underground network, and to optimize the removal of blockages and improve asset management. Finally, PriceXD is already an integral component of a number of operational projects and platforms, including the MyWater distributed platform and the HydroNET web portal, where it is applied to realistic case studies in the Netherlands (namely the Rijnland area), facilitating access to both model execution and results by abstracting most of the complexity out of the model setup and configuration.
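A minimal sketch of the adaptive time-step requirement flagged in the list above, assuming a standard CFL condition for an explicit 2D shallow-water scheme; this is not PriceXD's internal algorithm.

```python
# Adaptive time-step sketch: keep the explicit shallow-water update stable via the
# CFL condition on the gravity-wave plus advective signal speed. Values are illustrative.
import numpy as np

def cfl_time_step(h, u, v, dx, g=9.81, cfl=0.7, dt_max=60.0):
    """Largest stable time step (s) for an explicit 2D shallow-water scheme.
    h: water depth grid (m); u, v: velocity grids (m/s); dx: cell size (m)."""
    wave_speed = np.sqrt(g * np.maximum(h, 1e-6))
    signal = max(np.max(np.abs(u) + wave_speed), np.max(np.abs(v) + wave_speed))
    dt = cfl * dx / signal
    return min(dt, dt_max)

# Toy example: 10 m grid, 2 m depths, 1 m/s flow in x
h = np.full((100, 100), 2.0)
u = np.full_like(h, 1.0)
v = np.zeros_like(h)
print(f"dt = {cfl_time_step(h, u, v, dx=10.0):.2f} s")
```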
Risk assessment by dynamic representation of vulnerability, exploitation, and impact
NASA Astrophysics Data System (ADS)
Cam, Hasan
2015-05-01
Assessing and quantifying cyber risk accurately in real-time is essential to providing security and mission assurance in any system and network. This paper presents a modeling and dynamic analysis approach to assessing cyber risk of a network in real-time by representing dynamically its vulnerabilities, exploitations, and impact using integrated Bayesian network and Markov models. Given the set of vulnerabilities detected by a vulnerability scanner in a network, this paper addresses how its risk can be assessed by estimating in real-time the exploit likelihood and impact of vulnerability exploitation on the network, based on real-time observations and measurements over the network. The dynamic representation of the network in terms of its vulnerabilities, sensor measurements, and observations is constructed dynamically using the integrated Bayesian network and Markov models. The transition rates of outgoing and incoming links of states in hidden Markov models are used in determining exploit likelihood and impact of attacks, whereas emission rates help quantify the attack states of vulnerabilities. Simulation results show the quantification and evolving risk scores over time for individual and aggregated vulnerabilities of a network.
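As a rough illustration of the kind of computation such a model implies (not the authors' actual formulation; the states, transition rates and scoring rule below are invented for the example), an exploit-likelihood score can be tracked by propagating the state probabilities of a small Markov chain whose transition rates would come from observations:

    import numpy as np

    # Hypothetical attack states for one vulnerability:
    # 0 = dormant, 1 = probed, 2 = exploited (illustrative only).
    P = np.array([[0.90, 0.09, 0.01],     # transition probabilities per time step;
                  [0.05, 0.80, 0.15],     # in the paper these rates are derived from
                  [0.00, 0.10, 0.90]])    # sensor measurements and observations
    impact = np.array([0.0, 0.3, 1.0])    # impact weight of each state (invented)

    def risk_over_time(p0, steps):
        """Propagate the state distribution and report a risk score per step."""
        p = np.asarray(p0, dtype=float)
        scores = []
        for _ in range(steps):
            p = p @ P                          # one Markov transition
            scores.append(float(p @ impact))   # expected impact used as a risk score
        return scores

    print(risk_over_time([1.0, 0.0, 0.0], steps=5))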
Linking stressors and ecological responses
Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.
1999-01-01
To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.
Detection of Orbital Debris Collision Risks for the Automated Transfer Vehicle
NASA Technical Reports Server (NTRS)
Peret, L.; Legendre, P.; Delavault, S.; Martin, T.
2007-01-01
In this paper, we present a general collision risk assessment method, which has been applied through numerical simulations to the Automated Transfer Vehicle (ATV) case. During ATV ascent towards the International Space Station, close approaches between the ATV and objects of the USSTRATCOM catalog will be monitored through collision risk assessment. Usually, collision risk assessment relies on an exclusion volume or a probability threshold method. Probability methods are more effective than exclusion volumes but require accurate covariance data. In this work, we propose to use a criterion defined by an adaptive exclusion area. This criterion does not require any probability calculation but is more effective than exclusion volume methods, as demonstrated by our numerical experiments. The results of these studies, when confirmed and finalized, will be used for the ATV operations.
NASA Astrophysics Data System (ADS)
Zhang, N.; Huang, H.; Duarte, M.; Zhang, J.
2016-06-01
Social media has developed extremely fast in metropolises in recent years, resulting in more and more rumors disturbing our daily lives. Knowing the characteristics of rumor propagation in metropolises can help the government make efficient rumor refutation plans. In this paper, we established a dynamic spatio-temporal comprehensive risk assessment model for rumor propagation based on an improved 8-state ICSAR model (Ignorant, Information Carrier, Information Spreader, Advocate, Removal), large personal activity trajectory data, and governmental rumor refutation (anti-rumor) scenarios. Combining these relevant data with the 'big' traffic data on the use of subways, buses, and taxis, we simulated daily oral communications among inhabitants in Beijing. In order to analyze rumor and anti-rumor competition in the actual social network, personal resistance, personal preference, conformity, rumor intensity, government rumor refutation and other influencing factors were considered. Based on the developed risk assessment model, a long-term dynamic rumor propagation simulation for a seven-day period was conducted and a comprehensive rumor propagation risk distribution map was obtained. A set of sensitivity analyses was conducted for different social media and propagation routes. We assessed different anti-rumor coverage ratios and the rumor-spreading thresholds at which the government started to launch anti-rumor actions. The results we obtained provide worthwhile references for governmental decision making towards the control of socially disruptive rumors.
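The abstract does not give the model equations; purely as an illustration of how a compartmental rumor/anti-rumor simulation of this general kind can be stepped in discrete time, here is a minimal sketch (the compartments, rates and refutation term are invented placeholders, far simpler than the 8-state ICSAR model):

    # Minimal discrete-time rumor-spread sketch with a government refutation term.
    # I = ignorant, S = spreader, R = removed (stifler); all rates are illustrative.
    def simulate(days=7, N=20_000_000, beta=0.3, gamma=0.1, refute=0.05):
        I, S, R = N - 1000, 1000, 0
        history = []
        for day in range(days):
            new_spreaders = beta * S * I / N       # contacts that pass the rumor on
            new_removed = (gamma + refute) * S     # spreaders who stop (incl. refutation)
            I -= new_spreaders
            S += new_spreaders - new_removed
            R += new_removed
            history.append((day + 1, round(S)))
        return history

    for day, spreaders in simulate():
        print(f"day {day}: {spreaders} active spreaders")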
SIMULATION MODELING OF GASTROINTESTINAL ABSORPTION
Mathematical dosimetry models incorporate mechanistic determinants of chemical disposition in a living organism to describe relationships between exposure concentration and the internal dose needed for PBPK models and human health risk assessment. Because they rely on determini...
Fachehoun, Richard Coovi; Lévesque, Benoit; Dumas, Pierre; St-Louis, Antoine; Dubé, Marjolaine; Ayotte, Pierre
2015-01-01
Game meat from animals killed by lead ammunition may expose consumers to lead. We assessed the risk related to lead intake from meat consumption of white-tailed deer and moose killed by lead ammunition and documented the perception of hunters and butchers regarding this potential contamination. Information on cervid meat consumption and risk perception was collected using a mailed self-administered questionnaire addressed to a random sample of Quebec hunters. In parallel, 72 samples of white-tailed deer (n = 35) and moose (n = 37) meat were collected from voluntary hunters and analysed for lead content using inductively coupled plasma-mass spectrometry. A risk assessment for people consuming lead-shot game meat was performed using Monte Carlo simulations. Mean lead levels in white-tailed deer and moose killed by lead ammunition were 0.28 and 0.17 mg/kg, respectively. Risk assessment based on declared cervid meat consumption revealed that 1.7% of the surveyed hunters would exceed the dose associated with a 1 mmHg increase in systolic blood pressure (SBP). For consumers of moose meat once, twice or three times a week, simulations predicted that 0.5%, 0.9% and 1.5% of adults would be exposed to a dose associated with a 1 mmHg increase in SBP, whereas 0.9%, 1.9% and 3.3% of children would be exposed to a dose associated with a 1-point intelligence quotient (IQ) decrease, respectively. For consumers of deer meat once, twice or three times a week, the proportions were 1.6%, 2.9% and 4% for adults and 2.9%, 5.8% and 7.7% for children, respectively. The consumption of meat from cervids killed with lead ammunition may increase lead exposure and its associated health risks. It would be important to inform the population, particularly hunters, about this potential risk and to promote the use of lead-free ammunition.
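As a rough illustration of the Monte Carlo intake calculation underlying such estimates (the distributions, portion size and threshold dose below are placeholders, not the values fitted in the study):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Illustrative input distributions (NOT the study's fitted distributions).
    lead_mg_per_kg = rng.lognormal(mean=np.log(0.17), sigma=0.8, size=n)  # meat lead level
    meals_per_week = rng.integers(1, 4, size=n)                           # 1-3 meals/week
    meal_size_kg = rng.normal(0.15, 0.03, size=n).clip(0.05)              # portion size
    body_weight_kg = rng.normal(70, 12, size=n).clip(40)

    # Average daily dose (mg lead per kg body weight per day).
    dose = lead_mg_per_kg * meal_size_kg * meals_per_week / 7.0 / body_weight_kg

    threshold = 1.3e-3  # placeholder threshold dose, mg/kg-bw/day
    print(f"fraction exceeding threshold: {np.mean(dose > threshold):.2%}")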
Douglas J. Shinneman; Brian J. Palik; Meredith W. Cornett
2012-01-01
Management strategies to restore forest landscapes are often designed to concurrently reduce fire risk. However, the compatibility of these two objectives is not always clear, and uncoordinated management among landowners may have unintended consequences. We used a forest landscape simulation model to compare the effects of contemporary management and hypothetical...
Hawken, Steven; Kwong, Jeffrey C.; Deeks, Shelley L.; Crowcroft, Natasha S.; McGeer, Allison J.; Ducharme, Robin; Campitelli, Michael A.; Coyle, Doug
2015-01-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, excess GBS risk for influenza vaccination versus no vaccination was −0.36/1 million vaccinations (95% credible interval −1.22 to 0.28) and −0.42/1 million vaccinations (95% credible interval −3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g. influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients.
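To make the risk-difference logic concrete, here is a toy calculation of excess GBS risk per vaccinated person (all rates are invented round numbers, not the study's age- and sex-specific estimates):

    # Excess GBS risk = risk if vaccinated - risk if unvaccinated (per person).
    # All inputs are illustrative placeholders.
    background_gbs = 10e-6    # baseline annual GBS risk
    vaccine_gbs = 1e-6        # additional GBS risk attributable to the vaccine
    flu_attack_rate = 0.05    # seasonal influenza incidence
    gbs_after_flu = 40e-6     # GBS risk following an influenza infection
    vaccine_efficacy = 0.60

    risk_vaccinated = background_gbs + vaccine_gbs + flu_attack_rate * (1 - vaccine_efficacy) * gbs_after_flu
    risk_unvaccinated = background_gbs + flu_attack_rate * gbs_after_flu

    excess_per_million = (risk_vaccinated - risk_unvaccinated) * 1e6
    print(f"excess GBS risk: {excess_per_million:+.2f} per million vaccinations")

With these placeholder numbers the excess risk comes out slightly negative, mirroring the qualitative finding that averted influenza-associated GBS can outweigh vaccine-associated GBS.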
Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.
Xin, Cao; Chongshi, Gu
2016-01-01
Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. The credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, the corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for the analysis of stability failure risk of gravity dams. The risk assessment obtained can reflect the influence of both sorts of uncertainty and is suitable as an index value.
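Purely as a sketch of how a fuzzy (credibility-style) failure criterion can be combined with Monte Carlo sampling, here is a minimal example; the limit-state function, input distributions and membership function are invented for illustration and are not the article's formulation:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    # Random inputs (illustrative): friction coefficient, cohesion, normalized loads.
    friction = rng.normal(0.65, 0.08, n)
    cohesion = rng.normal(0.45, 0.10, n)
    sliding = rng.normal(1.00, 0.10, n)            # normalized driving force
    resist = friction * 1.2 + cohesion * 0.8       # normalized resisting force

    safety_factor = resist / sliding

    def failure_membership(sf):
        """Fuzzy failure criterion: fully failed below 0.95, fully safe above 1.05."""
        return np.clip((1.05 - sf) / 0.10, 0.0, 1.0)

    crisp_risk = np.mean(safety_factor < 1.0)                  # purely probabilistic estimate
    fuzzy_risk = np.mean(failure_membership(safety_factor))    # fuzziness-weighted estimate
    print(f"crisp failure risk: {crisp_risk:.4f}, fuzzy-weighted risk: {fuzzy_risk:.4f}")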
Okunribido, Olanrewaju O
2013-01-01
This article is a report of a study of the effect of the seat cushion on risk of falling from a wheelchair. Two laboratory studies and simulated assistant propelled wheelchair transfers were conducted with four healthy female participants. For the laboratory studies there were three independent variables: trunk posture (upright/flexed forward), seat cushion (flat polyurethane/propad low profile), and feet condition (dangling/supported), and two dependent variables: occupied wheelchair (wheelchair) center of gravity (CG), and stability. For the simulated transfers there was one independent variable: seat cushion (flat polyurethane/propad low profile), and one dependent variable: perception of safety (risk of falling). Results showed that the wheelchair CG was closer to the front wheels, and stability lower for the propad low profile cushion compared to the polyurethane cushion, when the participants sat with their feet dangling. During the simulated transfers, sitting on the propad low profile cushion caused participants to feel more apprehensive (anxious or uneasy) compared to sitting on the polyurethane cushion. The findings can contribute to the assessment of risk and care planning of non-ambulatory wheelchair users.
[Using sequential indicator simulation method to define risk areas of soil heavy metals in farmland].
Yang, Hao; Song, Ying Qiang; Hu, Yue Ming; Chen, Fei Xiang; Zhang, Rui
2018-05-01
The heavy metals in soil have serious impacts on safety, the ecological environment and human health due to their toxicity and accumulation. It is necessary to efficiently identify the risk areas of heavy metals in farmland soil, which is of great significance for environmental protection, pollution warning and farmland risk control. We collected 204 samples and analyzed the contents of seven heavy metals (Cu, Zn, Pb, Cd, Cr, As, Hg) in Zengcheng District of Guangzhou, China. In order to overcome the problems of the data, including abnormal values and skewed distributions, and the smoothing effect of traditional kriging methods, we used the sequential indicator simulation method (SISIM) to define the spatial distribution of heavy metals, and combined it with the Hakanson index method to identify potential ecological risk areas of heavy metals in farmland. The results showed that: (1) With similar accuracy of spatial prediction of soil heavy metals, the SISIM reproduced local detail better than ordinary kriging in a small-scale area. Compared to indicator kriging, the SISIM had a lower error rate (4.9%-17.1%) in the uncertainty evaluation of heavy-metal risk identification. The SISIM showed less smoothing effect and was more applicable to simulating the spatial uncertainty of soil heavy metals and to risk identification. (2) There was no pollution in Zengcheng's farmland. Moderate potential ecological risk was found in the southern part of the study area due to enterprise production, human activities, and river sediments. This study combined sequential indicator simulation with the Hakanson risk index method, and effectively overcame the outlier information loss and smoothing effect of the traditional kriging method. It provides a new way to identify soil heavy metal risk areas of farmland under uneven sampling.
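For orientation, the Hakanson potential ecological risk index combines, for each metal, a contamination factor (measured over background concentration) with a toxic-response factor and sums the results. A minimal sketch follows; the toxic-response factors are the commonly cited Hakanson coefficients, while the background and measured concentrations are placeholders rather than values from the paper:

    # Hakanson potential ecological risk index (PERI) sketch.
    # toxic_response follows commonly cited Hakanson coefficients;
    # background and measured concentrations (mg/kg) are placeholders.
    toxic_response = {"Cu": 5, "Zn": 1, "Pb": 5, "Cd": 30, "Cr": 2, "As": 10, "Hg": 40}
    background = {"Cu": 17, "Zn": 47, "Pb": 36, "Cd": 0.06, "Cr": 51, "As": 9, "Hg": 0.08}
    measured = {"Cu": 25, "Zn": 80, "Pb": 40, "Cd": 0.15, "Cr": 60, "As": 11, "Hg": 0.12}

    def peri(sample, bg, tr):
        """Return per-metal risk factors Er and the overall risk index RI."""
        er = {m: tr[m] * sample[m] / bg[m] for m in sample}
        return er, sum(er.values())

    er, ri = peri(measured, background, toxic_response)
    print({m: round(v, 1) for m, v in er.items()}, "RI =", round(ri, 1))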
Franz, E; Tromp, S O; Rijgersberg, H; van der Fels-Klerx, H J
2010-02-01
Fresh vegetables are increasingly recognized as a source of foodborne outbreaks in many parts of the world. The purpose of this study was to conduct a quantitative microbial risk assessment for Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes infection from consumption of leafy green vegetables in salad from salad bars in The Netherlands. Pathogen growth was modeled in Aladin (Agro Logistics Analysis and Design Instrument) using time-temperature profiles in the chilled supply chain and one particular restaurant with a salad bar. A second-order Monte Carlo risk assessment model was constructed (using @Risk) to estimate the public health effects. The temperature in the studied cold chain was well controlled below 5 degrees C. Growth of E. coli O157:H7 and Salmonella was minimal (17 and 15%, respectively). Growth of L. monocytogenes was considerably greater (194%). Based on first-order Monte Carlo simulations, the average number of cases per year in The Netherlands associated with the consumption of leafy greens in salads from salad bars was 166, 187, and 0.3 for E. coli O157:H7, Salmonella, and L. monocytogenes, respectively. The ranges of the average number of annual cases as estimated by second-order Monte Carlo simulation (with prevalence and number of visitors as uncertain variables) were 42 to 551 for E. coli O157:H7, 81 to 281 for Salmonella, and 0.1 to 0.9 for L. monocytogenes. This study integrated modeling of pathogen growth in the supply chain of fresh leafy vegetables destined for restaurant salad bars, using software designed to model and design logistics, with modeling of the public health effects using probabilistic risk assessment software.
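A compact sketch of the two-dimensional (second-order) Monte Carlo structure described above, with an outer loop over uncertainty (prevalence, visitor numbers) and an inner loop over variability (serving-level dose); all distributions and the dose-response form are placeholders, and the actual model was built in @Risk with growth sub-models not reproduced here:

    import numpy as np

    rng = np.random.default_rng(42)

    def annual_cases(prevalence, visitors, inner=10_000):
        """Inner loop: variability in contamination, concentration and dose-response."""
        contaminated = rng.random(inner) < prevalence
        log_conc = rng.normal(0.5, 1.0, inner)                 # log10 CFU/g on contaminated servings
        dose = np.where(contaminated, 10 ** log_conc * rng.uniform(20, 80, inner), 0.0)
        p_ill = 1 - np.exp(-dose / 1e4)                        # toy exponential dose-response
        return visitors * p_ill.mean()

    # Outer loop: uncertainty about prevalence and the number of salad-bar visitors.
    estimates = [annual_cases(rng.beta(2, 200), rng.normal(2e6, 3e5)) for _ in range(200)]
    print(f"annual cases, uncertainty range: {np.percentile(estimates, 2.5):.0f} "
          f"to {np.percentile(estimates, 97.5):.0f}")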
UAV Swarm Operational Risk Assessment System
2015-09-01
a SIPRNET connection. For practicality in development of this prototype, the interface was created using the MATLAB GUI language. ... discrete-event simulation of UAV swarm attacks using ExtendSim, statistical analysis of the simulation data using Minitab, and a graphical user interface
Spatial and Temporal Flood Risk Assessment for Decision Making Approach
NASA Astrophysics Data System (ADS)
Azizat, Nazirah; Omar, Wan-Mohd-Sabki Wan
2018-03-01
Heavy rainfall adversely impacts inundation areas, depending on the magnitude of the flood. Significantly, the location of settlements, infrastructure and facilities in floodplains results in many regions facing flooding risks. A problem faced by the decision maker in the assessment of flood vulnerability and the evaluation of adaptation measures is recurrent flooding in the same areas. Identification of recurrent flooding areas and of the frequency of floods should be priorities for flood risk management. However, spatial and temporal variability become major factors of uncertainty in flood risk management. Therefore, the dynamic and spatial characteristics of these changes in flood impact assessment are important in making decisions about the future of infrastructure development and community life. System dynamics (SD) simulation and hydrodynamic modelling are presented as tools for modelling the dynamic characteristics of flood risk and spatial variability. This paper discusses the integration of spatial and temporal information that is required by the decision maker for the identification of multi-criteria decision problems involving multiple stakeholders.
[A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].
Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang
2015-05-01
To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to find effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the data on process parameters in poultry and the Salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with the @Risk software. The concentration of Salmonella on carcasses after chilling, as calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients for the Salmonella concentration after defeathering and in the chilling pool were 0.84 and 0.34, respectively; these were the primary factors affecting the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. The risk manager could control the contamination of Salmonella on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
Pérez-Maldonado, Iván N; Ochoa Martínez, Ángeles C; Ruíz-Vera, Tania; Orta-García, Sandra T; Varela-Silva, José A
2017-09-01
Recent studies have documented environmental contamination by PCBs in soil from different areas in Mexico (industrial, mining, and urban sites). However, the real significance of that soil contamination has not been established. Therefore, the aim of this study was to perform a human health risk assessment (Monte Carlo simulation) to evaluate the probable toxic effects of soils contaminated with PCBs on children in four sites in Mexico. A high non-carcinogenic risk (total nHQ = 1.1E+01; if nHQ ≥1, hazardous health effects cannot be ruled out) was found in Alpuyeca, Morelos, Mexico. Moreover, the total CR (cancer risk) found in Alpuyeca, Morelos is of concern (total CR = 5.1E-03), given that a cut-point of 1.0E-06 has been suggested as a safe level for cancer risk. Taking into consideration the data shown in this research, we conclude that a strategy to protect human health is necessary for the assessed sites.
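The hazard quotient and cancer risk quantities referred to above are, in the standard screening formulation, a ratio of estimated dose to a reference dose and a product of dose and a cancer slope factor. A minimal deterministic sketch for a soil-ingestion pathway follows (exposure factors and toxicity values are placeholders, and the study itself propagated such inputs through Monte Carlo simulation):

    # Screening-level soil ingestion exposure for a child (placeholder values only).
    conc_mg_per_kg = 5.0        # PCB concentration in soil
    ingestion_kg_day = 100e-6   # 100 mg soil ingested per day
    body_weight_kg = 15.0
    exposure_factor = 350 / 365  # fraction of days exposed per year

    # Average daily dose, mg per kg body weight per day.
    add = conc_mg_per_kg * ingestion_kg_day * exposure_factor / body_weight_kg

    reference_dose = 2e-5       # mg/kg-day (placeholder RfD)
    cancer_slope = 2.0          # (mg/kg-day)^-1 (placeholder slope factor)

    hazard_quotient = add / reference_dose
    cancer_risk = add * cancer_slope
    print(f"HQ = {hazard_quotient:.2e}, cancer risk = {cancer_risk:.2e}")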
Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan
2017-09-15
Mixtures of hydrocarbons and carbon dioxide show excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical application is a safety threat due to their flammability. In this work, a quantitative risk assessment system (QR-AS) is established, aimed at providing a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. According to the assessment results, a proper ventilation speed, safe mixture ratio and location of gas-detecting devices have been proposed to guarantee safety in case of leakage. The results revealed that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
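The TNT equivalent method mentioned above converts the mass of flammable gas in the cloud into an equivalent mass of TNT via the ratio of heats of combustion and an empirical yield factor, from which blast effects can be estimated. A minimal sketch follows, using typical literature values rather than those of the paper:

    # TNT-equivalent mass of a propane release (illustrative values only).
    released_kg = 2.0         # mass of flammable gas in the cloud
    heat_fuel_mj_kg = 46.3    # heat of combustion of propane, MJ/kg
    heat_tnt_mj_kg = 4.68     # detonation energy of TNT, MJ/kg
    yield_factor = 0.05       # empirical explosion yield (typically a few percent)

    w_tnt = yield_factor * released_kg * heat_fuel_mj_kg / heat_tnt_mj_kg
    scaled_distance = 10.0 / w_tnt ** (1 / 3)  # Hopkinson-scaled distance at 10 m, m/kg^(1/3)
    print(f"TNT equivalent: {w_tnt:.2f} kg, scaled distance at 10 m: {scaled_distance:.1f} m/kg^(1/3)")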
CFD-based Thrombotic Risk Assessment in Kawasaki Disease Patients with Coronary Artery Aneurysms
NASA Astrophysics Data System (ADS)
Sengupta, Dibyendu; Kung, Ethan; Kahn, Andrew; Burns, Jane; Marsden, Alison
2012-11-01
Coronary aneurysms occur in 25% of untreated Kawasaki Disease (KD) patients and put patients at increased risk for myocardial infarction and sudden death. Clinical guidelines recommend using aneurysm diameter >8 mm as the arbitrary criterion for treating with anti-coagulation therapy. This study uses patient-specific modeling to non-invasively determine hemodynamic parameters and quantify thrombotic risk. Anatomic models were constructed from CT angiographic image data from 5 KD aneurysm patients and one normal control. CFD simulations were performed to obtain hemodynamic data including WSS and particle residence times (PRT). Thrombosis was clinically observed in 4/9 aneurysmal coronaries. Thrombosed vessels required twice as many cardiac cycles (mean 8.2 vs. 4.2) for particles to exit, and had lower mean WSS (1.3 compared to 2.8 dynes/cm2) compared to vessels with non-thrombosed aneurysms of similar max diameter. 1 KD patient in the cohort with acute thrombosis had diameter < 8 mm. Regions of low WSS and high PRT predicted by simulations correlated with regions of subsequent thrombus formation. Thrombotic risk stratification for KD aneurysms may be improved by incorporating both hemodynamic and geometric quantities. Current clinical guidelines to assess patient risk based only on aneurysm diameter may be misleading. Further prospective study is warranted to evaluate the utility of patient-specific modeling in risk stratifying KD patients with coronary aneurysms. NIH R21.
Environmental Risk Assessment of Nanomaterials
NASA Astrophysics Data System (ADS)
Bayramov, A. A.
In this paper, various aspects of modern nanotechnologies and, as a result, the risks of nanomaterial impacts on the environment are considered. A very brief review of the First International Conference on Material and Information Sciences in High Technologies (2007, Baku, Azerbaijan) is given. The conference presented many reports that were devoted to nanotechnology in biology and business for the developing world, formation of charged nanoparticles for creation of functional nanostructures, nanoprocessing of carbon nanotubes, magnetic and optical properties of manganese-phosphorus nanowires, ultra-nanocrystalline diamond films, and nanophotonics communications in Azerbaijan. Mathematical methods for simulating group, individual and social risks are considered for the purpose of nanomaterials risk reduction and remediation. Lastly, we have conducted studies at a plant producing polymeric materials (and nanomaterials) located near Baku. The individual risk to persons was assessed, and a map of equal-risk isolines and zones of individual risk was constructed for the plant.
Nakayama, Yumiko; Kishida, Fumio; Nakatsuka, Iwao; Matsuo, Masatoshi
2005-01-01
The toxicokinetics/toxicodynamics (TKTD) model simulates the toxicokinetics of a chemical based on physiological data such as blood flow, tissue partition coefficients and metabolism. In this study, Andersen and Clewell's TKTD model was used, with seven compartments and ten differential equations for calculating chemical balances in the compartments (Andersen and Clewell 1996, Workshop on physiologically-based pharmacokinetic/pharmacodynamic modeling and risk assessment, Aug. 5-16 at Colorado State University, U.S.A.). Using this model, the authors attempted to simulate the behavior of four chemicals: trichloroethylene, methylene chloride, styrene and n-hexane, and the results were evaluated. Simulations of the behavior of trichloroethylene taken in via inhalation and oral exposure routes were also performed. The differences between simulations and measurements are due to the differences between the absorption rates of the exposure routes. By changing the absorption rates, the simulation showed agreement with the measured values. The simulations of the other three chemicals showed good results. Thus, this model is useful for simulating the behavior of chemicals for preliminary toxicity assessment.
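The full TKTD/PBPK model (seven compartments, ten differential equations) is too large to reproduce here, but each flow-limited compartment balance has the same basic form; a one-compartment sketch with a simple Euler integration step follows (the parameters are arbitrary illustrations, not values from the model):

    # Flow-limited tissue compartment: V * dC/dt = Q * (C_arterial - C / P),
    # where Q is blood flow, V tissue volume, P the tissue:blood partition coefficient.
    def simulate(hours=8.0, dt=0.01, Q=5.0, V=10.0, P=3.0, c_art=1.0):
        c, t, series = 0.0, 0.0, []
        while t < hours:
            dcdt = Q * (c_art - c / P) / V
            c += dcdt * dt
            t += dt
            series.append((round(t, 2), c))
        return series

    final_t, final_c = simulate()[-1]
    print(f"tissue concentration after {final_t} h: {final_c:.3f} (approaches P * C_art = 3.0)")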
NASA Astrophysics Data System (ADS)
Feng, Yongliang; Chen, Yanzhen; Wang, Jing; Gong, Yufeng; Liu, Xigang; Mu, Gang; Tian, Hua
2016-11-01
At present, the methods widely applied to assess the ecological risk of heavy metals are essentially single-point estimates, in which exposure and toxicity data cannot be fully used and probabilities of adverse biological effects cannot be obtained. In this study, based on an investigation of the concentrations of six heavy metals (As, Hg, Pb, Cd, Cu, and Zn) in the surface seawater and sediment near the outlet of a zinc factory located in Huludao City, Liaoning Province, China, a tiered approach consisting of several probabilistic options was used to refine the ecological risk assessment for the individuals. A mixture of various heavy metals was detected in the surface seawater, and the potential ecological risk index (PERI) was adopted to assess the potential ecological risk of heavy metals in the surface sediment. The results from all levels of aquatic ecological risk assessment in the tiered framework, ranging from comparison of single effect and exposure values to the use of a distribution-based Hazard Quotient obtained through Monte Carlo simulation, are consistent with each other. Briefly, aquatic Zn and Cu posed a clear ecological risk, while Cd, Pb, Hg, and As in the water column posed a potential risk. As expected, the combined ecological risk of the heavy metal mixture in the surface seawater proved to be significantly higher than the risk caused by any individual heavy metal, calculated using the concept of total equivalent concentration. According to PERI, the severity of pollution by the six heavy metals in the surface sediment decreased in the following sequence: Cd>Hg>As>Pb>Cu>Zn, and the total heavy metals in the sediment posed a very high risk to the marine environment. This study provides a useful mathematical framework for the ecological risk assessment of heavy metals.
NASA Astrophysics Data System (ADS)
Chivukula, V. Keshav; McGah, Patrick; Prisco, Anthony; Beckman, Jennifer; Mokadam, Nanush; Mahr, Claudius; Aliseda, Alberto
2016-11-01
Flow in the aortic vasculature may impact stroke risk in patients with left ventricular assist devices (LVAD) due to severely altered hemodynamics. Patient-specific 3D models of the aortic arch and great vessels were created with an LVAD outflow graft at 45, 60 and 90° from the centerline of the ascending aorta, in order to understand the effect of surgical placement on hemodynamics and thrombotic risk. Intermittent aortic valve opening (once every five cardiac cycles) was simulated and the impact of this residual native output investigated for its potential to wash out stagnant flow in the aortic root region. Unsteady CFD simulations with patient-specific boundary conditions were performed. Particle tracking for 10 cardiac cycles was used to determine platelet residence times and shear stress histories. Thrombosis risk was assessed by a combination of Eulerian and Lagrangian metrics and a newly developed thrombogenic potential metric. Results show a strong influence of LVAD outflow graft angle on hemodynamics in the ascending aorta and consequently on stroke risk, with a highly positive impact of aortic valve opening, even at low frequencies. Optimization of LVAD implantation and management strategies based on patient-specific simulations to minimize stroke risk will be presented.
Hydraulic risk assessment of bridges using UAV photogrammetry
NASA Astrophysics Data System (ADS)
Hackl, Jürgen; Adey, Bryan T.; Woźniak, Michał; Schümperlin, Oliver
2017-04-01
Road networks are essential for economic growth and development. Of the objects within a road network, bridges are of special interest, because their failure often results in relatively large interruptions to how the network is used, their replacement costs are generally large, and it usually takes a considerable amount of time to restore them once they have failed. Of the different types of bridges, bridges in mountainous regions are of special interest because their failure could cause severe societal consequences, for example, if it renders an area inaccessible. One of the main causes of the failure of bridges in mountainous regions is the occurrence of a hydraulic event, for example, flood waters above a certain level, scour below a certain depth or debris build-up beyond a certain level. An assessment of risk related to a bridge in a mountainous region is challenging. The probability of occurrence of these events, and the resulting consequences, depend greatly on the characteristics (e.g. slope, soil, vegetation, precipitation, …) of the specific regions where the bridges are located. An indication of the effect of these characteristics can be seen in the sediment deposition during floods in mountain catchments. Additionally, there is often no, or no recent, topological information that can be used to develop terrain models to be used for realistic water flow simulations in mountain regions, and most hydrology and hydraulic models have been developed for lower gradient rivers and can often not be directly used to model water flow in mountain rivers. In an effort to improve the assessment of risk related to bridges in mountainous regions, using the setting for risk assessments established by Hackl et al. (2015) and Adey et al. (2016), an investigation was undertaken to determine whether unmanned aerial vehicles (UAVs) and photogrammetry could be used to generate the topological information required to run realistic water flow simulations. The process investigated includes: the use of geo-referenced images taken by a UAV, the exportation of these images into photogrammetric software, the creation of a 3D mesh of the terrain from these images, the conversion of the 3D mesh to a computational mesh, the use of the computational mesh to build a hydrodynamic model, and the use of the hydrodynamic model to run flow simulations. The process was used to estimate the complex water flow near a single-span concrete bridge in the Canton of Grisons, Switzerland. The hydraulic events (abutment scour and overflow) predicted by the developed model were compared with historical observations from a recent flood event in the region. The predicted hydraulic events correspond with the historical observations, indicating that the topological information collected in this way is sufficiently accurate to be used to simulate complex flow situations, which can be used in bridge risk assessments. Hackl, J., Adey, B.T., Heitzler, M., and Iosifescu Enescu, I. (2015). "An Overarching Risk Assessment Process to Evaluate the Risks Associated with Infrastructure Networks due to Natural Hazards." International Journal of Performability Engineering, 11(2), 153-168. Adey, B.T., Hackl, J., Lam, J.C., van Gelder, P., Prak, P., van Erp, N., Heitzler, M., Iosifescu Enescu, I., and Hurni, L. (2016). "Ensuring acceptable levels of infrastructure related risks due to natural hazards with emphasis on conducting stress tests." 1st International Symposium on Infrastructure Asset Management (SIAM2016), K. 
Kobayashi, ed., Kyoto, Japan, 19-29 (Jan).
A polygon-based modeling approach to assess exposure of resources and assets to wildfire
Matthew P. Thompson; Joe Scott; Jeffrey D. Kaiden; Julie W. Gilbertson-Day
2013-01-01
Spatially explicit burn probability modeling is increasingly applied to assess wildfire risk and inform mitigation strategy development. Burn probabilities are typically expressed on a per-pixel basis, calculated as the number of times a pixel burns divided by the number of simulation iterations. Spatial intersection of highly valued resources and assets (HVRAs) with...
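The per-pixel burn probability described above is simply the number of iterations in which a pixel burns divided by the total number of simulation iterations; a tiny sketch over a synthetic stack of simulated fire perimeters:

    import numpy as np

    rng = np.random.default_rng(7)
    iterations, rows, cols = 1_000, 50, 50

    # Synthetic stack of burned/unburned rasters, one per simulated fire season.
    burned = rng.random((iterations, rows, cols)) < 0.02

    burn_probability = burned.sum(axis=0) / iterations   # per-pixel burn probability
    print("max per-pixel burn probability:", burn_probability.max())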
The sediment-contaminant transport model SERATRA was used as an integral part of the Chemical Migration and Risk Assessment (CMRA) Methodology, which simulates migration and fate of a contaminant over the land surface and in receiving streams, to assess potential short- and long-...
J. H. Scott; D. J. Helmbrecht; M. P. Thompson
2014-01-01
Characterizing wildfire risk to a fire-adapted ecosystem presents particular challenges due to its broad spatial extent, inherent complexity, and the difficulty in defining wildfire-induced losses and benefits. Our approach couples stochastic wildfire simulation with a vegetation condition assessment framework to estimate the conditional and expected response of...
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that a better performance was obtained with the STSIS method.
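The indicator formalism underlying both SIS and STSIS codes each observation as a 0/1 exceedance of one or more thresholds, and the mean of the simulated indicators at a location estimates the probability of exceeding each threshold there. A minimal sketch of the transform (the thresholds, data and stand-in realizations are made up for illustration and are not the study's simulation output):

    import numpy as np

    pm25 = np.array([18.0, 35.0, 62.0, 80.0, 41.0, 29.0])   # synthetic daily PM2.5, ug/m3
    thresholds = [35.0, 75.0]                                 # illustrative cut-offs

    # Indicator transform: 1 if the value exceeds the threshold, else 0.
    indicators = {z: (pm25 > z).astype(int) for z in thresholds}
    print("indicator coding at threshold 35:", indicators[35.0])

    # In SIS/STSIS, many conditional realizations are drawn; the mean indicator
    # across realizations at each location estimates P(value > threshold) there.
    realizations = np.random.default_rng(3).random((500, 6)) < 0.3   # stand-in realizations
    print("exceedance probability per location:", realizations.mean(axis=0).round(2))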
Milhorn, Denise; Korpi-Steiner, Nichole
2015-02-01
It is unclear if the point-of-care (POC) Clinitest hCG device is subject to high-dose hook interference from physiological concentrations of intact human chorionic gonadotropin (hCG), β-core fragment of hCG (hCGβcf), and hCG free β-subunit (hCGβ) found in urine during pregnancy. We used a simulation model to address this question and related our findings to our institution's pregnant population in order to assess risk for potential false-negative hCG results. The expected distribution of days relative to ovulation during routine POC hCG testing was estimated from 182 patients. Clinitest-Clinitek Status hCG device susceptibility to high-dose hook interference from hCG variants and potential risk of false-negative results as it relates to this population were evaluated by testing increasing concentrations of hCG, hCGβcf, hCGβ as well as urine simulating physiological hCG, hCGβcf and hCGβ concentrations expected during early pregnancy (≤44 days post-ovulation). The Clinitest-Clinitek Status hCG device exhibited high-dose hook interference from hCGβcf alone, but not from hCG, hCGβ, or simulated physiological urinary concentrations of combined hCG, hCGβcf and hCGβ expected during early pregnancy. The majority of our patient population had urinary hCG testing conducted during early pregnancy. The Clinitest-Clinitek Status hCG device is unlikely to exhibit false-negative urinary hCG results due to high-dose hook interference for women in early healthy pregnancy, although additional studies are necessary to determine potential risk in other patient populations. Visual interpretation of POC urinary hCG device results is an important failure mode to consider in risk analyses for erroneous urinary hCG device results. Published by Elsevier Inc.
Simulation of floods caused by overloaded sewer systems: extensions of shallow-water equations
NASA Astrophysics Data System (ADS)
Hilden, Michael
2005-03-01
The outflow of water from a manhole onto a street is a typical flow problem within the simulation of floods in urban areas that are caused by overloaded sewer systems in the event of heavy rains. The reliable assessment of the flood risk for the connected houses requires accurate simulations of the water flow processes in the sewer system and in the street. The Navier-Stokes equations (NSEs) describe the free surface flow of the fluid water accurately, but since their numerical solution requires high CPU times and much memory, their application is not practical. However, their solutions for selected flow problems are applied as reference states to assess the results of other model approaches. The classical shallow-water equations (SWEs) require only fractions (factor 1/100) of the NSEs' computational effort. They assume hydrostatic pressure distribution, depth-averaged horizontal velocities and neglect vertical velocities. These shallow-water assumptions are not fulfilled for the outflow of water from a manhole onto the street. Accordingly, calculations show differences between NSEs and SWEs solutions. The SWEs are extended in order to assess the flood risks in urban areas reliably within applicable computational efforts. Separating vortex regions from the main flow and approximating vertical velocities to involve their contributions into a pressure correction yield suitable results.
Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security
NASA Astrophysics Data System (ADS)
Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver
This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.
SIMULATING ATMOSPHERIC EXPOSURE USING AN INNOVATIVE METEOROLOGICAL SAMPLING SCHEME
Multimedia Risk assessments require the temporal integration of atmospheric concentration and deposition estimates with other media modules. However, providing an extended time series of estimates is computationally expensive. An alternative approach is to substitute long-ter...
DEMOGRAPHIC UNCERTAINTY IN ECOLOGICAL RISK ASSESSMENTS. (R825347)
We built a Ricker's model incorporating demographic stochasticity to simulate the effects of demographic uncertainty on responses of gray-tailed vole (Microtus canicaudus) populations to pesticide applications. We constructed models with mark-recapture data collected from populat...
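A minimal sketch of a Ricker update with demographic stochasticity (Poisson-distributed recruitment) and a pesticide-induced survival reduction; the parameters below are invented and are not those estimated from the mark-recapture data:

    import numpy as np

    rng = np.random.default_rng(11)

    def ricker_step(n, r=1.2, k=200, survival=1.0):
        """Expected Ricker recruitment, realized with Poisson demographic noise."""
        expected = n * np.exp(r * (1 - n / k)) * survival
        return rng.poisson(expected)

    def trajectory(n0=50, weeks=30, spray_weeks=(10, 20), spray_survival=0.6):
        n, out = n0, []
        for week in range(weeks):
            s = spray_survival if week in spray_weeks else 1.0
            n = ricker_step(n, survival=s)
            out.append(n)
        return out

    print(trajectory())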
TERRESTRIAL ECOSYSTEM SIMULATOR
The Terrestrial Habitats Project at the Western Ecology Division (Corvallis, OR) is developing tools and databases to meet the needs of Program Office clients for assessing risks to wildlife and terrestrial ecosystems. Because habitat is a dynamic condition in real-world environm...
NASA Astrophysics Data System (ADS)
Chen, Tzikang J.; Shiao, Michael
2016-04-01
This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, and repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection scheduling, POD and repair/replacement strategies. Since MC simulations are time-consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
NASA Astrophysics Data System (ADS)
Babendreier, J. E.
2002-05-01
Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. Design of SuperMUSE, a 125 GHz Windows-based Supercomputer for Model Uncertainty and Sensitivity Evaluation is described, along with the conceptual layout of an accompanying java-based paralleling software toolset. Preliminary work is also reported for a scenario involving Benzene disposal that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.
NASA Astrophysics Data System (ADS)
Cvetkovic, V.; Molin, S.
2012-02-01
We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is also shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk differently depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes, and may also serve as a basis for future studies that would include greater complexity.
Cancer risk of polycyclic aromatic hydrocarbons (PAHs) in the soils from Jiaozhou Bay wetland.
Yang, Wei; Lang, Yinhai; Li, Guoliang
2014-10-01
To estimate the cancer risk from exposure to PAHs in Jiaozhou Bay wetland soils, a probabilistic health risk assessment was conducted based on Monte Carlo simulations. A sensitivity analysis was performed to determine the input variables that contribute most to the cancer risk estimate. Three age groups were selected to estimate the cancer risk via four exposure pathways (soil ingestion, food ingestion, dermal contact and inhalation). The results revealed that the 95th percentile cancer risks for children, teens and adults were 9.11×10^-6, 1.04×10^-5 and 7.08×10^-5, respectively. The cancer risks for the three age groups were within the acceptable range (10^-6-10^-4), indicating no potential cancer risk. Among the exposure pathways, food ingestion was the major one. For the 7 carcinogenic PAHs, the cancer risk caused by BaP was the highest. Sensitivity analysis demonstrated that the parameters of exposure duration (ED) and the sum of converted concentrations of the 7 carcinogenic PAHs in soil based on BaPeq (CSsoil) contribute most to the total uncertainty. This study provides a comprehensive risk assessment of carcinogenic PAHs in Jiaozhou Bay wetland soils, and might be useful in providing potential strategies for cancer risk prevention and control. Copyright © 2014 Elsevier Ltd. All rights reserved.
Application of geostatistics to risk assessment.
Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M
2003-10-01
Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.
An integrated model-based approach to the risk assessment of pesticide drift from vineyards
NASA Astrophysics Data System (ADS)
Pivato, Alberto; Barausse, Alberto; Zecchinato, Francesco; Palmeri, Luca; Raga, Roberto; Lavagnolo, Maria Cristina; Cossu, Raffaello
2015-06-01
The inhalation of pesticides in air is of particular concern for people living in close contact with intensive agricultural activities. This study aims to develop an integrated modelling methodology to assess whether pesticides pose a risk to the health of people living near vineyards, and to apply this methodology in the world-renowned Prosecco DOCG (Italian label for protection of origin and geographical indication of wines) region. A sample field in Bigolino di Valdobbiadene (North-Eastern Italy) was selected to perform the pesticide fate modelling and the consequent inhalation risk assessment for people living in the area. The modelling accounts for the direct pesticide loss during the treatment of vineyards and for the volatilization from soil after the end of the treatment. A fugacity model was used to assess the volatilization flux from soil. The Gaussian puff air dispersion model CALPUFF was employed to assess the airborne concentration of the emitted pesticide over the simulation domain. The subsequent risk assessment integrates the HArmonised environmental Indicators for pesticide Risk (HAIR) and US-EPA guidelines. In this case study the modelled situation turned out to be safe from the point of view of human health in the case of non-carcinogenic compounds, and additional improvements were suggested to further mitigate the effect of the most critical compound.
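The dispersion step in such a modelling chain can be illustrated with the basic Gaussian plume ground-level concentration formula (CALPUFF itself is a puff model with far more physics; the power-law dispersion coefficients below are crude placeholders for stability-class curves):

    import math

    def ground_level_concentration(q_g_s, u_m_s, x_m, y_m, h_m):
        """Gaussian plume ground-level concentration (g/m^3) at (x, y) downwind of a source."""
        # Crude power-law dispersion coefficients (placeholders for stability-class curves).
        sigma_y = 0.08 * x_m ** 0.9
        sigma_z = 0.06 * x_m ** 0.85
        return (q_g_s / (math.pi * u_m_s * sigma_y * sigma_z)
                * math.exp(-0.5 * (y_m / sigma_y) ** 2)
                * math.exp(-0.5 * (h_m / sigma_z) ** 2))

    print(f"{ground_level_concentration(q_g_s=0.5, u_m_s=2.0, x_m=200, y_m=0, h_m=2):.2e} g/m^3")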
Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.
2010-01-01
A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI.
Sahoo, Debasis; Robbe, Cyril; Deck, Caroline; Meyer, Frank; Papy, Alexandre; Willinger, Remy
2016-11-01
The main objective of this study is to develop a methodology to assess the head injury risk of non-lethal projectile impacts, based on experimental tests combined with numerical predictive head injury simulations. A total of 16 non-lethal projectile (NLP) impacts were conducted against a rigid force plate at three different ranges of impact velocity (120, 72 and 55 m/s), and the force/deformation-time data were used for the validation of a finite element (FE) NLP model. Good accordance between experimental and simulation data was obtained during validation of the FE NLP, with a high correlation value (>0.98) and a peak force discrepancy of less than 3%. A state-of-the-art finite element head model with enhanced brain and skull material laws and specific head injury criteria was used for numerical computation of NLP impacts. Frontal and lateral FE NLP impacts to the head model at different velocities were performed under LS-DYNA. It is the very first time that the lethality of NLP is assessed by axonal strain computation to predict diffuse axonal injury (DAI) in NLP impacts to the head. In the case of temporo-parietal impacts, the min-max risk of DAI is 0-86%. With a velocity above 99.2 m/s there is a greater than 50% risk of DAI for temporo-parietal impacts. All the medium- and high-velocity impacts are liable to cause skull fracture, with a risk higher than 90%. This study provides a tool for realistic injury (DAI and skull fracture) assessment during NLP impacts to the human head. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.
2004-06-01
The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform, developed by PNNL, Pacific Northwest National Laboratory, that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of module developers to “plug” their self-developed software modules into the system. The basic design, the underlying principles and a discussion of the guidelines for module developers are presented.
Risks from Solar Particle Events for Long Duration Space Missions Outside Low Earth Orbit
NASA Technical Reports Server (NTRS)
Over, S.; Myers, J.; Ford, J.
2016-01-01
The Integrated Medical Model (IMM) simulates the medical occurrences and mission outcomes for various mission profiles using probabilistic risk assessment techniques. As part of the work with the Integrated Medical Model (IMM), this project focuses on radiation risks from acute events during extended human missions outside low Earth orbit (LEO). Of primary importance in acute risk assessment are solar particle events (SPEs), which are low probability, high consequence events that could adversely affect mission outcomes through acute radiation damage to astronauts. SPEs can be further classified into coronal mass ejections (CMEs) and solar flares/impulsive events (Fig. 1). CMEs are an eruption of solar material and have shock enhancements that contribute to make these types of events higher in total fluence than impulsive events.
Human Health Risk Assessment Simulations in a Distributed Environment for Shuttle Launch
NASA Technical Reports Server (NTRS)
Thirumalainambi, Rajkumar; Bardina, Jorge
2004-01-01
During the launch of a rocket under prevailing weather conditions, commanders at the Cape Canaveral Air Force Station evaluate the possibility that wind-blown toxic emissions might reach civilian and military personnel in the nearby area. In our model, we focused mainly on hydrogen chloride (HCl), nitrogen oxides (NOx), and nitric acid (HNO3), which are non-carcinogenic chemicals per the United States Environmental Protection Agency (USEPA) classification. We used the hazard quotient model to estimate the number of people at risk. It is based on the number of people with exposure above a reference exposure level that is unlikely to cause adverse health effects. The risk to the exposed population is calculated by multiplying the individual risk by the number of people exposed. The risk values are compared against acceptable risk values, and a GO or NO-GO decision for the Shuttle launch is made based on those values. The entire model is simulated over the web, and different scenarios can be generated, which allows management to choose an optimum decision.
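A minimal sketch of the hazard-quotient screening logic described above is shown here; the reference exposure levels, simulated concentrations, exposed population, and GO/NO-GO rule are placeholder values for illustration, not the figures used in the Shuttle launch model.

```python
# Hypothetical hazard-quotient screen for one dispersion scenario. Reference levels,
# concentrations, and the GO/NO-GO rule are placeholder values for illustration only.
reference_levels = {"HCl": 2.1, "NOx": 0.47, "HNO3": 0.086}   # mg/m^3 (assumed)
concentrations = {"HCl": 0.9, "NOx": 0.2, "HNO3": 0.03}       # simulated plume (assumed)
exposed_population = 12_000                                    # people downwind (assumed)

def hazard_quotient(conc, ref):
    """HQ = exposure concentration / reference exposure level."""
    return conc / ref

def population_at_risk(individual_risk, population):
    """Population risk = individual risk multiplied by the number of people exposed."""
    return individual_risk * population

for chem, conc in concentrations.items():
    hq = hazard_quotient(conc, reference_levels[chem])
    verdict = "NO-GO" if hq >= 1.0 else "GO"
    print(f"{chem}: HQ = {hq:.2f} -> {verdict}")

# Example population-risk calculation with an assumed individual risk of 0.002
print("people at risk:", population_at_risk(0.002, exposed_population))
```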
Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad
2011-03-01
Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.
Command Process Modeling & Risk Analysis
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2011-01-01
Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.
Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.
2018-01-01
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
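The probabilistic exposure calculation can be sketched as a Monte Carlo propagation of sampled handler characteristics through the dose equation, as below; the distributions and the level of concern are assumed placeholders and do not reproduce the study's fitted inputs from the handler exposure databases.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000   # simulated worker-days

# Sketch only: the lognormal unit-exposure and handling distributions, body-weight
# distribution, and level of concern are assumed placeholders, not the study's inputs.
unit_exposure = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n)   # mg per kg a.i. handled
amount_handled = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)  # kg a.i. handled per day
body_weight = np.clip(rng.normal(80.0, 12.0, size=n), 40.0, 150.0)    # kg

dose = unit_exposure * amount_handled / body_weight    # mg/kg bw/day
level_of_concern = 0.1                                 # assumed NOEL-derived threshold
print(f"fraction exceeding level of concern: {np.mean(dose > level_of_concern):.1%}")
print(f"95th percentile dose: {np.percentile(dose, 95):.3f} mg/kg bw/day")
```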
Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique
2011-02-01
To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units through the successive steps in the process). To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regard to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide important added value for risk assessment models that consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance. © 2010 Society for Risk Analysis.
Development of the AFRL Aircrew Perfomance and Protection Data Bank
2007-12-01
Growth model and statistical model of hypobaric chamber simulations. It offers a quick and readily accessible online DCS risk assessment tool for...are used for the DCS prediction instead of the original model. ADRAC is based on more than 20 years of hypobaric chamber studies using human...prediction based on the combined Bubble Growth model and statistical model of hypobaric chamber simulations was integrated into the Data Bank. It
Risk Assessment of Anthrax Threat Letters
2001-09-01
extent of the hazard. In the experiments, envelopes containing Bacillus globigii spores (a simulant for anthrax) were opened in a mock mail room/office...provide guidance to first responders and other government departments. In this study (non-pathogenic) Bacillus globigii (BG) spore contaminated
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
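The empirical hazard curve described above can be sketched by counting, at each intensity level, the annual rates of the simulated events that exceed it; the event rates and depth distribution below are invented placeholders rather than outputs of the paper's stochastic source model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirical hazard-curve sketch with invented inputs (not the paper's source model):
# each stochastic event carries an annual occurrence rate and a simulated depth.
n_events = 5000
rate_per_event = 0.002                                        # assumed mean annual rate per scenario
depths = rng.lognormal(mean=0.0, sigma=0.8, size=n_events)    # simulated inundation depths (m)

depth_grid = np.linspace(0.5, 8.0, 16)
# Mean annual rate of exceedance: total rate of all events exceeding each depth level
exceedance_rate = np.array([rate_per_event * np.sum(depths > d) for d in depth_grid])

for d, lam in zip(depth_grid, exceedance_rate):
    print(f"depth > {d:4.1f} m : mean annual rate = {lam:.4f}")
```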
Montz, Ellen; Layton, Tim; Busch, Alisa B.; Ellis, Randall P.; Rose, Sherri; McGuire, Thomas G.
2016-01-01
Under the Affordable Care Act, the risk-adjustment program is designed to compensate health plans for enrolling people with poorer health status so that plans compete on cost and quality rather than the avoidance of high-cost individuals. This study examined health plan incentives to limit covered services for mental health and substance use disorders under the risk-adjustment system used in the health insurance Marketplaces. Through a simulation of the program on a population constructed to reflect Marketplace enrollees, we analyzed the cost consequences for plans enrolling people with mental health and substance use disorders. Our assessment points to systematic underpayment to plans for people with these diagnoses. We document how Marketplace risk adjustment does not remove incentives for plans to limit coverage for services associated with mental health and substance use disorders. Adding mental health and substance use diagnoses used in Medicare Part D risk adjustment is one potential policy step toward addressing this problem in the Marketplaces. PMID:27269018
Moorthy, Krishna; Munz, Yaron; Adams, Sally; Pandey, Vikas; Darzi, Ara
2005-01-01
Background: High-risk organizations such as aviation rely on simulations for the training and assessment of technical and team performance. The aim of this study was to develop a simulated environment for surgical trainees using similar principles. Methods: A total of 27 surgical trainees carried out a simulated procedure in a Simulated Operating Theatre with a standardized OR team. Observation of OR events was carried out by an unobtrusive data collection system: clinical data recorder. Assessment of performance consisted of blinded rating of technical skills, a checklist of technical events, an assessment of communication, and a global rating of team skills by a human factors expert and trained surgical research fellows. The participants underwent a debriefing session, and the face validity of the simulated environment was evaluated. Results: While technical skills rating discriminated between surgeons according to experience (P = 0.002), there were no differences in terms of the checklist and team skills (P = 0.70). While all trainees were observed to gown/glove and handle sharps correctly, low scores were observed for some key features of communication with other team members. Low scores were obtained by the entire cohort for vigilance. Interobserver reliability was 0.90 and 0.89 for technical and team skills ratings. Conclusions: The simulated operating theatre could serve as an environment for the development of surgical competence among surgical trainees. Objective, structured, and multimodal assessment of performance during simulated procedures could serve as a basis for focused feedback during training of technical and team skills. PMID:16244534
REVIEW OF SIMULATION METHODS FOR SPATIALLY-EXPLICIT POPULATION-LEVEL RISK ASSESSMENT
Factors that significantly impact population dynamics, such as resource availability and exposure to stressors, frequently vary over space and thereby determine the heterogeneous spatial distributions of organisms. Considering this fact, the US Environmental Protection Agency's ...
Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons
Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...
3-D SIMULATIONS OF AIRWAYS WITHIN HUMAN LUNGS
Information regarding the deposition patterns of inhaled particles has important application to the fields of toxicology and medicine. The former concerns the risk assessment of inhaled air pollutants (inhalation toxicology); the latter concerns the targeted delivery of inhaled ...
Simulating future residential property losses from wildfire in Flathead County, Montana: Chapter 1
Prato, Tony; Paveglio, Travis B; Barnett, Yan; Silverstein, Robin; Hardy, Michael; Keane, Robert; Loehman, Rachel A.; Clark, Anthony; Fagre, Daniel B.; Venn, Tyron; Stockmann, Keith
2014-01-01
Wildfire damages to private residences in the United States and elsewhere have increased as a result of expansion of the wildland-urban interface (WUI) and other factors. Understanding this unwelcome trend requires analytical frameworks that simulate how various interacting social, economic, and biophysical factors influence those damages. A methodological framework is developed for simulating expected residential property losses from wildfire [E(RLW)], which is a probabilistic monetary measure of wildfire risk to residential properties in the WUI. E(RLW) is simulated for Flathead County, Montana, for five 10-year subperiods covering the period 2010-2059, under various assumptions about future climate change, economic growth, land use policy, and forest management. Results show statistically significant increases in the spatial extent of WUI properties, the number of residential structures at risk from wildfire, and E(RLW) over the 50-year evaluation period for both the county and smaller subareas (i.e., neighborhoods and parcels). The E(RLW) simulation framework presented here advances the field of wildfire risk assessment by providing a finer-scale tool that incorporates a set of dynamic, interacting processes. The framework can be applied using other scenarios for climate change, economic growth, land use policy, and forest management, and in other areas.
Risk assessment of debris flow in Yushu seismic area in China: a perspective for the reconstruction
NASA Astrophysics Data System (ADS)
Lan, H. X.; Li, L. P.; Zhang, Y. S.; Gao, X.; Liu, H. J.
2013-11-01
The 14 April 2010 Ms = 7.1 Yushu Earthquake (YE) caused severe damage in the Jiegu township, the residential centre of Yushu Tibetan Autonomous Prefecture, Qinghai Province, China. In view of the fragile geological conditions after the YE, risk assessment of secondary geohazards has become an important concern for the reconstruction. A quantitative methodology was developed to assess the risk of debris flow by taking into account important intensity information. Debris flow scenarios were simulated for rainfall events with return periods of 10, 50 and 100 yr, respectively. The possible economic loss and fatalities caused by damage to buildings were assessed both in the settlement area and in the low-hazard settlement area for the simulated debris flow events. Three modelled building types were adopted, i.e. hollow brick wood (HBW), hollow brick concrete (HBC) and reinforced concrete (RC) buildings. The results suggest that the HBC structure achieves a good cost-benefit balance compared with the HBW and RC structures and thus could be an optimal choice for most of the new residential buildings in the Jiegu township. The low-hazard boundary provides significant risk reduction in the 100 yr return period debris flow event. In addition, the societal risk for the settlement area is unacceptable when the 100 yr return period event occurs but reduces to the ALARP (as low as reasonably practicable) level when the low-hazard area is considered. Therefore, the low-hazard area was highly recommended to be taken into account in the reconstruction. Yet, the societal risk might indeed approach an unacceptable level if one considers that the YE has inevitably increased the occurrence frequency of debris flows. The quantitative results should be treated as a perspective for the reconstruction rather than precise numbers of future losses, owing to the complexity of the problem and the deficiency of data.
[Updating the problems of human ecology and environmental health and the ways of solving them].
Rakhmanin, Iu A
2012-01-01
The paper surveys the variety of scientific areas that study the influence of the environment on human health: the current state of, and modern issues in, the assessment of environmental quality, hygienic standardization of chemical and biological contamination, methodological support for sanitary and health monitoring, and risk assessment of pollution and environmental health. It highlights the need to improve, and to harmonize with international instruments, the legal and methodological framework for the protection of the human environment, and to develop a modern system for managing environmental quality based on epidemiological methods, simulation, risk analysis, and assessment of the economic damage to the environment and to the health of the population, forming a new branch of medicine: environmental medicine.
Maljovec, D.; Liu, S.; Wang, B.; ...
2015-07-14
Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
The dark side of photovoltaic — 3D simulation of glare assessing risk and discomfort
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Thomas; Wollert, Alexander
2015-04-15
Photovoltaic (PV) systems form an important force in the implementation of renewable energies, but as we all know, the force always has its dark side. Besides efficiency considerations and discussions about architectures of power distribution networks, the increasing number of installations of PV systems for implementing renewable energies has secondary effects. PV systems can generate glare due to optical reflections and hence might be a serious concern. On the one hand, glare could affect safety, e.g. regarding traffic. On the other hand, glare is a constant source of discomfort in the vicinity of PV systems. Hence, assessment of glare is decisive for the success of solar power near municipalities and traffic zones. Several courts have decided on the modification of PV systems and even on their de-installation because of glare effects. Thus, location-based assessments are required to limit potential reflections and to avoid risks for public infrastructure or discomfort of residents. The question arises of how to calculate reflections accurately according to the environment's topography. Our approach is founded on a 3D-based simulation methodology to calculate and visualize reflections based on the geometry of the environment of PV systems. This computational model is implemented by an interactive tool for simulation and visualization. Hence, project planners receive flexible assistance for adjusting the parameters of solar panels amid the planning process and in particular before the installation of a PV system. - Highlights: • Solar panels cause glare that impacts neighborhoods and traffic infrastructures. • Glare might cause disability and discomfort. • 3D environment for the calculation of glare • Interactive tool to simulate and visualize reflections • Impact assessment of solar power plant farms.
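At the core of such a geometric glare calculation is the specular reflection of the sun ray about the panel's surface normal. The short sketch below shows only that single step, with an assumed sun direction and panel tilt; a full tool would add the site topography, obstruction tests, and luminance modelling on top of it.

```python
import numpy as np

def reflect(incident, normal):
    """Specular reflection of an incident ray direction about a surface normal."""
    incident = incident / np.linalg.norm(incident)
    normal = normal / np.linalg.norm(normal)
    return incident - 2.0 * np.dot(incident, normal) * normal

# Assumed geometry for illustration: a late-afternoon sun ray and a panel
# tilted 30 degrees from horizontal.
sun_ray = np.array([0.0, 0.5, -0.87])                 # direction the sunlight travels
tilt = np.radians(30.0)
panel_normal = np.array([0.0, -np.sin(tilt), np.cos(tilt)])
print("reflected ray direction:", np.round(reflect(sun_ray, panel_normal), 3))
```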
Hydrocarbon characterization experiments in fully turbulent fires : results and data analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suo-Anttila, Jill Marie; Blanchat, Thomas K.
As the capabilities of numerical simulations increase, decision makers are increasingly relying upon simulations rather than experiments to assess risks across a wide variety of accident scenarios including fires. There are still, however, many aspects of fires that are either not well understood or are difficult to treat from first principles due to the computational expense. For a simulation to be truly predictive and to provide decision makers with information which can be reliably used for risk assessment the remaining physical processes must be studied and suitable models developed for the effects of the physics. The model for the fuel evaporation rate in a liquid fuel pool fire is significant because in well-ventilated fires the evaporation rate largely controls the total heat release rate from the fire. This report describes a set of fuel regression rates experiments to provide data for the development and validation of models. The experiments were performed with fires in the fully turbulent scale range (> 1 m diameter) and with a number of hydrocarbon fuels ranging from lightly sooting to heavily sooting. The importance of spectral absorption in the liquid fuels and the vapor dome above the pool was investigated and the total heat flux to the pool surface was measured. The importance of convection within the liquid fuel was assessed by restricting large scale liquid motion in some tests. These data sets provide a sound, experimentally proven basis for assessing how much of the liquid fuel needs to be modeled to enable a predictive simulation of a fuel fire given the couplings between evaporation of fuel from the pool and the heat release from the fire which drives the evaporation.
Hydrocarbon characterization experiments in fully turbulent fires.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricks, Allen; Blanchat, Thomas K.
As the capabilities of numerical simulations increase, decision makers are increasingly relying upon simulations rather than experiments to assess risks across a wide variety of accident scenarios including fires. There are still, however, many aspects of fires that are either not well understood or are difficult to treat from first principles due to the computational expense. For a simulation to be truly predictive and to provide decision makers with information which can be reliably used for risk assessment the remaining physical processes must be studied and suitable models developed for the effects of the physics. The model for the fuel evaporation rate in a liquid fuel pool fire is significant because in well-ventilated fires the evaporation rate largely controls the total heat release rate from the fire. A set of experiments are outlined in this report which will provide data for the development and validation of models for the fuel regression rates in liquid hydrocarbon fuel fires. The experiments will be performed on fires in the fully turbulent scale range (> 1 m diameter) and with a number of hydrocarbon fuels ranging from lightly sooting to heavily sooting. The importance of spectral absorption in the liquid fuels and the vapor dome above the pool will be investigated and the total heat flux to the pool surface will be measured. The importance of convection within the liquid fuel will be assessed by restricting large scale liquid motion in some tests. These data sets will provide a sound, experimentally proven basis for assessing how much of the liquid fuel needs to be modeled to enable a predictive simulation of a fuel fire given the couplings between evaporation of fuel from the pool and the heat release from the fire which drives the evaporation.
Bayesian joint modelling of benefit and risk in drug development.
Costa, Maria J; Drury, Thomas
2018-05-01
To gain regulatory approval, a new medicine must demonstrate that its benefits outweigh any potential risks, i.e., that the benefit-risk balance is favourable towards the new medicine. For transparency and clarity of the decision, a structured and consistent approach to benefit-risk assessment that quantifies uncertainties and accounts for underlying dependencies is desirable. This paper proposes two approaches to benefit-risk evaluation, both based on the idea of joint modelling of mixed outcomes that are potentially dependent at the subject level. Using Bayesian inference, the two approaches offer interpretability and efficiency to enhance qualitative frameworks. Simulation studies show that accounting for correlation leads to a more accurate assessment of the strength of evidence to support benefit-risk profiles of interest. Several graphical approaches are proposed that can be used to communicate the benefit-risk balance to project teams. Finally, the two approaches are illustrated in a case study using real clinical trial data. Copyright © 2018 John Wiley & Sons, Ltd.
Effects of dexamphetamine with and without alcohol on simulated driving.
Simons, Ries; Martens, Marieke; Ramaekers, Jan; Krul, Arno; Klöpping-Ketelaars, Ineke; Skopp, Gisela
2012-08-01
On the party circuit, dexamphetamine is frequently used in combination with alcohol. It is hypothesized that co-administration of dexamphetamine with alcohol might reduce the sedative effects of alcohol but may potentiate risk-taking behaviour. The study was aimed at assessing the effects of alcohol, dexamphetamine, and the combination of both on simulated driving and cognitive performance. Eighteen subjects participated in a randomized, crossover, placebo-controlled study employing four conditions: 10 mg dexamphetamine, 0.8 g/kg alcohol, 10 mg dexamphetamine + 0.8 g/kg alcohol, and placebo. Fundamental driving skills and risk-taking behaviour were assessed in a driving simulator. Subjects also completed vigilance and divided attention tasks, and subjective ratings. Mean BAC levels during simulated driving were between 0.91‰ and 0.64‰. Subjects using alcohol showed a significantly larger mean standard deviation of lateral position and shorter accepted gap time and distance. Use of alcohol or dexamphetamine + alcohol was associated with a higher frequency of red light running and collisions than the dexamphetamine or placebo conditions. Performance of vigilance and divided attention tasks was significantly impaired in the alcohol condition and, to a lesser degree, in the dexamphetamine + alcohol condition. Single doses of 0.8 g/kg alcohol increased risk-taking behaviours and impaired tracking, attention, and reaction time during a 3-h period after drinking when BACs declined from 0.9 to 0.2 mg/ml. The stimulatory effects of co-administration of dexamphetamine 10 mg were not sufficient to overcome the impairing effects of alcohol on skills related to driving.
Ceacareanu, Alice C.; Brown, Geoffrey W.; Moussa, Hoda A.; Wintrob, Zachary A. P.
2018-01-01
Objective: We aimed to estimate the metformin-associated lactic acidosis (MALA) risk by assessing retrospectively the renal clearance variability and applying a pharmacokinetic (PK) model of metformin clearance in a population diagnosed with acute myeloid leukemia (AML) and diabetes mellitus (DM). Methods: All adults with preexisting DM and newly diagnosed AML at Roswell Park Cancer Institute were reviewed (January 2003–December 2010, n = 78). Creatinine clearance (CrCl) and total body weight distributions were used in a two-compartment PK model adapted for multiple dosing and modified to account for actual intra- and inter-individual variability. Based on this renal function variability evidence, 1000 PK profiles were simulated for multiple metformin regimens with the resultant PK profiles being assessed for safe CrCl thresholds. Findings: Metformin 500 mg up to three times daily was safe for all simulated profiles with CrCl ≥25 mL/min. Furthermore, the estimated overall MALA risk was below 10%, remaining under 5% for 500 mg given once daily. CrCl ≥65.25 mL/min was safe for administration in any of the tested regimens (500 mg or 850 mg up to three times daily or 1000 mg up to twice daily). Conclusion: PK simulation-guided prescribing can maximize metformin's beneficial effects on cancer outcomes while minimizing MALA risk. PMID:29755998
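A much simplified version of this kind of simulation-guided dosing check is sketched below with a one-compartment model (the study itself uses a two-compartment model adapted for multiple dosing); the clearance scaling, volume of distribution, bioavailability, and concentration threshold are assumed illustration values, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(11)

# One-compartment simplification of the dosing check (the study uses a two-compartment
# model); clearance scaling, volume, bioavailability, and the concentration threshold
# are assumed illustration values, not the paper's fitted parameters.
n = 1000
crcl = np.clip(rng.normal(60.0, 25.0, size=n), 10.0, 150.0)   # simulated CrCl, mL/min

def steady_state_cmax(dose_mg, crcl_ml_min, interval_h=12.0, vd_l=300.0, f=0.55):
    cl_l_h = 0.3 * crcl_ml_min                   # assumed renal clearance scaling
    ke = cl_l_h / vd_l                           # elimination rate constant, 1/h
    c_single = f * dose_mg / vd_l                # concentration after one dose, mg/L
    return c_single / (1.0 - np.exp(-ke * interval_h))   # accumulation to steady state

cmax = steady_state_cmax(500.0, crcl)
threshold = 5.0                                  # mg/L, assumed safety threshold
print(f"fraction of profiles above the assumed threshold: {np.mean(cmax > threshold):.1%}")
```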
A Monte Carlo risk assessment model for acrylamide formation in French fries.
Cummins, Enda; Butler, Francis; Gormley, Ronan; Brunton, Nigel
2009-10-01
The objective of this study is to estimate the likely human exposure to the group 2a carcinogen, acrylamide, from French fries by Irish consumers by developing a quantitative risk assessment model using Monte Carlo simulation techniques. Various stages in the French-fry-making process were modeled from initial potato harvest, storage, and processing procedures. The model was developed in Microsoft Excel with the @Risk add-on package. The model was run for 10,000 iterations using Latin hypercube sampling. The simulated mean acrylamide level in French fries was calculated to be 317 microg/kg. It was found that females are exposed to smaller levels of acrylamide than males (mean exposure of 0.20 microg/kg bw/day and 0.27 microg/kg bw/day, respectively). Although the carcinogenic potency of acrylamide is not well known, the simulated probability of exceeding the average chronic human dietary intake of 1 microg/kg bw/day (as suggested by WHO) was 0.054 and 0.029 for males and females, respectively. A sensitivity analysis highlighted the importance of the selection of appropriate cultivars with known low reducing sugar levels for French fry production. Strict control of cooking conditions (correlation coefficient of 0.42 and 0.35 for frying time and temperature, respectively) and blanching procedures (correlation coefficient -0.25) were also found to be important in ensuring minimal acrylamide formation.
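The core of such a model is a Monte Carlo propagation of input distributions through the exposure equation using Latin hypercube sampling; the sketch below shows the pattern with invented lognormal and normal parameters standing in for the study's @Risk inputs for acrylamide concentration, consumption, and body weight.

```python
import numpy as np
from scipy.stats import qmc, lognorm, norm

# Monte Carlo exposure sketch with Latin hypercube sampling; all distribution
# parameters are assumed placeholders, not the study's @Risk inputs.
n = 10_000
u = qmc.LatinHypercube(d=3, seed=42).random(n)           # stratified uniforms in [0, 1)

acrylamide = lognorm.ppf(u[:, 0], s=0.6, scale=300.0)    # ug/kg in French fries
serving = lognorm.ppf(u[:, 1], s=0.5, scale=0.10)        # kg of fries eaten per day
body_weight = np.clip(norm.ppf(u[:, 2], loc=70.0, scale=12.0), 40.0, 120.0)   # kg

exposure = acrylamide * serving / body_weight            # ug/kg bw/day
print(f"mean exposure: {exposure.mean():.3f} ug/kg bw/day")
print(f"P(exposure > 1 ug/kg bw/day) = {np.mean(exposure > 1.0):.3f}")
```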
Improving Project Management with Simulation and Completion Distribution Functions
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2004-01-01
Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
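The essence of PAST, a stochastic network simulation that yields a completion-time distribution, can be sketched in a few lines; the toy activity network, triangular duration-growth distributions, and thresholds below are assumptions for illustration and are unrelated to the actual International Space Station assembly schedule.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy activity network with triangular duration distributions and finish-to-start
# precedence; entries are listed in a valid topological order.
activities = {   # name: (predecessors, (min, most likely, max) duration in days)
    "A": ((), (10, 12, 20)),
    "B": (("A",), (5, 8, 15)),
    "C": (("A",), (7, 9, 14)),
    "D": (("B", "C"), (3, 4, 9)),
}

def simulate_once():
    finish = {}
    for name, (preds, (lo, mode, hi)) in activities.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + rng.triangular(lo, mode, hi)
    return max(finish.values())

completions = np.array([simulate_once() for _ in range(10_000)])
print(f"median completion: {np.median(completions):.1f} days")
print(f"P(completion > 40 days) = {np.mean(completions > 40):.2%}")
```

Repeating the network simulation many times and reading off exceedance probabilities is what turns a single schedule into a completion distribution function that stakeholders can manage against.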
Using driving simulators to assess driving safety.
Boyle, Linda Ng; Lee, John D
2010-05-01
Changes in drivers, vehicles, and roadways pose substantial challenges to the transportation safety community. Crash records and naturalistic driving data are useful for examining the influence of past or existing technology on drivers, and the associations between risk factors and crashes. However, they are limited because causation cannot be established and technology not yet installed in production vehicles cannot be assessed. Driving simulators have become an increasingly widespread tool to understand evolving and novel technologies. The ability to manipulate independent variables in a randomized, controlled setting also provides the added benefit of identifying causal links. This paper introduces a special issue on simulator-based safety studies. The special issue comprises 25 papers that demonstrate the use of driving simulators to address pressing transportation safety problems and includes topics as diverse as neurological dysfunction, work zone design, and driver distraction. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Design Analysis Kit for Optimization and Terascale Applications 6.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
Vehicle response-based track geometry assessment using multi-body simulation
NASA Astrophysics Data System (ADS)
Kraft, Sönke; Causse, Julien; Coudert, Frédéric
2018-02-01
The assessment of the geometry of railway tracks is an indispensable requirement for safe rail traffic. Defects which represent a risk for the safety of the train have to be identified and the necessary measures taken. According to current standards, amplitude thresholds are applied to the track geometry parameters measured by recording cars. This geometry-based assessment has proved its value but suffers from the low correlation between the geometry parameters and the vehicle reactions. Experience shows that some defects leading to critical vehicle reactions are underestimated by this approach. The use of vehicle responses in the track geometry assessment process allows identifying critical defects and improving the maintenance operations. This work presents a vehicle response-based assessment method using multi-body simulation. The choice of the relevant operation conditions and the estimation of the simulation uncertainty are outlined. The defects are identified from exceedances of track geometry and vehicle response parameters. They are then classified using clustering methods and the correlation with vehicle response is analysed. The use of vehicle responses allows the detection of critical defects which are not identified from geometry parameters.
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Hooper, Russell
2016-11-01
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers Consortium for Advanced Simulation of Light Water Reactors (CASL) partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.
McDonald, Catherine C; Kandadai, Venk; Loeb, Helen; Seacrist, Thomas; Lee, Yi-Ching; Bonfiglio, Dana; Fisher, Donald L; Winston, Flaura K
Collisions at left-turn intersections are among the most prevalent types of serious teen driver crashes, with inadequate surveillance as a key factor. Risk awareness perception training (RAPT) has shown effectiveness in improving hazard anticipation for latent hazards. The goal of this study was to determine if RAPT version 3 (RAPT-3) improved intersection turning behaviors among novice teen drivers when the hazards were not latent and frequent glancing to multiple locations at the intersection was needed. Teens aged 16-18 with ≤180 days of licensure were randomly assigned to: 1) an intervention group (n=18) that received RAPT-3 (Trained); or 2) a control group (n=19) that received no training (Untrained). Both groups completed the RAPT-3 Baseline Assessment, and the Trained group completed RAPT-3 Training and the RAPT-3 Post Assessment. Training effects were evaluated on a driving simulator. Simulator (gap selection errors and collisions) and eye tracker (traffic check errors) metrics from six left-turn stop-sign-controlled intersections in the Simulated Driving Assessment (SDA) were analyzed. The Trained group scored significantly higher in the RAPT-3 Post Assessment than in the RAPT-3 Baseline Assessment (p < 0.0001). There were no significant differences in traffic check errors, gap selection errors, or collisions between Trained and Untrained teens in the SDA. Though Trained teens learned about hazard anticipation related to latent hazards, the learning did not translate into performance differences at left-turn stop-sign-controlled intersections where the hazards were not latent. Our findings point to further research to better understand the challenges teens have with left-turn intersections.
Violence prevention education program for psychiatric outpatient departments.
Feinstein, Robert E
2014-10-01
Approximately 40% of psychiatrists and up to 64% of psychiatric residents have been physically assaulted, and 72-96% of psychiatric residents in various studies have been verbally threatened. Because violence risk also occurs in outpatient settings, our department developed a quality and safety curriculum designed to prepare psychiatric residents and staff to respond optimally to aggressive outpatients and to violence threats or events. In 2011 and 2012, we offered an 8-part violence prevention performance improvement curriculum/program including (1) situational awareness/creating a safe environment; (2) violence de-escalation training; (3) violence risk assessment training, use of risk assessment tools, and medical record documentation; (4) violence safety discharge planning; (5) legal issues and violence; (6) "shots fired on campus" video/discussion; (7) "2011 violence threat simulation" video/discussion; and (8) a violence threat simulation exercise. This program was offered to approximately 60 psychiatric residents/staff in each year. We obtained qualitative comments about the entire program and data from 2 years of post-event surveys on the usefulness of the violence threat simulation exercise. The large majority of comments about program elements 1 to 7 were positive. In 2011 and 2012, respectively, 76% and 86% of participants responded to a post-event survey of the violence threat simulation exercise; 90% and 88% of participants, respectively, reported the simulation to be very helpful/somewhat helpful; and 86% and 82% of participants, respectively, reported feeling much better/better prepared to deal with a violent event. Although some participants experienced anxiety, sleep disturbances, an increase in work safety concerns, and/or traumatic memories, the majority reported no post-simulation symptoms (72% and 80%, respectively). Although we are unable to demonstrate that this program effectively prevents violence, the overall positive response from participants encourages us to continue developing our quality and safety program and to offer our easily reproducible and modifiable curriculum to others.
"No-Go Considerations" for In Situ Simulation Safety.
Bajaj, Komal; Minors, Anjoinette; Walker, Katie; Meguerdichian, Michael; Patterson, Mary
2018-06-01
In situ simulation is the practice of simulation in the actual clinical environment and has demonstrated utility in the assessment of system processes, identification of latent safety threats, and improvement in teamwork and communication. Nonetheless, performing simulated events in a real patient care setting poses potential risks to patient and staff safety. One integral aspect of a comprehensive approach to ensure the safety of in situ simulation includes the identification and establishment of "no-go considerations," that is, key decision-making considerations under which in situ simulations should be canceled, postponed, moved to another area, or rescheduled. These considerations should be modified and adjusted to specific clinical units. This article provides a framework of key essentials in developing no-go considerations.
Mallaina, Pablo; Lionis, Christos; Rol, Hugo; Imperiali, Renzo; Burgess, Andrew; Nixon, Mark; Malvestiti, Franco Mondello
2013-04-18
Smoking is a major risk factor for cardiovascular disease (CVD). This multicenter, cross-sectional survey was designed to estimate the cardiovascular (CV) risk attributable to smoking using risk assessment tools, to better understand patient behaviors and characteristics related to smoking, and to characterize physician practice patterns. A total of 1,439 smokers were recruited from Europe during 2011. Smokers were ≥40 years old, smoked >10 cigarettes/day, and had recent measurements of blood pressure and lipids. CV risk was calculated using the SCORE system, the Framingham risk equations, and the Progetto CUORE model. The CV risk attributable to smoking was evaluated using a simulated control (hypothetical non-smoker) with characteristics identical to those of the enrolled smoker. Risks assessed included CV mortality, coronary heart disease (CHD), CVD, and hard CHD. Demographics, comorbidities, primary reasons for consultation, behavior towards previous attempts to quit, and interest in smoking cessation were assessed. Dependence on nicotine was evaluated using the Fagerström Test for Nicotine Dependence. GP practice patterns were assessed through a questionnaire. The prediction models consistently demonstrated a high CV risk attributable to smoking. For instance, the SCORE model demonstrated that this study population of smokers has a 100% increased probability of death due to cardiovascular disease in the next 10 years compared with non-smokers. A considerable proportion of patients would like to hear from their GP about the different alternatives available to support their quitting attempt. The findings of this study reinforce the importance of smoking as a significant predictor of long-term cardiovascular events. One of the greatest gains in health could be obtained by tackling the most important modifiable risk factors; these results suggest smoking is among the most important.
Zheng, Yi; Lin, Zhongrong; Li, Hao; Ge, Yan; Zhang, Wei; Ye, Youbin; Wang, Xuejun
2014-05-15
Urban stormwater runoff delivers a significant amount of polycyclic aromatic hydrocarbons (PAHs), mostly of atmospheric origin, to receiving water bodies. The PAH pollution of urban stormwater runoff poses a serious risk to aquatic life and human health, but has been overlooked by environmental modeling and management. This study proposed a dynamic modeling approach for assessing PAH pollution and its associated environmental risk. A variable time-step model was developed to simulate the continuous cycles of pollutant buildup and washoff. To reflect the complex interaction among different environmental media (i.e., atmosphere, dust, and stormwater), the dependence of the pollution level on antecedent weather conditions was investigated and embodied in the model. Long-term simulations of the model can be performed efficiently, and probabilistic features of the pollution level and its risk can be easily determined. The applicability of this approach and its value to environmental management were demonstrated by a case study in Beijing, China. The results showed that Beijing's PAH pollution of road runoff is relatively severe, and its associated risk exhibits notable seasonal variation. The current sweeping practice is effective in mitigating the pollution, but the effectiveness is both weather-dependent and compound-dependent. The proposed modeling approach can help identify critical timing and major pollutants for monitoring, assessment and control efforts to focus on. The approach is extendable to other urban areas, as well as to other contaminants with fate and transport similar to PAHs. Copyright © 2014 Elsevier B.V. All rights reserved.
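A minimal buildup/washoff cycle of the kind such a model is built around can be sketched as below; the exponential buildup and first-order washoff forms are standard textbook choices, and the coefficients are assumptions for illustration rather than the calibrated Beijing parameters.

```python
import numpy as np

def simulate_washoff(rain_mm, b_max=2.0, k_build=0.2, k_wash=0.1):
    """Daily washoff load (g/m^2) from exponential buildup and first-order washoff.

    Coefficients are assumed illustration values, not the calibrated Beijing parameters.
    """
    buildup, loads = 0.0, []
    for r in rain_mm:
        # dry-weather buildup toward an asymptotic surface load
        buildup += (b_max - buildup) * (1.0 - np.exp(-k_build))
        # rainfall-driven washoff removes a fraction of the accumulated load
        washed = buildup * (1.0 - np.exp(-k_wash * r))
        buildup -= washed
        loads.append(washed)
    return np.array(loads)

rain = np.array([0, 0, 12, 0, 0, 0, 30, 0, 2, 0], dtype=float)   # mm/day, assumed series
print(np.round(simulate_washoff(rain), 3))
```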
Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators
NASA Astrophysics Data System (ADS)
Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis
Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding to within 20 percent accuracy compared with simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.
Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.
2014-01-01
Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3-years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.
NASA Astrophysics Data System (ADS)
Hartmann, Andreas; Gleeson, Tom; Wada, Yoshihide; Wagener, Thorsten
2016-04-01
Karst develops through the dissolution of carbonate rock. Karst groundwater in Europe is a major source of fresh water, contributing up to half of the total drinking water supply in some countries. Climate model projections suggest that in the next 100 years, karst regions will experience a strong increase in temperature and a serious decrease in precipitation, especially in the Mediterranean region. Previous work showed that karstic preferential recharge processes result in enhanced recharge rates and future climate sensitivity. But as there is fast water flow from the surface to the aquifer, there is also an enhanced risk of groundwater contamination. In this study we assess the contamination risk of karst aquifers over Europe and the Mediterranean using simulated transit time distributions. Using a new type of semi-distributed model that represents the spatial heterogeneity of the karst system by distribution functions, we simulated a range of spatially variable pathways of karstic groundwater recharge. The model is driven by the five bias-corrected GCMs of the ISI-MIP project (RCP8.5). Transit time distributions are calculated by virtual tracer experiments. These are repeated several times in the present (1991-2010) and the future (2080-2099). We show that regions with larger fractions of preferential recharge show higher risks of contamination and that spatial patterns of contamination risk change towards the future.
NASA Technical Reports Server (NTRS)
Jung, Jaewoo; D'Souza, Sarah N.; Johnson, Marcus A.; Ishihara, Abraham K.; Modi, Hemil C.; Nikaido, Ben; Hasseeb, Hashmatullah
2016-01-01
In anticipation of a rapid increase in the number of civil Unmanned Aircraft System (UAS) operations, NASA is researching prototype technologies for a UAS Traffic Management (UTM) system that will investigate airspace integration requirements for enabling safe, efficient low-altitude operations. One aspect a UTM system must consider is the correlation between UAS operations (such as vehicles, operation areas, and durations), UAS performance requirements, and the risk to people and property in the operational area. This paper investigates the potential application of the International Civil Aviation Organization's (ICAO) Required Navigation Performance (RNP) concept to relate operational risk with trajectory conformance requirements. The approach is to first define a method to quantify operational risk and then define the RNP level requirement as a function of the operational risk. Greater operational risk corresponds to a more stringent RNP level, i.e., a smaller tolerable Total System Error (TSE). Data from 19 small UAS flights are used to develop and validate a formula that defines this relationship. An approach to assessing UAS-RNP conformance capability using vehicle modeling and wind field simulation is developed to investigate how this formula may be applied in a future UTM system. The results indicate the modeled vehicle's flight path is robust to the simulated wind variation and can meet the RNP level requirements calculated by the formula. The results also indicate how vehicle-modeling fidelity may be improved to adequately verify the assessed RNP level.
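The paper's risk-to-RNP formula is not reproduced in the abstract, so the sketch below uses a made-up linear mapping from a normalized risk score to a tolerable TSE bound, together with a simple conformance check on simulated cross-track error; only the general pattern (higher risk, tighter containment, fraction of samples inside the bound) reflects the approach described.

```python
import numpy as np

rng = np.random.default_rng(3)

def required_tse_bound(operational_risk, loose_m=50.0, tight_m=5.0):
    """Map a normalized risk score in [0, 1] to a tolerable TSE bound in metres.

    This linear form is a placeholder, not the formula fitted to the 19 sUAS flights.
    """
    return loose_m - operational_risk * (loose_m - tight_m)

def conformance_rate(total_system_error, bound_m):
    """Fraction of trajectory samples whose |TSE| stays within the containment bound."""
    return np.mean(np.abs(total_system_error) <= bound_m)

tse = rng.normal(0.0, 6.0, size=1000)              # simulated cross-track TSE samples (m)
bound = required_tse_bound(operational_risk=0.8)   # higher-risk area, tighter bound
print(f"required TSE bound: {bound:.1f} m, conformance: {conformance_rate(tse, bound):.1%}")
```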
Asteroid-Generated Tsunami and Impact Risk
NASA Astrophysics Data System (ADS)
Boslough, M.; Aftosmis, M.; Berger, M. J.; Ezzedine, S. M.; Gisler, G.; Jennings, B.; LeVeque, R. J.; Mathias, D.; McCoy, C.; Robertson, D.; Titov, V. V.; Wheeler, L.
2016-12-01
The justification for planetary defense comes from a cost/benefit analysis, which includes risk assessment. The contribution from ocean impacts and airbursts is difficult to quantify and represents a significant uncertainty in our assessment of the overall risk. Our group is currently working toward improved understanding of impact scenarios that can generate dangerous tsunami. The importance of asteroid-generated tsunami research has increased because a new Science Definition Team, at the behest of NASA's Planetary Defense Coordination Office, is now updating the results of a 2003 study on which our current planetary defense policy is based. Our group was formed to address this question on many fronts, including asteroid entry modeling, tsunami generation and propagation simulations, modeling of coastal run-ups, inundation, and consequences, infrastructure damage estimates, and physics-based probabilistic impact risk assessment. We also organized the Second International Workshop on Asteroid Threat Assessment, focused on asteroid-generated tsunami and associated risk (Aug. 23-24, 2016). We will summarize our progress and present the highlights of our workshop, emphasizing its relevance to earth and planetary science. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Fan, Qixiang; Qiang, Maoshan
2014-01-01
The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites is proposed in this paper. The method provides construction managers who are in charge of safety with richer information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns. PMID:25114958
Assessing cadmium exposure risks of vegetables with plant uptake factor and soil property.
Yang, Yang; Chang, Andrew C; Wang, Meie; Chen, Weiping; Peng, Chi
2018-07-01
Plant uptake factors (PUFs) are of great importance in human cadmium (Cd) exposure risk assessment, yet they have often been treated in a generic way. We collected 1077 pairs of vegetable-soil samples from production fields to characterize Cd PUFs and demonstrated their utility in assessing Cd exposure risks to consumers of locally grown vegetables. The Cd PUFs varied with plant species and with the pH and organic matter content of soils. Once the PUFs were normalized against these soil parameters, their distributions were log-normal in nature. In this manner, the PUFs were represented by definable probability distributions instead of a deterministic figure. The Cd exposure risks were then assessed using the normalized PUFs based on a Monte Carlo simulation algorithm. Factors affecting the extent of Cd exposures were isolated through sensitivity analyses. The normalized PUFs illustrated the outcomes for uncontaminated and slightly contaminated soils. Among the vegetables, lettuce was potentially hazardous for residents due to its high Cd accumulation but low Zn concentration. To protect 95% of the lettuce production from causing excessive Cd exposure risks, the pH of soils needed to be 5.9 and above. Copyright © 2018 Elsevier Ltd. All rights reserved.
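A minimal sketch of the Monte Carlo exposure logic described in the abstract: sample a log-normal (normalized) PUF, multiply by soil Cd and vegetable intake, and compare the resulting dose with a tolerable intake. All distribution parameters, the body weight and the tolerable daily intake are hypothetical placeholders, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                       # Monte Carlo draws

# Hypothetical inputs -- the paper's fitted distributions are not reproduced here.
soil_cd = rng.lognormal(mean=np.log(0.3), sigma=0.4, size=n)   # mg Cd / kg soil
puf = np.exp(rng.normal(loc=np.log(0.5), scale=0.6, size=n))   # normalized PUF (log-normal)

veg_cd = soil_cd * puf                                          # mg Cd / kg vegetable
intake = rng.normal(loc=0.25, scale=0.05, size=n).clip(0.05)    # kg vegetable / day
body_weight = 60.0                                              # kg, assumed adult

daily_dose = veg_cd * intake / body_weight                      # mg Cd / kg bw / day
tolerable = 1e-3                                                 # hypothetical tolerable daily intake

print(f"P(dose > tolerable) = {np.mean(daily_dose > tolerable):.3f}")
```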
Jiang, Hanchen; Lin, Peng; Fan, Qixiang; Qiang, Maoshan
2014-01-01
The concern for workers' safety in the construction industry is reflected in many studies focusing on static safety risk identification and assessment. However, studies on real-time safety risk assessment aimed at reducing uncertainty and supporting quick response are rare. A method for real-time safety risk assessment (RTSRA) to implement a dynamic evaluation of worker safety states on construction sites is proposed in this paper. The method provides construction managers who are in charge of safety with more abundant information to reduce the uncertainty of the site. A quantitative calculation formula, integrating the influence of static and dynamic hazards and that of safety supervisors, is established to link the safety risk of workers with the locations of on-site assets. By employing the hidden Markov model (HMM), the RTSRA provides a mechanism for processing location data provided by the real-time location system (RTLS) and analyzing the probability distributions of different states in terms of false positives and negatives. Simulation analysis demonstrated the logic of the proposed method and how it works. An application case shows that the proposed RTSRA is both feasible and effective in managing construction project safety concerns. PMID:25114958
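To make the HMM mechanism concrete, here is a minimal forward-filtering sketch for a two-state worker model ("safe"/"at risk") driven by discretized RTLS location observations. The state set, zone coding, transition and emission probabilities are all illustrative assumptions; the paper's actual formulation integrates static/dynamic hazards and supervisor influence.

```python
import numpy as np

# Hypothetical two-state HMM: 0 = safe, 1 = at risk.
# Observations are discretized RTLS zones: 0 = open area, 1 = near a static hazard,
# 2 = near a dynamic hazard (e.g., moving crane). All probabilities are illustrative.
A = np.array([[0.9, 0.1],        # state transition matrix
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],   # emission probabilities P(zone | state)
              [0.1, 0.4, 0.5]])
belief = np.array([0.95, 0.05])  # initial state distribution

def hmm_filter_step(belief, zone):
    """One forward-filtering update: predict with A, then correct with B."""
    predicted = belief @ A
    updated = predicted * B[:, zone]
    return updated / updated.sum()

for zone in [0, 1, 2, 2, 1]:     # a short stream of location observations
    belief = hmm_filter_step(belief, zone)
    print(f"zone={zone}  P(at risk)={belief[1]:.2f}")
```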
A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas
Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan
2016-01-01
Flood risk analysis is more complex in urban areas than in rural areas because of closely packed buildings, varied land uses, and the large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, with the flood risk reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
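A minimal sketch of how an S-shaped return-period/damage (R-D) curve can be used to compare risk with and without flood control measures. The logistic-in-log-return-period form, the parameter values and the risk reduction numbers are assumptions for illustration; the paper's fitted curves are not reproduced here.

```python
import numpy as np

def damage_curve(T, d_max=1000.0, T0=66.0, k=3.0):
    """Hypothetical S-shaped R-D curve: logistic in log(T), saturating at d_max."""
    return d_max / (1.0 + (T0 / T) ** k)

def expected_annual_damage(curve, T_min=1.0, T_max=1000.0, n=10_000):
    """Approximate EAD = integral of D over exceedance probability p = 1/T."""
    p = np.linspace(1.0 / T_max, 1.0 / T_min, n)
    d = curve(1.0 / p)
    return np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))   # trapezoidal rule

# "With measures" shifts the curve toward larger return periods and lower losses (illustrative).
ead_without = expected_annual_damage(lambda T: damage_curve(T))
ead_with = expected_annual_damage(lambda T: damage_curve(T, d_max=800.0, T0=90.0))
print(f"risk reduction: {100 * (1 - ead_with / ead_without):.1f}%")
```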
Mapping intra-urban transmission risk of dengue fever with big hourly cellphone data.
Mao, Liang; Yin, Ling; Song, Xiaoqing; Mei, Shujiang
2016-10-01
Cellphone tracking has been recently integrated into risk assessment of disease transmission, because travel behavior of disease carriers can be depicted in unprecedented details. Still in its infancy, such an integration has been limited to: 1) risk assessment only at national and provincial scales, where intra-urban human movements are neglected, and 2) using irregularly logged cellphone data that miss numerous user movements. Furthermore, few risk assessments have considered positional uncertainty of cellphone data. This study proposed a new framework for mapping intra-urban disease risk with regularly logged cellphone tracking data, taking the dengue fever in Shenzhen city as an example. Hourly tracking records of 5.85 million cellphone users, combined with the random forest classification and mosquito activities, were utilized to estimate the local transmission risk of dengue fever and the importation risk through travels. Stochastic simulations were further employed to quantify the uncertainty of risk. The resultant maps suggest targeted interventions to maximally reduce dengue cases exported to other places, as well as appropriate interventions to contain risk in places that import them. Given the popularity of cellphone use in urbanized areas, this framework can be adopted by other cities to design spatio-temporally resolved programs for disease control. Copyright © 2016 Elsevier B.V. All rights reserved.
Educating Youth about AIDS: A Model Program.
ERIC Educational Resources Information Center
Amer-Hirsch, Wendy
1989-01-01
Describes a New York Girls Club program designed to educate children and young adults about AIDS. Program involves use of prevention posters, puzzles, compositions, simulated game shows, debates, problem-solving and role-playing exercises, risk assessment exercises, and rap groups. (RJC)
Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model
NASA Astrophysics Data System (ADS)
Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten
2016-04-01
Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK and have often been attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climatic conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.
Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Kadambala, Praneeth
2018-01-01
The handling and management of municipal solid waste (MSW) are major challenges for solid waste management in developing countries. Open dumping is still the most common waste disposal method in India. However, landfilling also causes various environmental, social, and human health impacts. The generation of heavily polluted leachate is a major concern to public health. Engineered barrier systems (EBSs) are commonly used to contain potentially harmful wastes by preventing leachate percolation to groundwater and overflow to surface water bodies. EBSs are made of natural materials (e.g., soil, clay) and/or synthetic polymeric materials (e.g., geomembranes, geosynthetic clay liners) arranged in layers. Various studies have estimated the human health risk from leachate-contaminated groundwater. However, no studies have been reported that compare the human health risks from leachate contamination under different liner systems. The present study endeavors to quantify the human health risk from contamination by MSW landfill leachate using multiple simulations for various EBSs. To quantify the variation in health risks from groundwater consumption for the child and adult populations, the Turbhe landfill of Navi Mumbai in India was selected. Leachate and groundwater samples were collected continuously from January to September 2015 at the landfill site, and heavy metal concentrations were analyzed using an inductively coupled plasma system. The LandSim 2.5 Model, a landfill simulator, was used to simulate the landfill activities for various time slices, and non-carcinogenic human health risk was determined for selected heavy metals. Further, the uncertainties associated with multiple input parameters in the health risk model were quantified under a Monte Carlo simulation framework.
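Non-carcinogenic risk for heavy metals in drinking water is commonly expressed as a hazard quotient (HQ = chronic daily intake / reference dose). The sketch below shows a Monte Carlo HQ calculation for child and adult receptors under assumed concentration, intake and reference-dose values; none of these numbers come from the study, and a real workflow would draw concentrations from the landfill simulator output.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Hypothetical leachate-impacted groundwater concentration of one metal (mg/L),
# standing in for a LandSim-style simulated time slice.
conc = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n)

groups = {  # (intake rate L/day, body weight kg) -- illustrative distributions
    "child": (rng.normal(1.0, 0.2, n).clip(0.3), rng.normal(15, 3, n).clip(8)),
    "adult": (rng.normal(2.0, 0.4, n).clip(0.5), rng.normal(60, 10, n).clip(40)),
}
rfd = 0.005  # hypothetical oral reference dose (mg/kg/day)

for group, (ir, bw) in groups.items():
    cdi = conc * ir / bw          # chronic daily intake (mg/kg/day)
    hq = cdi / rfd                # hazard quotient
    print(f"{group}: P(HQ > 1) = {np.mean(hq > 1):.3f}")
```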
Relative risk assessment of cruise ships biosolids disposal alternatives.
Avellaneda, Pedro M; Englehardt, James D; Olascoaga, Josefina; Babcock, Elizabeth A; Brand, Larry; Lirman, Diego; Rogge, Wolfgang F; Solo-Gabriele, Helena; Tchobanoglous, George
2011-10-01
A relative risk assessment of biosolids disposal alternatives for cruise ships is presented in this paper. The area of study encompasses islands and marine waters of the Caribbean Sea. The objective was to evaluate relative human health and ecological risks of (a) dewatering/incineration, (b) landing the solids for disposal, considering that in some countries land-disposed solids might be discharged in the near-shore environment untreated, and (c) deep ocean disposal. Input to the Bayesian assessment consisted of professional judgment based on available literature and modeling information, data on constituent concentrations in cruise ship biosolids, and simulations of constituent concentrations in Caribbean waters assuming ocean disposal. Results indicate that human health and ecological risks associated with land disposal and shallow ocean disposal are higher than those of the deep ocean disposal and incineration. For incineration, predicted ecological impacts were lower relative to deep ocean disposal before considering potential impacts of carbon emissions. Copyright © 2011 Elsevier Ltd. All rights reserved.
Warila, James; Batterman, Stuart; Passino-Reader, Dora R.
2001-01-01
Silver (Ag) is discharged in wastewater effluents and is also a component of a proposed secondary water disinfectant. A steady-state model was developed to simulate bioaccumulation in aquatic biota and assess ecological and human health risks. Trophic levels included phytoplankton, invertebrates, brown trout, and common carp. Uptake routes included water, food, or sediment. Based on an extensive review of the literature, distributions were derived for most inputs for use in Monte Carlo simulations. Three scenarios represented ranges of dilution and turbidity. Compared with the limited field data available, median estimates of Ag in carp (0.07-2.1 μg/g dry weight) were 0.5 to 9 times measured values, and all measurements were within the predicted interquartile range. Median Ag concentrations in biota were ranked invertebrates > phytoplankton > trout > carp. Biotic concentrations were highest for conditions of low dilution and low turbidity. Critical variables included Ag assimilation efficiency, specific feeding rate, and the phytoplankton bioconcentration factor. Bioaccumulation of Ag seems unlikely to result in toxicity to aquatic biota and humans consuming fish. Although the highest predicted Ag concentrations in water (>200 ng/L) may pose chronic risks to early survival and development of salmonids and risks of argyria to subsistence fishers, these results occur under highly conservative conditions.
A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Wu, Wenyan; Westra, Seth; Leonard, Michael
2017-04-01
Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood producing mechanisms, such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015), and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a lot of input information, and difficulties arise when sufficient data are not available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine the long-term trend in the change of storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. a time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High quality hourly storm surge records from 15 tide gauges around Australia were examined. It was found that there is significant location-specific and seasonal variability in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability analysis for flood risk analysis considering multiple flood producing mechanisms. This is the first step in applying a Monte Carlo based joint probability method for flood risk assessment.
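A minimal sketch of two-parameter basis functions of the kind described (one triangular, one exponential), each parameterized by a peak height and a duration/scale, used to generate a simple surge hydrograph. The exact parameterizations used in the study are not given in the abstract, so the functional forms below are assumptions.

```python
import numpy as np

def triangular_surge(t, peak, duration):
    """Two-parameter triangular basis function: linear rise to `peak` at
    duration/2, linear fall afterwards, zero outside [0, duration]."""
    half = duration / 2.0
    up = peak * t / half
    down = peak * (duration - t) / half
    return np.where((t >= 0) & (t <= duration), np.minimum(up, down), 0.0)

def exponential_surge(t, peak, scale):
    """Two-parameter exponential basis function: symmetric exponential decay
    around an assumed peak time of 3 * scale."""
    return peak * np.exp(-np.abs(t - 3.0 * scale) / scale)

t = np.arange(0, 48, 1.0)                                    # hours
tri = triangular_surge(t, peak=0.8, duration=30.0)           # peak (m), duration (h)
exp_ = exponential_surge(t, peak=0.8, scale=5.0)
print(tri.max(), (tri > 0.2).sum())                          # peak height, hours above 0.2 m
print(exp_.max(), (exp_ > 0.2).sum())
```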
The Value of Biomedical Simulation Environments to Future Human Space Flight Missions
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Myers, Jerry G.; Lewandowski, Beth; Platts, Steven H.
2011-01-01
Mars and NEO missions will expose astronauts to extended durations of reduced gravity, isolation, and higher radiation. These new operating conditions pose health risks that are not well understood and perhaps unanticipated. Advanced computational simulation environments can beneficially augment research to predict, assess and mitigate potential hazards to astronaut health. The NASA Digital Astronaut Project (DAP), within the NASA Human Research Program, strives to achieve this goal.
Identifying content for simulation-based curricula in urology: a national needs assessment.
Nayahangan, Leizl Joy; Bølling Hansen, Rikke; Gilboe Lindorff-Larsen, Karen; Paltved, Charlotte; Nielsen, Bjørn Ulrik; Konge, Lars
2017-12-01
Simulation-based training is well recognized in the transforming field of urological surgery; however, integration into the curriculum is often unstructured. Development of simulation-based curricula should follow a stepwise approach starting with a needs assessment. This study aimed to identify technical procedures in urology that should be included in a simulation-based curriculum for residency training. A national needs assessment was performed using the Delphi method involving 56 experts with significant roles in the education of urologists. Round 1 identified technical procedures that newly qualified urologists should perform. Round 2 included a survey using an established needs assessment formula to explore: the frequency of procedures; the number of physicians who should be able to perform the procedure; the risk and/or discomfort to patients when a procedure is performed by an inexperienced physician; and the feasibility of simulation training. Round 3 involved elimination and reranking of procedures according to priority. The response rates for the three Delphi rounds were 70%, 55% and 67%, respectively. The 34 procedures identified in Round 1 were reduced to a final prioritized list of 18 technical procedures for simulation-based training. The five procedures that reached the highest prioritization were cystoscopy, transrectal ultrasound-guided biopsy of the prostate, placement of ureteral stent, insertion of urethral and suprapubic catheter, and transurethral resection of the bladder. The prioritized list of technical procedures in urology that were identified as highly suitable for simulation can be used as an aid in the planning and development of simulation-based training programs.
Quantitative risk assessment of Cryptosporidium in tap water in Ireland.
Cummins, E; Kennedy, R; Cormican, M
2010-01-15
Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, Cryptosporidium has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, and oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used, additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10^-4 per year (the level set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. Copyright 2009 Elsevier B.V. All rights reserved.
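A minimal sketch of the typical quantitative microbial risk assessment chain implied by the abstract: raw-water oocyst concentration, log-removal through treatment, ingested dose, an exponential dose-response model, and annualization of the daily infection probability. All distributions, the log-removal credit and the dose-response parameter are illustrative assumptions rather than the paper's inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical raw-water oocyst concentration (oocysts/L) and treatment performance.
raw = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)
log_removal = rng.normal(loc=3.0, scale=0.5, size=n)            # coag/floc + filtration + disinfection
tap = raw * 10.0 ** (-log_removal)                              # oocysts/L at the tap

ingestion = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)  # unboiled tap water consumed, L/day
dose = tap * ingestion

r = 0.004                                                       # illustrative exponential dose-response parameter
p_daily = 1.0 - np.exp(-r * dose)
p_annual = 1.0 - (1.0 - p_daily) ** 365

print(f"median annual infection risk: {np.median(p_annual):.2e}")
print(f"P(annual risk > 1e-4): {np.mean(p_annual > 1e-4):.3f}")
```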
Nonparametric estimation of benchmark doses in environmental risk assessment
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
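A minimal sketch of a nonparametric benchmark dose calculation in the spirit of the abstract: fit a monotone (isotonic) dose-response curve to quantal data and invert it at a benchmark response defined as extra risk over background. The data, the 10% benchmark response and the use of scikit-learn's IsotonicRegression are assumptions for illustration; the paper's estimator and bootstrap confidence limits are not reproduced.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical quantal dose-response data: dose, number of responders, group size.
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
responders = np.array([2, 3, 6, 10, 18, 27])
n_subjects = np.array([50, 50, 50, 50, 50, 50])
observed = responders / n_subjects

# Monotone nondecreasing fit to the observed response fractions, weighted by group size.
iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
iso.fit(dose, observed, sample_weight=n_subjects)

# Benchmark dose for 10% extra risk over background: R(d) = p0 + BMR * (1 - p0).
p0 = iso.predict([dose.min()])[0]
target = p0 + 0.10 * (1.0 - p0)
grid = np.linspace(dose.min(), dose.max(), 2001)
bmd = grid[np.searchsorted(iso.predict(grid), target)]
print(f"nonparametric BMD estimate: {bmd:.2f}")
```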
Li, Xiaomeng; Yan, Xuedong; Wu, Jiawei; Radwan, Essam; Zhang, Yuting
2016-12-01
Drivers' collision avoidance performance has a direct link to collision risk and crash severity. Previous studies demonstrated that distracted driving, such as using a cell phone while driving, disrupts the driver's performance on the road. This study aimed to investigate the manner and extent to which cell phone use and driver's gender affect driving performance and collision risk in a rear-end collision avoidance process. Forty-two licensed drivers completed the driving simulation experiment in three phone use conditions: no phone use, hands-free, and hand-held, in which the drivers drove in a car-following situation with potential rear-end collision risks caused by the leading vehicle's sudden deceleration. Based on the experiment data, a rear-end collision risk assessment model was developed to assess the influence of cell phone use and driver's gender. Cell phone use and driver's gender were found to be significant factors affecting braking performance in the rear-end collision avoidance process, including the brake reaction time, the deceleration adjusting time and the maximum deceleration rate. The minimum headway distance between the leading vehicle and the simulator during the rear-end collision avoidance process was the final output variable, which could be used to measure the rear-end collision risk and judge whether a collision occurred. The results showed that although drivers using cell phones adopted some compensatory behaviors in the collision avoidance process to reduce their mental workload, the collision risk in cell phone use conditions was still higher than without phone use. More importantly, the results proved that the hands-free condition did not eliminate the safety problem associated with distracted driving because it impaired driving performance as much as the use of hand-held phones. In addition, the gender effect indicated that although female drivers had longer reaction times than male drivers in the critical situation, they braked more quickly with a larger maximum deceleration rate, and they tended to keep a larger safety margin from the leading vehicle compared to male drivers. The findings shed some light on the further development of advanced collision avoidance technologies and targeted intervention strategies regarding cell phone use while driving. Copyright © 2016 Elsevier Ltd. All rights reserved.
Gamado, Kokouvi; Marion, Glenn; Porphyre, Thibaud
2017-01-01
Livestock epidemics have the potential to give rise to significant economic, welfare, and social costs. Incursions of emerging and re-emerging pathogens may lead to small and repeated outbreaks. Analysis of the resulting data is statistically challenging but can inform disease preparedness reducing potential future losses. We present a framework for spatial risk assessment of disease incursions based on data from small localized historic outbreaks. We focus on between-farm spread of livestock pathogens and illustrate our methods by application to data on the small outbreak of Classical Swine Fever (CSF) that occurred in 2000 in East Anglia, UK. We apply models based on continuous time semi-Markov processes, using data-augmentation Markov Chain Monte Carlo techniques within a Bayesian framework to infer disease dynamics and detection from incompletely observed outbreaks. The spatial transmission kernel describing pathogen spread between farms, and the distribution of times between infection and detection, is estimated alongside unobserved exposure times. Our results demonstrate inference is reliable even for relatively small outbreaks when the data-generating model is known. However, associated risk assessments depend strongly on the form of the fitted transmission kernel. Therefore, for real applications, methods are needed to select the most appropriate model in light of the data. We assess standard Deviance Information Criteria (DIC) model selection tools and recently introduced latent residual methods of model assessment, in selecting the functional form of the spatial transmission kernel. These methods are applied to the CSF data, and tested in simulated scenarios which represent field data, but assume the data generation mechanism is known. Analysis of simulated scenarios shows that latent residual methods enable reliable selection of the transmission kernel even for small outbreaks whereas the DIC is less reliable. Moreover, compared with DIC, model choice based on latent residual assessment correlated better with predicted risk. PMID:28293559
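The abstract does not specify the candidate kernel forms, but a common parametric choice for between-farm spread is a distance-decaying (e.g., power-law) kernel. The sketch below evaluates the relative infection pressure exerted by infectious farms on susceptible farms under one such hypothetical kernel; the functional form, parameter values and farm layout are all illustrative assumptions.

```python
import numpy as np

def powerlaw_kernel(d_km, k0=0.002, d0=2.0, alpha=3.0):
    """Hypothetical power-law spatial transmission kernel: hazard contribution
    of an infectious farm at distance d_km (per day). One common parametric
    form among several that such studies compare."""
    return k0 / (1.0 + (d_km / d0) ** alpha)

rng = np.random.default_rng(3)
farms = rng.uniform(0, 50, size=(200, 2))            # synthetic farm coordinates (km)
infectious = np.array([0, 1, 2])                      # indices of currently infectious farms

# Infection pressure on each farm = sum of kernel contributions from infectious farms.
d = np.linalg.norm(farms[:, None, :] - farms[infectious][None, :, :], axis=2)
pressure = powerlaw_kernel(d).sum(axis=1)
pressure[infectious] = 0.0                            # infectious farms are not themselves at risk
print("farms at highest risk:", np.argsort(pressure)[-5:])
```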
Quantifying the risk of extreme aviation accidents
NASA Astrophysics Data System (ADS)
Das, Kumer Pial; Dey, Asim Kumer
2016-12-01
Air travel is considered a safe means of transportation. But when aviation accidents do occur they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes, among which the worst four caused 298, 239, 162 and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model we predict the maximum fatalities from an aviation accident in the future. The fitted model is compared with some of its competitive models. The uncertainty in the inferences is quantified using simulated aviation accident series, generated by bootstrap resampling and Monte Carlo simulations.
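A minimal sketch of the peaks-over-threshold approach behind a generalized Pareto analysis: fit a GPD to fatality counts exceeding a threshold and compute a high return level. The data here are synthetic (drawn from a gamma distribution) and the threshold and return period are arbitrary; only the fitting pattern is intended to reflect the method named in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for historical accident fatality counts (the study uses real records).
fatalities = rng.gamma(shape=2.0, scale=60.0, size=500)

threshold = 150.0
exceedances = fatalities[fatalities > threshold] - threshold

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Probability of exceeding the threshold, and an (illustrative) 100-accident return level.
p_exceed = np.mean(fatalities > threshold)
return_level = threshold + stats.genpareto.ppf(1.0 - 1.0 / (100 * p_exceed),
                                               shape, loc=0.0, scale=scale)
print(f"GPD shape={shape:.2f}, scale={scale:.1f}, 100-accident return level={return_level:.0f}")
```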
Comprehensive risk assessment method of catastrophic accident based on complex network properties
NASA Astrophysics Data System (ADS)
Cui, Zhen; Pang, Jun; Shen, Xiaohong
2017-09-01
On the macro level, the structural properties of the network, and on the micro level, the electrical characteristics of components, determine the risk of cascading failures. Because cascading failures develop dynamically, not only the direct risk but also the potential risk should be considered. In this paper, we comprehensively considered the direct and potential risks of failures based on uncertain risk analysis theory and connection number theory, quantified the uncertain correlation using node degree and node clustering coefficient, and then established a comprehensive risk indicator of failure. The proposed method was validated by simulation on an actual power grid: a network was modeled according to the actual grid, and the results verified the rationality of the proposed method.
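A minimal sketch of combining the two structural properties named in the abstract (node degree and clustering coefficient) into a composite risk score with networkx. The toy topology, the weighting scheme and the exact form of the indicator are assumptions; the paper's connection-number formulation is not reproduced.

```python
import networkx as nx

# Toy topology standing in for an actual power grid.
G = nx.Graph([(1, 2), (1, 3), (2, 3), (3, 4), (4, 5), (5, 6), (4, 6), (6, 7)])

degree = dict(G.degree())
clustering = nx.clustering(G)
max_deg = max(degree.values())

def node_risk(n, w_deg=0.6, w_clu=0.4):
    """Hypothetical composite structural risk indicator: weighted sum of
    normalized degree and (1 - clustering coefficient). Nodes with many
    connections and few redundant triangles score highest."""
    return w_deg * degree[n] / max_deg + w_clu * (1.0 - clustering[n])

risk = {n: node_risk(n) for n in G.nodes}
print(sorted(risk.items(), key=lambda kv: kv[1], reverse=True))
```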
The NASA Space Radiation Research Program
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2006-01-01
We present a comprehensive overview of the NASA Space Radiation Research Program. This program combines basic research on the mechanisms of radiobiological action relevant to improving knowledge of the risks of cancer, central nervous system and other possible degenerative tissue effects, and acute radiation syndromes from space radiation. The keystones of the NASA Program are five NASA Specialized Centers of Research (NSCORs) investigating space radiation risks. Other research is carried out through peer-reviewed individual investigations and in collaboration with the US Department of Energy's Low-Dose Research Program. The Space Radiation Research Program has established the Risk Assessment Project to integrate data from the NSCORs and other peer-reviewed research into quantitative projection models, with the goals of steering research toward data and scientific breakthroughs that will reduce the uncertainties in current risk projections and developing the scientific knowledge needed for future individual risk assessment approaches and biological countermeasure assessments or design. The NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory was created by the Program to simulate space radiation on the ground in support of the above research programs. New results from NSRL will be described.
Dropout during a driving simulator study: A survival analysis.
Matas, Nicole A; Nettelbeck, Ted; Burns, Nicholas R
2015-12-01
Simulator sickness is the occurrence of motion-sickness like symptoms that can occur during use of simulators and virtual reality technologies. This study investigated individual factors that contributed to simulator sickness and dropout while using a desktop driving simulator. Eighty-eight older adult drivers (mean age 72.82 ± 5.42 years) attempted a practice drive and two test drives. Participants also completed a battery of cognitive and visual assessments, provided information on their health and driving habits, and reported their experience of simulator sickness symptoms throughout the study. Fifty-two participants dropped out before completing the driving tasks. A time-dependent Cox Proportional Hazards model showed that female gender (HR=2.02), prior motion sickness history (HR=2.22), and Mini-SSQ score (HR=1.55) were associated with dropout. There were no differences between dropouts and completers on any of the cognitive abilities tests. Older adults are a high-risk group for simulator sickness. Within this group, female gender and prior motion sickness history are related to simulator dropout. Higher reported experience of symptoms of simulator sickness increased rates of dropout. The results highlight the importance of screening and monitoring of participants in driving simulation studies. Older adults, females, and those with a prior history of motion sickness may be especially at risk. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
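A minimal sketch of a Cox proportional hazards dropout analysis with the lifelines library. The data below are synthetic stand-ins with the same covariates named in the abstract, and the sketch fits a simpler time-fixed Cox model rather than the study's time-dependent specification; the simulated effect sizes and censoring time are assumptions.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 88
# Synthetic participant data (not the study's data).
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "motion_sickness_history": rng.integers(0, 2, n),
    "ssq_score": rng.normal(2.0, 1.0, n).clip(0),
})
linpred = 0.7 * df["female"] + 0.8 * df["motion_sickness_history"] + 0.4 * df["ssq_score"]
df["minutes_to_dropout"] = rng.exponential(scale=40.0 / np.exp(linpred))
df["dropped_out"] = (df["minutes_to_dropout"] < 35).astype(int)   # censored at end of the drives
df["minutes_to_dropout"] = df["minutes_to_dropout"].clip(upper=35)

cph = CoxPHFitter()
cph.fit(df, duration_col="minutes_to_dropout", event_col="dropped_out")
print(cph.summary[["exp(coef)", "p"]])   # hazard ratios analogous to the reported HRs
```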
Myer, Gregory D.; Brent, Jensen L.; Ford, Kevin R.; Hewett, Timothy E.
2011-01-01
Lead Summary: Some athletes may be more susceptible to at-risk knee positions during sports activities, but the underlying causes are not clearly defined. This manuscript synthesizes in vivo, in vitro and in silico (computer-simulated) data to delineate likely risk factors for the mechanism(s) of non-contact ACL injuries. From these identified risk factors, we discuss newly developed real-time screening techniques that can be used in training sessions to identify modifiable risk factors. The techniques provided target and correct altered mechanics, which may reduce or eliminate risk factors and aid in the prevention of non-contact ACL injuries in high-risk athletes. PMID:21643474
App Usage Factor: A Simple Metric to Compare the Population Impact of Mobile Medical Apps.
Lewis, Thomas Lorchan; Wyatt, Jeremy C
2015-08-19
One factor when assessing the quality of mobile apps is quantifying the impact of a given app on a population. There is currently no metric which can be used to compare the population impact of a mobile app across different health care disciplines. The objective of this study is to create a novel metric to characterize the impact of a mobile app on a population. We developed the simple novel metric, app usage factor (AUF), defined as the logarithm of the product of the number of active users of a mobile app with the median number of daily uses of the app. The behavior of this metric was modeled using simulation modeling in Python, a general-purpose programming language. Three simulations were conducted to explore the temporal and numerical stability of our metric and a simulated app ecosystem model using a simulated dataset of 20,000 apps. Simulations confirmed the metric was stable between predicted usage limits and remained stable at extremes of these limits. Analysis of a simulated dataset of 20,000 apps calculated an average value for the app usage factor of 4.90 (SD 0.78). A temporal simulation showed that the metric remained stable over time and suitable limits for its use were identified. A key component when assessing app risk and potential harm is understanding the potential population impact of each mobile app. Our metric has many potential uses for a wide range of stakeholders in the app ecosystem, including users, regulators, developers, and health care professionals. Furthermore, this metric forms part of the overall estimate of risk and potential for harm or benefit posed by a mobile medical app. We identify the merits and limitations of this metric, as well as potential avenues for future validation and research.
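The metric as defined in the abstract is straightforward to compute; the short sketch below implements it directly. The base-10 logarithm is an assumption, as the abstract does not state the base.

```python
import math
from statistics import median

def app_usage_factor(active_users, daily_uses_per_user):
    """App Usage Factor (AUF): logarithm of (number of active users x median
    number of daily uses per user), as defined in the abstract. Base-10 log
    is assumed here."""
    return math.log10(active_users * median(daily_uses_per_user))

# Example: an app with 50,000 active users opened a few times a day by each user.
print(round(app_usage_factor(50_000, [1, 2, 2, 3, 5]), 2))   # 5.0, near the simulated mean of 4.90
```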
NASA Astrophysics Data System (ADS)
Guo, B.
2017-12-01
Mountain watersheds in Western China are prone to flash floods. The Wenchuan earthquake on May 12, 2008 led to destruction of the land surface and frequent landslides and debris flows, which further exacerbated the flash flood hazards. Two giant torrent and debris flows occurred due to heavy rainfall after the earthquake, one on August 13, 2010, and the other on August 18, 2010. Flash flood reduction and risk assessment are the key issues in post-disaster reconstruction. Hydrological prediction models are important and cost-efficient mitigation tools that are widely applied. In this paper, hydrological observations and simulation using remote sensing data and the WMS model are carried out in a typical flood-hit area, the Longxihe watershed, Dujiangyan City, Sichuan Province, China. The hydrological response of rainfall runoff is discussed. The results show that the WMS HEC-1 model can satisfactorily simulate the runoff process of small watersheds in mountainous areas. This methodology can be used in other earthquake-affected areas for risk assessment and to predict the magnitude of flash floods. Key Words: Rainfall-runoff modeling. Remote Sensing. Earthquake. WMS.
Uncertainty and risk in wildland fire management: a review.
Thompson, Matthew P; Calkin, Dave E
2011-08-01
Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.
THREE-DIMENSIONAL COMPUTER MODELING OF THE HUMAN UPPER RESPIRATORY TRACT
ABSTRACT
Computer simulations of airflow and particle transport phenomena within the human respiratory system have important applications to aerosol therapy (e.g., the targeted delivery of inhaled drugs) and inhalation toxicology (e.g., the risk assessment of air pollutants). ...
Sensitivity analysis for simulating pesticide impacts on honey bee colonies
Background/Question/Methods Regulatory agencies assess risks to honey bees from pesticides through a tiered process that includes predictive modeling with empirical toxicity and chemical data of pesticides as a line of evidence. We evaluate the Varroapop colony model, proposed by...
76 FR 5691 - Cyprodinil; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-02
....'' This includes exposure through drinking water and in residential settings, but does not include... exposure from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for cyprodinil in drinking water. These simulation models take into account...
75 FR 17579 - Aminopyralid; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... exposure through drinking water and in residential settings, but does not include occupational exposure... from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for aminopyralid in drinking water. These simulation models take into account...
Multiscale modeling and simulation of embryogenesis for in silico predictive toxicology (WC9)
Translating big data from alternative and HTS platforms into hazard identification and risk assessment is an important need for predictive toxicology and for elucidating adverse outcome pathways (AOPs) in developmental toxicity. Understanding how chemical disruption of molecular ...
Dixit, Prakash N; Telleria, Roberto
2015-04-01
Inter-annual and seasonal variability in climatic parameters, most importantly rainfall, has the potential to cause climate-induced risk in long-term crop production. Short-term field studies do not capture the full nature of such risk or the extent to which modifications to crop, soil and water management recommendations may be made to mitigate it. Crop modeling studies driven by long-term daily weather data can predict the impact of climate-induced risk on crop growth and yield; however, the availability of long-term daily weather data can present serious constraints to the use of crop models. To tackle this constraint, two weather generators, LARS-WG and MarkSim, were evaluated in order to assess their capabilities of reproducing frequency distributions, means, variances, dry spells and wet chains of observed daily precipitation, maximum and minimum temperature, and solar radiation for eight locations across cropping areas of Northern Syria and Lebanon. Further, the application of generated long-term daily weather data, with both weather generators, in simulating barley growth and yield was also evaluated. We found that overall LARS-WG performed better than MarkSim in generating daily weather parameters and in a 50-year continuous simulation of barley growth and yield. Our findings suggest that LARS-WG does not necessarily require long-term (e.g., >30 years) observed weather data for calibration, as generated results proved to be satisfactory with >10 years of observed data except in areas at higher altitude. Evaluating these weather generators and the ability of generated weather data to support long-term simulation of crop growth and yield is an important first step to assess the impact of future climate on yields, and to identify promising technologies to make agricultural systems more resilient in the given region. Copyright © 2015 Elsevier B.V. All rights reserved.
Sayers, Adrian; Crowther, Michael J; Judge, Andrew; Whitehouse, Michael R; Blom, Ashley W
2017-08-28
The use of benchmarks to assess the performance of implants such as those used in arthroplasty surgery is a widespread practice. It provides surgeons, patients and regulatory authorities with the reassurance that implants used are safe and effective. However, it is not currently clear how, or with how many implants, an implant should be statistically compared with a benchmark to assess whether it is superior, equivalent, non-inferior or inferior to the performance benchmark of interest. We aim to describe the methods and sample size required to conduct a one-sample non-inferiority study of a medical device for the purposes of benchmarking. This was a simulation study of a national register of medical devices. We simulated data, with and without a non-informative competing risk, to represent an arthroplasty population and describe three methods of analysis (z-test, 1-Kaplan-Meier and competing risks) commonly used in surgical research. We evaluate the performance of each method using power, bias, root-mean-square error, coverage and CI width. 1-Kaplan-Meier provides an unbiased estimate of implant net failure, which can be used to assess if a surgical device is non-inferior to an external benchmark. Small non-inferiority margins require significantly more individuals to be at risk compared with current benchmarking standards. A non-inferiority testing paradigm provides a useful framework for determining if an implant meets the required performance defined by an external benchmark. Current contemporary benchmarking standards have limited power to detect non-inferiority, and substantially larger sample sizes, in excess of 3200 procedures, are required to achieve a power greater than 60%. It is clear that when benchmarking implant performance, net failure estimated using 1-KM is preferable to crude failure estimated by competing risk models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Assessing Landscape Scale Wildfire Exposure for Highly Valued Resources in a Mediterranean Area
NASA Astrophysics Data System (ADS)
Alcasena, Fermín J.; Salis, Michele; Ager, Alan A.; Arca, Bachisio; Molina, Domingo; Spano, Donatella
2015-05-01
We used a fire simulation modeling approach to assess landscape scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km2 located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. Then, we linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVRs. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus subject to higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning and people's safety in the vicinity of HVRs.
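A minimal sketch of how per-fire simulation outputs are typically summarized into burn probability and conditional fire behavior around a highly valued resource buffer: burn probability is the fraction of simulated fires that burn each cell, and conditional flame length is the mean flame length over the fires in which the cell burned. The grid, the per-fire outputs and the buffer below are synthetic placeholders; a real workflow would read rasters produced by a fire-spread model such as FlamMap.

```python
import numpy as np

rng = np.random.default_rng(9)
n_fires, ny, nx = 1000, 50, 50      # stand-in for the 90,000 simulated fires

# Synthetic per-fire burned masks and flame lengths (m).
burned = rng.random((n_fires, ny, nx)) < 0.02
flame_length = np.where(burned, rng.gamma(2.0, 1.5, (n_fires, ny, nx)), np.nan)

burn_probability = burned.mean(axis=0)              # fraction of fires burning each cell
conditional_fl = np.nanmean(flame_length, axis=0)   # mean flame length given the cell burned

# Exposure summary within a buffer around a highly valued resource (HVR).
hvr_buffer = np.zeros((ny, nx), dtype=bool)
hvr_buffer[20:30, 20:30] = True
print(f"mean burn probability in HVR buffer: {burn_probability[hvr_buffer].mean():.4f}")
print(f"mean conditional flame length: {np.nanmean(conditional_fl[hvr_buffer]):.2f} m")
```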
NASA Astrophysics Data System (ADS)
Murphy, K. W.; Ellis, A. W.; Skindlov, J. A.
2015-12-01
Water resource systems have provided vital support to transformative growth in the Southwest United States and the Phoenix, Arizona metropolitan area where the Salt River Project (SRP) currently satisfies 40% of the area's water demand from reservoir storage and groundwater. Large natural variability and expectations of climate changes have sensitized water management to risks posed by future periods of excess and drought. The conventional approach to impacts assessment has been downscaled climate model simulations translated through hydrologic models; but, scenario ranges enlarge as uncertainties propagate through sequential levels of modeling complexity. The research often does not reach the stage of specific impact assessments, rendering future projections frustratingly uncertain and unsuitable for complex decision-making. Alternatively, this study inverts the common approach by beginning with the threatened water system and proceeding backwards to the uncertain climate future. The methodology is built upon reservoir system response modeling to exhaustive time series of climate-driven net basin supply. A reservoir operations model, developed with SRP guidance, assesses cumulative response to inflow variability and change. Complete statistical analyses of long-term historical watershed climate and runoff data are employed for 10,000-year stochastic simulations, rendering the entire range of multi-year extremes with full probabilistic characterization. Sets of climate change projections are then translated by temperature sensitivity and precipitation elasticity into future inflow distributions that are comparatively assessed with the reservoir operations model. This approach provides specific risk assessments in pragmatic terms familiar to decision makers, interpretable within the context of long-range planning and revealing a clearer meaning of climate change projections for the region. As a transferable example achieving actionable findings, the approach can guide other communities confronting water resource planning challenges.
Modeling of Flood Risk for the Continental United States
NASA Astrophysics Data System (ADS)
Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.
2011-12-01
The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, flood, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially will pay attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. US Midwest flood 2008, US Northeast flood, 2010) have increased the concern of flood risk. Consequently, there are growing needs to adequately assess the flood risk. The RMS flood hazard model is mainly comprised of three major components. (1) Stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) Rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions, determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow and hence flooding from different rivers, as well as low and high return-periods across the continental US. (3) Flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce comprehensive flood hazard map. The performance of the model is demonstrated by comparing to the observation and published data. Output from the flood hazard model is used to drive a flood loss model that is coupled to a financial model.
Flood risk and cultural heritage: the case study of Florence (Italy)
NASA Astrophysics Data System (ADS)
Arrighi, Chiara; Castelli, Fabio; Brugioni, Marcello; Franceschini, Serena; Mazzanti, Bernardo
2016-04-01
Cultural heritage plays a key role for communities in terms of both identity and economic value. It is often under serious threat from natural hazards; nevertheless, quantitative assessments of risk are quite uncommon. This work addresses flood risk assessment for cultural heritage in an exemplary art city, Florence, Italy. The risk assessment method adopted here borrows the most common definition of flood risk as the product of hazard, vulnerability and exposure, with some necessary adjustments. The risk estimation is carried out at the building scale for the whole UNESCO site, which coincides with the historical centre of the city. A distinction between macro- and micro-damage categories has been made according to the vulnerability of the objects at risk. Two damage macro-categories are selected, namely cultural buildings and contents. Cultural buildings are classified into damage micro-categories as churches/religious complexes, libraries/archives and museums. The damages to the contents are estimated for four micro-categories: paintings, sculptures, books/prints and goldsmith's art. Data from hydraulic simulations for different recurrence scenarios, historical reports of the devastating 1966 flood and the cultural heritage recognition sheets allow estimating and mapping the annual expected number of works of art lost in the absence of risk mitigation strategies.
Simulating cholinesterase inhibition in birds caused by dietary insecticide exposure
Corson, M.S.; Mora, M.A.; Grant, W.E.
1998-01-01
We describe a stochastic simulation model that simulates avian foraging in an agricultural landscape to evaluate factors affecting dietary insecticide exposure and to predict post-exposure cholinesterase (ChE) inhibition. To evaluate the model, we simulated published field studies and found that model predictions of insecticide decay and ChE inhibition reasonably approximated most observed results. Sensitivity analysis suggested that foraging location usually influenced ChE inhibition more than diet preferences or daily intake rate. Although organophosphorus insecticides usually caused greater inhibition than carbamate insecticides, insecticide toxicity appeared only moderately important. When we simulated impact of heavy insecticide applications during breeding seasons of 15 wild bird species, mean maximum ChE inhibition in most species exceeded 20% at some point. At this level of inhibition, birds may experience nausea and/or may exhibit minor behavioral changes. Simulated risk peaked in April–May and August–September and was lowest in July. ChE inhibition increased with proportion of vegetation in the diet. This model, and ones like it, may help predict insecticide exposure of and sublethal ChE inhibition in grassland animals, thereby reducing dependence of ecological risk assessments on field studies alone.
LeBlanc, Vicki R; Regehr, Cheryl; Shlonsky, Aron; Bogo, Marion
2012-05-01
The assessment of children at risk of abuse and neglect is a critical societal function performed by child protection workers in situations of acute stress and conflict. Despite efforts to improve the reliability of risk assessments through standardized measures, available tools continue to rely on subjective judgment. The goal of this study was to assess the stress responses of child protection workers and their assessments of risk in high conflict situations. Ninety-six child protection workers participated in 2 simulated scenarios, 1 non-confrontational and 1 confrontational. In each scenario, participants conducted a 15-minute interview with a mother played by a specially trained actor. Following the interview, the workers completed 2 risk assessment measures used in the field at the time of the study. Anxiety was measured by the State-Trait Anxiety Inventory at baseline and immediately following the completion of each interview. Physiological stress as measured by salivary cortisol was obtained at baseline as well as 20 and 30 minutes after the start of each interview. Participants demonstrated significant stress responses during the 1st scenario, regardless of whether the interview was confrontational or not. During the second scenario, the participants did not exhibit significant cortisol responses; however, the confrontational interview elicited greater subjective anxiety than the non-confrontational scenario. In the first scenario, in which the workers demonstrated greater stress responses, risk assessment scores were higher on one risk assessment tool for the confrontational scenario than for the non-confrontational scenario. The results suggest that stress responses in child protection workers appear to be influenced by the novelty of a situation and by a parent's demeanor during interviews. Some forms of risk assessment tools appear to be more strongly associated than others with the workers' subjective and physiological stress responses. This merits further research to determine which aspects of risk assessment tools are susceptible to the emotional elements of intake interviews. Copyright © 2012. Published by Elsevier Ltd.
Nayahangan, L J; Konge, L; Schroeder, T V; Paltved, C; Lindorff-Larsen, K G; Nielsen, B U; Eiberg, J P
2017-04-01
Practical skills training in vascular surgery is facing challenges because of an increased number of endovascular procedures and fewer open procedures, as well as a move away from the traditional principle of "learning by doing." This change has established simulation as a cornerstone in providing trainees with the necessary skills and competences. However, the development of simulation based programs often evolves based on available resources and equipment, reflecting convenience rather than a systematic educational plan. The objective of the present study was to perform a national needs assessment to identify the technical procedures that should be integrated in a simulation based curriculum. A national needs assessment using a Delphi process was initiated by engaging 33 predefined key persons in vascular surgery. Round 1 was a brainstorming phase to identify technical procedures that vascular surgeons should learn. Round 2 was a survey that used a needs assessment formula to explore the frequency of procedures, the number of surgeons performing each procedure, risk and/or discomfort, and feasibility for simulation based training. Round 3 involved elimination and ranking of procedures. The response rate for round 1 was 70%, with 36 procedures identified. Round 2 had a 76% response rate and resulted in a preliminary prioritised list after exploring the need for simulation based training. Round 3 had an 85% response rate; 17 procedures were eliminated, resulting in a final prioritised list of 19 technical procedures. A national needs assessment using a standardised Delphi method identified a list of procedures that are highly suitable and may provide the basis for future simulation based training programs for vascular surgeons in training. Copyright © 2017 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Simulated Driving Assessment (SDA) for Teen Drivers: Results from a Validation Study
McDonald, Catherine C.; Kandadai, Venk; Loeb, Helen; Seacrist, Thomas S.; Lee, Yi-Ching; Winston, Zachary; Winston, Flaura K.
2015-01-01
Background Driver error and inadequate skill are common critical reasons for novice teen driver crashes, yet few validated, standardized assessments of teen driving skills exist. The purpose of this study was to evaluate the construct and criterion validity of a newly developed Simulated Driving Assessment (SDA) for novice teen drivers. Methods The SDA's 35-minute simulated drive incorporates 22 variations of the most common teen driver crash configurations. Driving performance was compared for 21 inexperienced teens (age 16–17 years, provisional license ≤90 days) and 17 experienced adults (age 25–50 years, license ≥5 years, drove ≥100 miles per week, no collisions or moving violations ≤3 years). SDA driving performance (Error Score) was based on driving safety measures derived from simulator and eye-tracking data. Negative driving outcomes included simulated collisions or run-off-the-road incidents. A professional driving evaluator/instructor reviewed videos of SDA performance (DEI Score). Results The SDA demonstrated construct validity: 1.) Teens had a higher Error Score than adults (30 vs. 13, p=0.02); 2.) For each additional error committed, the relative risk of a participant's propensity for a simulated negative driving outcome increased by 8% (95% CI: 1.05–1.10, p<0.01). The SDA demonstrated criterion validity: Error Score was correlated with DEI Score (r=−0.66, p<0.001). Conclusions This study supports the concept of validated simulated driving tests like the SDA to assess novice driver skill in complex and hazardous driving scenarios. The SDA, as a standard protocol to evaluate teen driver performance, has the potential to facilitate screening and assessment of teen driving readiness and could be used to guide targeted skill training. PMID:25740939
Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.
2017-01-01
Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
Human health risk assessment for nanoparticle-contaminated aquifer systems.
Tosco, Tiziana; Sethi, Rajandrea
2018-08-01
Nanosized particles (NPs), such as TiO2, silver, graphene NPs, nanoscale zero-valent iron, and carbon nanotubes, are increasingly used in industrial processes, and releases at production plants and from landfills are likely scenarios in the coming years. As a consequence, appropriate procedures and tools to quantify the risks to human health associated with these releases are needed. The tiered approach of the standard ASTM procedure (ASTM-E2081-00) is today the most widely applied for human health risk assessment at sites contaminated by chemical substances, but it cannot be directly applied to nanoparticles: NP transport along migration pathways follows mechanisms significantly different from those of chemicals; moreover, the toxicity indicators (namely, reference dose and slope factor) are NP-specific. In this work a risk assessment approach modified for NPs is proposed, with a specific application at Tier 2 to migration in groundwater. The standard ASTM equations are modified to include NP-specific transport mechanisms. NPs in natural environments are typically characterized by a heterogeneous set of particles having different size, shape, coating, etc. (all properties having a significant impact on both mobility and toxicity). To take this heterogeneity into account, the proposed approach divides the NP population into classes, each with specific transport and toxicity properties, and simulates them as independent species. The approach is finally applied to a test case simulating the release of heterogeneous silver NPs from a landfill. The results show that accounting for the size-dependent mobility of the particles provides a more accurate result than the direct application of the standard ASTM procedure; in particular, the latter tends to underestimate the overall toxic risk associated with the NP release. Copyright © 2018 Elsevier Ltd. All rights reserved.
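As an illustration of the class-based idea described above, the following sketch splits a heterogeneous NP population into size classes, applies a class-specific attenuation (a purely illustrative first-order decay standing in for the NP-specific transport model), and sums class-specific hazard quotients; all concentrations, attenuation coefficients, and reference doses are hypothetical placeholders rather than values from the paper.

```python
# Illustrative sketch of the class-based approach: a heterogeneous NP population
# is split into size classes, each class is transported with its own (placeholder)
# attenuation coefficient, and a hazard quotient is computed per class and summed.
# All numbers are hypothetical placeholders, not values from the paper.
import numpy as np

# size class: (mass fraction at source, attenuation coefficient 1/m, reference dose mg/kg/day)
classes = {
    "10-30 nm":  (0.3, 0.05, 0.002),
    "30-60 nm":  (0.5, 0.10, 0.004),
    "60-100 nm": (0.2, 0.20, 0.008),
}

source_conc = 1.0      # mg/L at the landfill (hypothetical)
distance = 50.0        # m to the point of exposure
intake_rate = 2.0      # L/day of drinking water
body_weight = 70.0     # kg

total_hq = 0.0
for name, (frac, lam, rfd) in classes.items():
    conc_at_poe = source_conc * frac * np.exp(-lam * distance)  # class-specific mobility
    daily_dose = conc_at_poe * intake_rate / body_weight        # mg/kg/day
    hq = daily_dose / rfd                                       # class-specific hazard quotient
    total_hq += hq
    print(f"{name}: concentration {conc_at_poe:.4f} mg/L, HQ {hq:.3f}")

print(f"Total hazard quotient (sum over classes): {total_hq:.3f}")
```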
Assessment of mean annual flood damage using simple hydraulic modeling and Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Oubennaceur, K.; Agili, H.; Chokmani, K.; Poulin, J.; Marceau, P.
2016-12-01
Floods are the most frequent and the most damaging natural disaster in Canada. Assessing and managing the risk related to this disaster has become increasingly crucial for both local and national authorities. Brigham, a municipality located in southern Quebec Province, is one of the regions most heavily affected, because of overflows of the Yamaska River that occur two to three times per year. Since Hurricane Irene struck the region in 2011, causing considerable socio-economic damage, the implementation of mitigation measures has become a major priority for this municipality. To do this, a preliminary study evaluating the risk to which this region is exposed is essential. Conventionally, approaches based only on the characterization of the hazard (e.g. floodplain extent, flood depth) are adopted to study the risk of flooding. To improve knowledge of this risk, a Monte Carlo simulation approach combining information on the hazard with vulnerability-related aspects has been developed. This approach integrates three main components: (1) hydrologic modelling, which establishes a probability-discharge function associating each measured discharge with its probability of occurrence; (2) hydraulic modelling, which establishes the relationship between the discharge and the water stage at each building; and (3) a damage study, which assesses building damage using damage functions. The damage is estimated according to the water depth, defined as the difference between the water level and the elevation of the building's first floor. The application of the proposed approach allows the annual average cost of flood damage to buildings to be estimated. The results will be useful for authorities to support their decisions on risk management and prevention against this disaster.
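A minimal Monte Carlo sketch of the three-component chain described above (probability-discharge function, stage-discharge relation, depth-damage function) is given below; the distributions, rating-curve coefficients, and damage curve are hypothetical placeholders, not the Brigham study's calibrated inputs.

```python
# Minimal Monte Carlo sketch: sample a flood discharge from a fitted distribution,
# convert it to a water stage at a building, and evaluate a depth-damage function.
# All distributions and coefficients below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000

# (1) hydrologic component: annual maximum discharge, lognormal placeholder
discharge = rng.lognormal(mean=4.0, sigma=0.5, size=n_years)          # m^3/s

# (2) hydraulic component: stage-discharge rating curve h = a*Q^b (placeholder)
stage = 0.05 * discharge ** 0.8                                        # m above datum

# (3) damage component: depth above the first floor mapped to damage
first_floor_elev = 2.0                                                 # m above datum
depth = np.clip(stage - first_floor_elev, 0.0, None)
max_damage = 150_000                                                   # $ per building
damage = max_damage * np.minimum(depth / 3.0, 1.0)                     # linear up to 3 m

print(f"Expected annual damage per building: ${damage.mean():,.0f}")
print(f"Probability of any damage in a year: {np.mean(damage > 0):.3f}")
```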
NASA Technical Reports Server (NTRS)
Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.
2007-01-01
This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described include the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and to appropriately prioritize mitigation of potential debris sources, continuing to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.
Ergonomics and simulation-based approach in improving facility layout
NASA Astrophysics Data System (ADS)
Abad, Jocelyn D.
2018-02-01
Simulation-based techniques have become a common choice for facility layout in industry because of their convenience and efficient generation of results. Nevertheless, the solutions generated do not address delays due to workers' health and safety, which significantly impact overall operational efficiency. It is therefore critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into Promodel simulation to improve the facility layout of a garment manufacturing plant. To test the effectiveness of the method, the existing and improved facility designs were compared using comprehensive risk level, efficiency, and productivity. The improved facility layout reduced the comprehensive risk level and the rapid upper limb assessment score, and delivered a 78% increase in efficiency and a 194% increase in productivity relative to the existing design, demonstrating that the approach is effective in attaining overall facility design improvement.
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Fasanella, Edwin L.; Tabiei, Ala; Brinkley, James W.; Shemwell, David M.
2008-01-01
A review of astronaut whole-body impact tolerance is presented for land or water landings of the next-generation manned space capsule, Orion. LS-DYNA simulations of Orion capsule landings are performed to produce landing cases with low, moderate, and high probabilities of injury. The paper evaluates finite element (FE) seat and occupant simulations for assessing injury risk for the Orion crew and compares these simulations to the whole-body injury models commonly referred to as the Brinkley criteria. The FE seat and crash dummy models allow the occupant restraint systems, cushion materials, side constraints, limb flailing, and detailed seat/occupant interactions to be varied in order to minimize landing injuries to the crew. The FE crash test dummies, used in conjunction with the Brinkley criteria, provide a useful set of tools for predicting potential crew injuries during vehicle landings.
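The Brinkley criteria treat the occupant as a single-degree-of-freedom spring-damper driven by the seat acceleration pulse, with the resulting dynamic response index compared against axis-specific limits. The sketch below shows that style of calculation for an idealized half-sine landing pulse; the natural frequency, damping ratio, and pulse are illustrative values, not the published axis-specific constants.

```python
# Sketch of a Brinkley-style dynamic response (DR) calculation: integrate a
# single-degree-of-freedom spring-damper driven by the seat acceleration pulse
# and report DR = omega_n^2 * x_max / g. The half-sine pulse and the natural
# frequency / damping values are illustrative, not the published constants.
import numpy as np

g = 9.81
omega_n = 2 * np.pi * 8.0   # natural frequency, rad/s (illustrative)
zeta = 0.3                  # damping ratio (illustrative)

# Idealized landing pulse: 10 g half-sine lasting 50 ms
dt = 1e-4
t = np.arange(0.0, 0.3, dt)
pulse_dur = 0.05
accel = np.where(t < pulse_dur, 10 * g * np.sin(np.pi * t / pulse_dur), 0.0)

# Semi-implicit Euler integration of x'' + 2*zeta*omega_n*x' + omega_n^2*x = a(t)
x, v = 0.0, 0.0
x_max = 0.0
for a in accel:
    v += dt * (a - 2 * zeta * omega_n * v - omega_n**2 * x)
    x += dt * v
    x_max = max(x_max, abs(x))

dr = omega_n**2 * x_max / g
print(f"Peak dynamic response DR = {dr:.1f} (compared against axis-specific injury-risk limits)")
```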
A numerical 4D Collision Risk Model
NASA Astrophysics Data System (ADS)
Schmitt, Pal; Culloch, Ross; Lieber, Lilian; Kregting, Louise
2017-04-01
With the growing number of marine renewable energy (MRE) devices being installed across the world, concern has been raised about the possibility of harming mobile marine fauna by collision. Although physical contact between an MRE device and an organism has not been reported to date, these novel sub-sea structures pose a challenge for accurately estimating collision risks as part of environmental impact assessments. Even if the animal motion is simplified to linear translation, ignoring likely evasive behaviour, the mathematical problem of establishing an impact probability is not trivial. We present a numerical algorithm to obtain such probability distributions using transient, four-dimensional simulations of a novel marine renewable device concept, Deep Green, Minesto's power plant, hereafter referred to as the 'kite', which flies in a figure-of-eight configuration. Simulations were carried out for several configurations, varying kite depth, kite speed, and kite trajectory while keeping the speed of the moving object constant. Since the kite assembly is defined as two parts in the model, a tether (attached to the seabed) and the kite, the collision risk of each part is reported independently. By comparing the number of collisions with the number of collision-free simulations, a probability of impact is computed for each simulated position in the cross-section of the area. Results suggest that close to the bottom, where the tether amplitude is small, the path is always blocked and the impact probability is 100%, as expected. Higher up in the water column, however, the collision probability is twice as high at the midline, where the tether passes twice per period, as at the extremes of its trajectory. The collision probability distribution is much more complex in the upper part of the water column, where the kite and tether can simultaneously collide with the object. The results demonstrate the viability of such models, which can also incorporate empirical field data for assessing the collision risk of animals with an MRE device under varying operating conditions.
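A much simplified two-dimensional sketch of the probability-mapping step is shown below: for each position in the cross-sectional plane, random phases of a figure-of-eight trajectory are sampled and the fraction of phases that put the kite within a collision radius of that position is recorded. Geometry, speeds, and the collision radius are illustrative assumptions, not Deep Green parameters.

```python
# Simplified 2D sketch of the collision-probability mapping: for each grid position
# in the cross-sectional plane, sample random phases of the kite's figure-of-eight
# trajectory and count how often the kite (a point with an effective collision
# radius) coincides with that position. All values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
A, B = 60.0, 20.0          # horizontal / vertical amplitude of the figure-of-eight (m)
z_hub = 30.0               # mean kite depth (m)
r_collision = 5.0          # effective collision radius (m)
n_phase = 2000             # random phase samples per grid point

def kite_position(phase):
    """Figure-of-eight (Lissajous) path in the cross-sectional y-z plane."""
    y = A * np.sin(2 * phase)
    z = z_hub + B * np.sin(phase)
    return y, z

# Grid of object positions in the cross-section
ys = np.linspace(-80, 80, 33)
zs = np.linspace(0, 60, 25)
prob = np.zeros((zs.size, ys.size))

for i, z in enumerate(zs):
    for j, y in enumerate(ys):
        phases = rng.uniform(0, 2 * np.pi, n_phase)
        ky, kz = kite_position(phases)
        hits = np.hypot(ky - y, kz - z) < r_collision
        prob[i, j] = hits.mean()      # fraction of phases resulting in a collision

print(f"Maximum collision probability on the grid: {prob.max():.3f}")
```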
Enhancing Human Responses to Climate Change Risks through Simulated Flooding Experiences
NASA Astrophysics Data System (ADS)
Zaalberg, Ruud; Midden, Cees
Delta areas are threatened by global climate change. The general aims of our research were (1) to increase our understanding of climate and flood risk perceptions and the factors that influence these judgments, and (2) to seek interventions that can contribute to a realistic assessment by laypersons of long-term flooding risks. We argue that awareness of one's own vulnerability to future flooding and insight into the effectiveness of coping strategies are driven by direct flooding experiences. In the current research, multimodal sensory stimulation by means of interactive 3D technology is used to simulate direct flooding experiences at the experiential or sensory level, thereby going beyond traditional persuasion attempts that use fear-evoking images. Our results suggest that future communication efforts should not only use these new technologies to transfer knowledge about effective coping strategies and flooding risks, but should especially be directed towards residents who live in flood-prone areas but lack direct flooding experiences to guide them.
Impact of task design on task performance and injury risk: case study of a simulated drilling task.
Alabdulkarim, Saad; Nussbaum, Maury A; Rashedi, Ehsan; Kim, Sunwook; Agnew, Michael; Gardner, Richard
2017-06-01
Existing evidence is limited regarding the influence of task design on performance and ergonomic risk, or the association between these two outcomes. In a controlled experiment, we constructed a mock fuselage to simulate a drilling task common in aircraft manufacturing, and examined the effect of three levels of workstation adjustability on performance as measured by productivity (e.g. fuselage completion time) and quality (e.g. fuselage defective holes), and ergonomic risk as quantified using two common methods (rapid upper limb assessment and the strain index). The primary finding was that both productivity and quality significantly improved with increased adjustability, yet this occurred only when that adjustability succeeded in reducing ergonomic risk. Supporting the inverse association between ergonomic risk and performance, the condition with highest adjustability created the lowest ergonomic risk and the best performance while there was not a substantial difference in ergonomic risk between the other two conditions, in which performance was also comparable. Practitioner Summary: Findings of this study supported a causal relationship between task design and both ergonomic risk and performance, and that ergonomic risk and performance are inversely associated. While future work is needed under more realistic conditions and a broader population, these results may be useful for task (re)design and to help cost-justify some ergonomic interventions.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics code models the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
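The report's data-mining step can be pictured with the toy post-processing sketch below (not RAVEN's actual API): a simple surrogate response is sampled over uncertain parameters and the resulting scenarios are grouped with a standard clustering algorithm, the kind of pattern recognition described for large dynamic PRA data sets.

```python
# Illustrative post-processing sketch (not RAVEN's API): sample a toy physics
# surrogate over uncertain parameters, then cluster the resulting scenarios to
# recognize groups of similar outcomes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_scenarios = 5000

# Sampled input parameters (placeholders for sampled PRA variables)
power = rng.normal(100.0, 5.0, n_scenarios)          # % nominal power
valve_delay = rng.uniform(0.0, 120.0, n_scenarios)   # s until relief valve opens

# Toy system response: peak temperature grows with power and valve delay
peak_temp = 600 + 2.0 * (power - 100) + 1.5 * valve_delay + rng.normal(0, 10, n_scenarios)

features = np.column_stack([power, valve_delay, peak_temp])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for k in range(3):
    sel = labels == k
    print(f"cluster {k}: {sel.sum():4d} scenarios, mean peak temperature {peak_temp[sel].mean():.0f}")
```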
Safety evaluation model of urban cross-river tunnel based on driving simulation.
Ma, Yingqi; Lu, Linjun; Lu, Jian John
2017-09-01
Currently, Shanghai urban cross-river tunnels have three principal characteristics: increasing traffic, a high accident rate, and rapidly developing construction. Because of their complex geographic and hydrological settings, the alignment conditions in urban cross-river tunnels are more complicated than in highway tunnels, so a safety evaluation of urban cross-river tunnels is necessary to inform follow-up construction and changes in operational management. A driving risk index (DRI) for urban cross-river tunnels was proposed in this study. An index system was constructed, combining eight factors derived from driving simulator output across three aspects of risk: following accidents, lateral accidents, and driver workload. Analytic hierarchy process methods, expert scoring, and normalization were applied to construct a mathematical model for the DRI. The driving simulator was used to simulate 12 Shanghai urban cross-river tunnels, and a relationship between the DRI for the tunnels and the corresponding accident rate (AR) was obtained via regression analysis. The regression results showed that the relationship between the DRI and the AR followed an exponential function with a high degree of fit. In the absence of detailed accident data, a safety evaluation model based on factors derived from driving simulation can effectively assess driving risk in urban cross-river tunnels that are already constructed or still in design.
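The two quantitative steps described above can be sketched as follows, with hypothetical factor scores, AHP weights, and accident rates: a weighted driving risk index is computed per tunnel, and an exponential relationship AR = a * exp(b * DRI) is fitted by linear regression on log(AR).

```python
# Sketch of the two steps: (1) combine simulator-derived factors into a driving
# risk index (DRI) with AHP-style weights, and (2) fit AR = a * exp(b * DRI).
# Factor values, weights, and accident rates are hypothetical placeholders.
import numpy as np

# Normalized factor scores for a few tunnels (rows) and weights from an AHP exercise
factors = np.array([
    [0.4, 0.3, 0.6],   # following risk, lateral risk, workload
    [0.7, 0.5, 0.8],
    [0.2, 0.4, 0.3],
    [0.9, 0.8, 0.7],
])
weights = np.array([0.4, 0.35, 0.25])           # must sum to 1
dri = factors @ weights

accident_rate = np.array([1.2, 2.9, 0.8, 5.1])  # accidents per million vehicle-km (placeholder)

# Fit AR = a * exp(b * DRI) by linear regression on log(AR)
b, log_a = np.polyfit(dri, np.log(accident_rate), 1)
a = np.exp(log_a)
print(f"Fitted model: AR = {a:.2f} * exp({b:.2f} * DRI)")
```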
78 FR 3328 - Fluroxypyr; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
... drinking water and in residential settings, but does not include occupational exposure. Section 408(b)(2)(C... from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for fluroxypyr in drinking water. These simulation models take into account...
The utility of the AusEd driving simulator in the clinical assessment of driver fatigue.
Desai, Anup V; Wilsmore, Brad; Bartlett, Delwyn J; Unger, Gunnar; Constable, Ben; Joffe, David; Grunstein, Ronald R
2007-08-01
Several driving simulators have been developed which range in complexity from PC based driving tasks to advanced "real world" simulators. The AusEd driving simulator is a PC based task, which was designed to be conducive to and test for driver fatigue. This paper describes the AusEd driving simulator in detail, including the technical requirements, hardware, screen and file outputs, and analysis software. Some aspects of the test are standardized, while others can be modified to suit the experimental situation. The AusEd driving simulator is sensitive to performance decrement from driver fatigue in the laboratory setting, potentially making it useful as a laboratory or office based test for driver fatigue risk management. However, more research is still needed to correlate laboratory based simulator performance with real world driving performance and outcomes.
Virtual Simulations: A Creative, Evidence-Based Approach to Develop and Educate Nurses.
Leibold, Nancyruth; Schwarz, Laura
2017-02-01
The use of virtual simulations in nursing is an innovative strategy that is increasing in application. There are several terms related to virtual simulation; although some are used interchangeably, the meanings are not the same. This article presents examples of virtual simulation, virtual worlds, and virtual patients in continuing education, staff development, and academic nursing education. Virtual simulations in nursing use technology to provide clinical practice for nurses and nursing students that is safe and as realistic as possible. Virtual simulations are useful for learning new skills; practicing a skill that puts content, higher-order thinking, and psychomotor elements together; skill competency learning; and assessment for low-volume, high-risk skills. The purpose of this article is to describe the related terms, examples, uses, theoretical frameworks, challenges, and evidence related to virtual simulations in nursing.
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous sub-events, propagating the fault and eventually leading to the top event (accident). It has traditionally been a powerful technique for identifying hazards in nuclear installations and the power industry. Because the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also cumbersome and costly, which limits its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on the methodology, we have developed a computer-automated tool. The details are presented in this paper.
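The probabilistic side of FTA can be illustrated with the minimal sketch below: basic-event probabilities are sampled by Monte Carlo and propagated through a small AND/OR gate structure to estimate the top-event probability. The tree and the probabilities are illustrative and do not represent the AS-II algorithm itself.

```python
# Minimal Monte Carlo fault tree sketch: sample basic-event states and propagate
# them through AND/OR gates to estimate the top-event probability.
# The tree and probabilities are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
n_trials = 200_000

# Basic (initiating) event probabilities per demand
p = {"pump_fails": 0.01, "valve_stuck": 0.005, "sensor_fails": 0.02, "operator_error": 0.03}
samples = {name: rng.random(n_trials) < prob for name, prob in p.items()}

# Fault tree: TOP = (pump_fails OR valve_stuck) AND (sensor_fails OR operator_error)
cooling_lost = samples["pump_fails"] | samples["valve_stuck"]
detection_lost = samples["sensor_fails"] | samples["operator_error"]
top_event = cooling_lost & detection_lost

print(f"Estimated top-event probability: {top_event.mean():.2e}")
# Analytical check: (1-(1-0.01)*(1-0.005)) * (1-(1-0.02)*(1-0.03)) ~ 7.4e-4
```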
GERMcode: A Stochastic Model for Space Radiation Risk Assessment
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.
2012-01-01
A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal aberrations, and representative mouse tumor induction curves are implemented into the GERMcode. Extension of these descriptions to other endpoints related to non-targeted effects and biochemical pathway responses will be discussed.
Analysis of flood modeling through innovative geomatic methods
NASA Astrophysics Data System (ADS)
Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo
2015-05-01
Suitable assessment and management of exposure to natural flood risks necessarily requires exhaustive knowledge of the terrain. This study, primarily aimed at evaluating flood risk, first assesses the suitability of an innovative technique, called Reduced Cost Aerial Precision Photogrammetry (RC-APP), for the acquisition of geospatial information; the technique is based on a motorized ultra-light aircraft (ULM, Ultra-Light Motor) combined with hybridized low-cost sensors. The RC-APP technique is found to deliver a more accurate and precise geomatic product at lower cost and in less time. The technique is applied in river engineering for geometric modelling and flood risk assessment. Through the application of RC-APP, a high-spatial-resolution image (2.5 cm orthophoto) and a Digital Elevation Model (DEM) with 0.10 m mesh size and high point density (about 100 points/m2), with an altimetric accuracy of -0.02 ± 0.03 m, have been obtained. These products provide detailed knowledge of the terrain, afterwards used for hydraulic simulation, which allows a better definition of the inundated area, with important implications for flood risk assessment and management. The achieved DEM spatial resolution of 0.10 m is especially useful for hydraulic simulation with 2D software. According to the results, the developed methodology and technology allow a more accurate riverbed representation than traditional techniques such as Light Detection and Ranging (LiDAR), which has a Root-Mean-Square Error (RMSE) of ±0.50 m; the comparison revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique emerges as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model significantly increases the knowledge of hydraulic variables in natural waterways.
Alawieh, Ali; Sabra, Zahraa; Langley, E Farris; Bizri, Abdul Rahman; Hamadeh, Randa; Zaraket, Fadi A
2017-11-25
After the re-introduction of poliovirus to Syria in 2013, Lebanon was considered at high transmission risk due to its proximity to Syria and the high number of Syrian refugees. However, after a large-scale national immunization initiative, Lebanon was able to prevent a potential outbreak of polio among nationals and refugees. In this work, we used a computational individual-based simulation model to assess the poliovirus threat to Lebanon prior to and after the immunization campaign, and to quantitatively assess the healthcare impact of the campaign and the standards that need to be maintained nationally to prevent a future outbreak. Acute poliomyelitis surveillance in Lebanon, along with the design and coverage rate of the recent national polio immunization campaign, was reviewed from the records of the Lebanese Ministry of Public Health. Lebanese population demographics, including Syrian and Palestinian refugees, were reviewed to design individual-based models that predict the consequences of polio spread to Lebanon and evaluate the outcomes of immunization campaigns. The model takes into account geographic, demographic and health-related features. Our simulations confirmed the high risk of polio outbreaks in Lebanon within 10 days of case introduction prior to the immunization campaign, and showed that the current immunization campaign significantly reduced the speed of the infection in the event that poliomyelitis cases enter the country. A minimum of 90% national immunization coverage was found to be required to prevent exponential propagation of potential transmission. Both surveillance and immunization efforts should be maintained at high standards in Lebanon and other countries in the area to detect and limit any potential outbreak. The use of computational population simulation models can provide a quantitative approach to assess the impact of immunization campaigns and the burden of infectious diseases, even in the context of population migration.
The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks
NASA Technical Reports Server (NTRS)
Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.
2015-01-01
The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.
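A highly simplified sketch of the Monte Carlo logic described for the IMM follows: medical events are drawn from per-condition incidence rates over a mission, treatment availability is checked against a finite kit, and mission time lost and evacuations are tallied. The conditions, rates, stocks, and outcome values are hypothetical placeholders, not iMED inputs.

```python
# Simplified Monte Carlo medical-risk sketch: draw medical events from incidence
# rates, check whether the kit still has the needed resource, and tally mission
# time lost and evacuations. All conditions and numbers are placeholders.
import numpy as np

rng = np.random.default_rng(3)
mission_days, crew_size, n_missions = 180, 4, 20_000

# condition: (incidence per person-year, days lost if treated, days lost if untreated,
#             evacuation probability if untreated, kit stock of the needed resource)
conditions = {
    "back injury":    (0.8, 2, 6, 0.01, 3),
    "kidney stone":   (0.1, 1, 10, 0.20, 1),
    "skin infection": (0.5, 1, 3, 0.00, 5),
}

time_lost = np.zeros(n_missions)
evacuated = np.zeros(n_missions, dtype=bool)
for m in range(n_missions):
    for name, (rate, lost_tx, lost_untx, p_evac, stock) in conditions.items():
        n_events = rng.poisson(rate * crew_size * mission_days / 365.0)
        for _ in range(n_events):
            if stock > 0:                     # resource available: treated outcome
                stock -= 1
                time_lost[m] += lost_tx
            else:                             # resource exhausted: untreated outcome
                time_lost[m] += lost_untx
                if rng.random() < p_evac:
                    evacuated[m] = True

print(f"Mean crew time lost per mission: {time_lost.mean():.1f} days")
print(f"Probability of evacuation:       {evacuated.mean():.3%}")
```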
Eccel, Emanuele; Rea, Roberto; Caffarra, Amelia; Crisci, Alfonso
2009-05-01
In the context of global warming, the general trend towards earlier flowering dates of many temperate tree species is likely to result in an increased risk of damage from exposure to frost. To test this hypothesis, a phenological model of apple flowering was applied to a temperature series from two locations in an important area for apple production in Europe (Trentino, Italy). Two simulated 50-year climatic projections (A2 and B2 of the Intergovernmental Panel on Climate Change Special Report on Emission Scenarios) from the HadCM3 general circulation model were statistically downscaled to the two sites. Hourly temperature records over a 40-year period were used as the reference for past climate. In the phenological model, the heat requirement (degree hours) for flowering was parameterized using two approaches: static (constant over time) and dynamic (climate dependent). Parameterisation took into account the trees' adaptation to changing temperatures based on either past instrumental records or the downscaled outputs from the climatic simulations. Flowering dates for the past 40 years and simulated flowering dates for the next 50 years were used in the model. A significant trend towards earlier flowering was clearly detected in the past. This negative trend was also apparent in the simulated data. However, the significance was less apparent when the "dynamic" setting for the degree hours requirement was used in the model. The number of frost episodes and flowering dates, on an annual basis, were graphed to assess the risk of spring frost. Risk analysis confirmed a lower risk of exposure to frost at present than in the past, and probably either constant or a slightly lower risk in future, especially given that physiological processes are expected to acclimate to higher temperatures.
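The degree-hour logic of the phenological model can be sketched as follows, using a synthetic hourly temperature series, an assumed base temperature, and an assumed heat requirement (none of which are the model's calibrated parameters): degree hours are accumulated until the requirement is met, and post-flowering frost hours are counted as a crude exposure indicator.

```python
# Simple degree-hour sketch: accumulate hourly temperature excess above a base
# temperature, declare flowering once a heat requirement is met, then count
# post-flowering frost hours. All parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(5)
hours = np.arange(24 * 120)                          # 120 days from 1 January
day = hours / 24.0
# Synthetic hourly temperatures: seasonal warming + diurnal cycle + noise
temp = (2 + 0.12 * day
        + 5 * np.sin(2 * np.pi * (hours % 24) / 24 - np.pi / 2)
        + rng.normal(0, 1.5, hours.size))

base_temp = 4.5            # degC, base for degree-hour accumulation (placeholder)
heat_requirement = 6000.0  # degree hours to flowering (placeholder)

degree_hours = np.cumsum(np.clip(temp - base_temp, 0, None))
flowering_idx = int(np.argmax(degree_hours >= heat_requirement))
frost_hours_after = int(np.sum(temp[flowering_idx:] < 0))

print(f"Simulated flowering on day {flowering_idx // 24}, "
      f"{frost_hours_after} frost hours after flowering")
```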
Stress testing hydrologic models using bottom-up climate change assessment
NASA Astrophysics Data System (ADS)
Stephens, C.; Johnson, F.; Marshall, L. A.
2017-12-01
Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.
Prediction of Coronary Artery Disease Risk Based on Multiple Longitudinal Biomarkers
Yang, Lili; Yu, Menggang; Gao, Sujuan
2016-01-01
In the last decade, few topics in the area of cardiovascular disease (CVD) research have received as much attention as risk prediction. One of the well documented risk factors for CVD is high blood pressure (BP). Traditional CVD risk prediction models consider BP levels measured at a single time and such models form the basis for current clinical guidelines for CVD prevention. However, in clinical practice, BP levels are often observed and recorded in a longitudinal fashion. Information on BP trajectories can be powerful predictors for CVD events. We consider joint modeling of time to coronary artery disease and individual longitudinal measures of systolic and diastolic BPs in a primary care cohort with up to 20 years of follow-up. We applied novel prediction metrics to assess the predictive performance of joint models. Predictive performances of proposed joint models and other models were assessed via simulations and illustrated using the primary care cohort. PMID:26439685
A novel approach to simulate gene-environment interactions in complex diseases.
Amato, Roberto; Pinelli, Michele; D'Andrea, Daniel; Miele, Gennaro; Nicodemi, Mario; Raiconi, Giancarlo; Cocozza, Sergio
2010-01-05
Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite a large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies on their interactions in the epidemiological literature. One reason can be the incomplete knowledge of the power of statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones. We present a mathematical approach that models gene-environment interactions. By this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in a Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics by using standard epidemiological measures and to implement constraints to make the simulator behaviour biologically meaningful. Using the multi-logistic model implemented in GENS, it is possible to simulate case-control samples of complex disease where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population and a Monte Carlo process introduces random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of the statistical power when designing a study.
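A toy version of the kind of data GENS produces is sketched below: a case-control sample is simulated from a logistic risk model with a genotype term, an exposure term, and a gene-environment interaction term. The allele frequency, exposure prevalence, and effect sizes are arbitrary illustrative values, and the sketch is not the GENS implementation.

```python
# Toy gene-environment simulation: disease risk follows a logistic model with a
# genotype term, an exposure term, and an interaction term. All parameter values
# are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(11)
n = 20_000

# One biallelic locus (additive coding 0/1/2) and one binary environmental exposure
maf = 0.3
genotype = rng.binomial(2, maf, n)
exposure = rng.binomial(1, 0.4, n)

# Logistic risk model with interaction (log-odds scale)
beta0, beta_g, beta_e, beta_ge = -2.5, 0.2, 0.4, 0.6
logit = beta0 + beta_g * genotype + beta_e * exposure + beta_ge * genotype * exposure
p_disease = 1 / (1 + np.exp(-logit))
disease = rng.random(n) < p_disease

# Crude stratified check of the simulated interaction
for g in (0, 1, 2):
    for e in (0, 1):
        sel = (genotype == g) & (exposure == e)
        print(f"G={g} E={e}: prevalence {disease[sel].mean():.3f} (n={sel.sum()})")
```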
Increased Cognitive Load Leads to Impaired Mobility Decisions in Seniors at Risk for Falls
Nagamatsu, Lindsay S.; Voss, Michelle; Neider, Mark B.; Gaspar, John G.; Handy, Todd C.; Kramer, Arthur F.; Liu-Ambrose, Teresa Y. L.
2011-01-01
Successful mobility requires appropriate decision-making. Seniors with reduced executive functioning, such as senior fallers, may be prone to poor mobility judgments, especially under dual-task conditions. We classified participants as “At-Risk” and “Not-At-Risk” for falls using a validated physiological falls-risk assessment. Dual-task performance was assessed in a virtual reality environment in which participants crossed a simulated street by walking on a manual treadmill while listening to music or conversing on a phone. Those “At-Risk” experienced more collisions with oncoming cars and had longer crossing times in the Phone condition compared with controls. We conclude that poor mobility judgments during a dual task lead to unsafe mobility for those at risk for falls. PMID:21463063
NASA Astrophysics Data System (ADS)
Hartmann, Andreas; Jasechko, Scott; Gleeson, Tom; Wada, Yoshihide; Andreo, Bartolomé; Barberá, Juan Antonio; Brielmann, Heike; Charlier, Jean-Baptiste; Darling, George; Filippini, Maria; Garvelmann, Jakob; Goldscheider, Nico; Kralik, Martin; Kunstmann, Harald; Ladouche, Bernard; Lange, Jens; Mudarra, Matías; Francisco Martín, José; Rimmer, Alon; Sanchez, Damián; Stumpp, Christine; Wagener, Thorsten
2017-04-01
Karst develops through the dissolution of carbonate rock and results in pronounced spatiotemporal heterogeneity of hydrological processes. Karst groundwater in Europe is a major source of fresh water, contributing up to half of the total drinking water supply in some countries such as Austria and Slovenia. Previous work showed that karstic recharge processes enhance and alter the sensitivity of recharge to climate variability. The enhanced preferential flow from the surface to the aquifer may be accompanied by an enhanced risk of groundwater contamination. In this study we assess the contamination risk of karst aquifers over Europe and the Mediterranean using simulated transit time distributions. Using a new type of semi-distributed model that considers the spatial heterogeneity of karst hydraulic properties, we were able to simulate karstic groundwater recharge including its heterogeneous spatiotemporal dynamics. The model is driven by gridded daily climate data from the Global Land Data Assimilation System (GLDAS). Transit time distributions are calculated using virtual tracer experiments. We evaluated our simulations against independent information on transit times derived from observed time series of water isotopes from >70 karst springs across Europe. The simulations indicate that, compared with humid, mountain, and desert regions, the Mediterranean region shows a stronger risk of contamination, because preferential flow processes are most pronounced given thin soil layers and the seasonal abundance of high-intensity rainfall events in autumn and winter. Our modelling approach includes strong simplifications and its results cannot easily be generalized, but it still highlights that the combined effects of variable climate and heterogeneous catchment properties pose a strong risk to water quality.
Non-Invasive Assessment of Susceptibility to Ventricular Arrhythmias During Simulated Microgravity
NASA Technical Reports Server (NTRS)
Cohen, Richard J.
1999-01-01
The Cardiovascular Alterations Team is currently conducting studies to determine what alterations in hemodynamic regulation result from sixteen days of simulated microgravity exposure in normal human subjects. In this project we make additional measurements on these same study subjects in order to determine whether there is an increase in susceptibility to ventricular arrhythmias resulting from simulated microgravity exposure. Numerous anecdotal and documented reports from the past 30 years suggest that the incidence of ventricular arrhythmias among astronauts is increased during space flight. For example, documented runs of ventricular tachycardia have been recorded from crew members of Skylab and Mir, there was much attention given by the lay press to Mir Commander Vasily Tsibliyev's complaints of heart rhythm irregularities in July of 1997, and cardiovascular mechanisms may have been causal in the recent death of an experimental primate shortly after return from space. In 1986, a Mir cosmonaut, Alexander Laveikin, was brought home and replaced with an alternate cosmonaut as a result of cardiac dysrhythmias that began during extravehicular activity. Furthermore, at a joint NASA/NSBRI workshop held in January 1998, cardiac arrhythmias were identified as the highest priority cardiovascular risk to a human Mars mission. Despite the evidence for the risk of a potentially lethal arrhythmia resulting from microgravity exposure, the effects of space flight and the associated physiologic stresses on cardiac conduction processes are not known, and an increase in cardiac susceptibility to arrhythmias has never been quantified. In this project, we are determining whether simulated space flight increases the risk of developing life-threatening heart rhythm disturbances such as sustained ventricular tachycardia (defined as ventricular tachycardia lasting at least 30 seconds or resulting in hemodynamic collapse) and ventricular fibrillation. We are obtaining measures of cardiac susceptibility to ventricular arrhythmias in subjects exposed to simulated space flight in the Human Studies Core protocol being conducted by the Cardiovascular Alterations Team, which involves sixteen days of bed rest. In particular, we are applying a powerful new non-invasive technology, developed in Professor Cohen's laboratory at MIT, for the quantitative assessment of the risk of life-threatening ventricular arrhythmias. This technology involves the measurement of microvolt levels of T wave alternans (TWA) during exercise stress, and was recently granted approval by the Food and Drug Administration to be used for the clinical evaluation of patients suspected to be at risk of ventricular arrhythmias. In addition, we are obtaining 24 hour Holter monitoring (to detect non-sustained ventricular tachycardia and to assess heart rate variability). We are also conducting protocols to obtain these same measures on a monthly basis for up to four months in subjects in the Bone Demineralization/Calcium Metabolism Team's long-term bed rest study.
Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester
There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.
Smadi, Hanan; Sargeant, Jan M
2013-02-01
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts which were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that concentration of Salmonella on chicken breasts at retail and food hygienic practices in private kitchens such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat along with inadequate cooking contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that responsibility for protection from Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
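A compact sketch of the retail-to-table chain described above follows: the retail contamination level is sampled, growth during home storage and inactivation during cooking are applied, a small cross-contamination path bypasses cooking, and the ingested dose is pushed through an exponential dose-response function. Every parameter is an illustrative placeholder, not a value from the Canadian model.

```python
# Retail-to-table Monte Carlo sketch: retail contamination, storage growth,
# cooking inactivation, a cross-contamination path, and an exponential
# dose-response step. All parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(13)
n_servings = 500_000

log_conc_retail = rng.normal(-1.0, 1.0, n_servings)         # log10 CFU/g at retail
growth = rng.uniform(0.0, 1.5, n_servings)                   # log10 growth during storage
log_reduction_cooking = rng.triangular(4, 6, 8, n_servings)  # log10 kill from cooking

serving_size = 100.0                                          # g
dose_cooked = 10 ** (log_conc_retail + growth - log_reduction_cooking) * serving_size
# Cross-contamination path: a small fraction of cells bypasses cooking entirely
transfer_fraction = rng.uniform(0.0, 0.001, n_servings)
dose_cross = 10 ** (log_conc_retail + growth) * serving_size * transfer_fraction
dose = dose_cooked + dose_cross

# Exponential dose-response (placeholder single-hit parameter)
r = 2.5e-3
p_ill = 1 - np.exp(-r * dose)
cases_per_100k_servings = 1e5 * p_ill.mean()
print(f"Predicted illnesses per 100,000 servings: {cases_per_100k_servings:.1f}")
```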
Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret
2013-03-01
Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
Foster, Adriana; Chaudhary, Neelam; Murphy, James; Lok, Benjamin; Waller, Jennifer; Buckley, Peter F
2015-12-01
There is increasing use of educational technologies in medical and surgical specialties. Described herein is the development and application of an interactive virtual patient (VP) to teach suicide risk assessment to health profession trainees. We studied the effect of the following: (1) an interaction with a bipolar VP who attempts suicide or (2) completion of a video-teaching module on interviewing a bipolar patient, on medical students' proficiency in assessing suicide risk in standardized patients. We hypothesized that students who interact with a bipolar VP will be at least as likely to assess suicide risk, as their peers who completed a video module. In a randomized, controlled study, we compared the frequency with which second-year students at the Medical College of Georgia asked suicide risk and bipolar symptoms questions by VP/video group. We recruited 67 students. The VP group inquired more frequently than the video group in 4 of 5 suicide risk areas and 11 of 14 other bipolar symptomatology areas. There were minimal to small effect sizes in favor of the VP technology. The students preferred the video over the VP as an educational tool (p = 0.007). Our study provides proof of concept that both VP and video module approaches are feasible for teaching students to assess suicide risk, and we present evidence about the role of active learning to improve communication skills. Depending on the learning context, interviewing a VP or observation of a videotaped interview can enhance the students' suicide risk assessment proficiency in an interview with a standardized patient. An interactive VP is a plausible modality to deliver basic concepts of suicide risk assessment to medical students, can facilitate individual preferences by providing easy access and portability, and has potential generalizability to other aspects of psychiatric training.
Katapodi, Maria C; Dodd, Marylin J; Facione, Noreen C; Humphreys, Janice C; Lee, Kathryn A
2010-01-01
Perceived risk to a health problem is formed by inferential rules called heuristics and by comparative judgments that assess how one's risk compares to the risk of others. The purpose of this cross-sectional, community-based survey was to examine how experiences with breast cancer, knowledge of risk factors, and specific heuristics inform risk judgments for oneself, for friends/peers, and comparative judgments for breast cancer (risk friends/peers - risk self). We recruited an English-speaking, multicultural (57% nonwhite) sample of 184 middle-aged (47 ± 12 years old), well-educated women. Fifty percent of participants perceived that their breast cancer risk was the same as the risk of their friends/peers; 10% were pessimistic (risk friends/peers - risk self < 0), whereas 40% were optimistic (risk friends/peers - risk self > 0). Family history of breast cancer and worry informed risk judgments for oneself. The availability and cultural heuristics specific for black women informed risk judgments for friends/peers. Knowledge of risk factors and interactions of knowledge with the availability, representativeness, and simulation heuristics informed comparative judgments (risk friends/peers - risk self). We discuss cognitive mechanisms with which experiences, knowledge, and heuristics influence comparative breast cancer risk judgments. Risk communication interventions should assess knowledge deficits, contextual variables, and specific heuristics that activate differential information processing mechanisms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
Berner, Christine L; Staid, Andrea; Flage, Roger; Guikema, Seth D
2017-10-01
Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to its occurrence), unknown knowns (known to some, but not to relevant analysts), or known knowns where the probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that if not identified and treated could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to more fully consider them. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts. © 2016 Society for Risk Analysis.
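As a rough illustration of this scenario-stressing idea, the sketch below sweeps the drivers of a toy impact model beyond their historical envelope and flags combinations whose simulated impact far exceeds anything in a synthetic "historical" record. The impact function, parameter ranges, and thresholds are invented for illustration and are not the authors' validated hurricane-impact model.

```python
import numpy as np

def simulated_impact(wind_speed_ms, surge_m):
    """Toy stand-in for a validated hazard-impact model (e.g., customers without power)."""
    return 1e4 * (wind_speed_ms / 30.0) ** 3 + 5e4 * surge_m ** 1.5

rng = np.random.default_rng(42)

# Synthetic "historical record" of hazard drivers (assumed ranges)
hist_wind = rng.uniform(20, 60, size=500)      # m/s
hist_surge = rng.uniform(0.5, 4.0, size=500)   # m
hist_max_impact = simulated_impact(hist_wind, hist_surge).max()

# Sweep drivers beyond the historical envelope to surface candidate extreme scenarios
wind_grid = np.linspace(20, 90, 71)
surge_grid = np.linspace(0.5, 8.0, 76)
W, S = np.meshgrid(wind_grid, surge_grid)
impact = simulated_impact(W, S)

extreme = impact > 3 * hist_max_impact   # far beyond anything "observed"
print(f"{int(extreme.sum())} swept scenarios exceed 3x the historical maximum impact")
print("largest simulated impact / historical maximum:",
      round(float(impact.max() / hist_max_impact), 1))
```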
Robustness of assembly supply chain networks by considering risk propagation and cascading failure
NASA Astrophysics Data System (ADS)
Tang, Liang; Jing, Ke; He, Jie; Stanley, H. Eugene
2016-10-01
An assembly supply chain network (ASCN) is composed of manufacturers located in different geographical regions. To analyze the robustness of this ASCN when it suffers catastrophic disruption events, we construct a cascading failure model of risk propagation. In our model, different disruption scenarios are considered and a probability equation covering all disruption scenarios is developed. Using production capability loss as the robustness index (RI) of an ASCN, we conduct a numerical simulation to assess its robustness. Through simulation, we compare the network robustness at different values of linking intensity and node threshold and find that weak linking intensity or a high node threshold increases the robustness of the ASCN. We also compare network robustness levels under different disruption scenarios.
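A minimal sketch of a cascading-failure robustness experiment of this kind follows; the random supplier network, linking intensity, node threshold, and capacities are invented placeholders rather than the authors' ASCN model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                       # manufacturers
p_link = 0.03                 # linking intensity (assumed)
theta = 0.4                   # node threshold: tolerated fraction of failed suppliers

# Random supplier network: A[i, j] = 1 means node j supplies node i
A = (rng.random((n, n)) < p_link).astype(int)
np.fill_diagonal(A, 0)
capacity = rng.uniform(50, 150, size=n)   # production capability of each node

failed = np.zeros(n, dtype=bool)
failed[rng.choice(n, size=10, replace=False)] = True   # initial disruption scenario

# Propagate cascading failures until no new node fails
changed = True
while changed:
    suppliers = A.sum(axis=1)
    failed_suppliers = A @ failed
    frac_lost = np.divide(failed_suppliers, suppliers,
                          out=np.zeros(n), where=suppliers > 0)
    new_failed = (~failed) & (frac_lost > theta)
    changed = new_failed.any()
    failed |= new_failed

robustness_index = capacity[~failed].sum() / capacity.sum()
print(f"robustness index (surviving production capability): {robustness_index:.2f}")
```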
Lai, Jyh-Mirn; Hwang, Yi-Ting; Chou, Chin-Cheng
2012-06-01
The highly pathogenic avian influenza virus (HPAIV) is able to survive in poultry products and could be carried into a country by air travelers. An assessment model was constructed to estimate the probability of the exotic viable HPAIV entering Taiwan from two neighboring areas through poultry products carried illegally by air passengers at Taiwan's main airports. The entrance risk was evaluated based on HPAIV-related factors (the prevalence and the incubation period of HPAIV; the manufacturing process of poultry products; and the distribution-storage-transportation factor event) and the passenger event. Distribution functions were adopted to simulate the probabilities of each HPAIV factor. The odds of passengers being intercepted with illegal poultry products were estimated by logistic regression. The Monte Carlo simulation established that the risk caused by HPAIV-related factors from area A was lower than area B, whereas the entrance risk by the passenger event from area A was similar to area B. Sensitivity analysis showed that the incubation period of HPAIV and the interception of passenger violations were major determinants. Although the result showed viable HPAIV was unlikely to enter Taiwan through meat illegally carried by air passengers, this low probability could be caused by incomplete animal disease data and modeling uncertainties. Considering the negative socioeconomic impacts of HPAIV outbreaks, strengthening airport quarantine measures is still necessary. This assessment provides a profile of HPAIV entrance risk through air travelers arriving from endemic areas and a feasible direction for quarantine and public health measures. © 2011 Society for Risk Analysis.
Life history and spatial traits predict extinction risk due to climate change
NASA Astrophysics Data System (ADS)
Pearson, Richard G.; Stanton, Jessica C.; Shoemaker, Kevin T.; Aiello-Lammens, Matthew E.; Ersts, Peter J.; Horning, Ned; Fordham, Damien A.; Raxworthy, Christopher J.; Ryu, Hae Yeong; McNees, Jason; Akçakaya, H. Reşit
2014-03-01
There is an urgent need to develop effective vulnerability assessments for evaluating the conservation status of species in a changing climate. Several new assessment approaches have been proposed for evaluating the vulnerability of species to climate change based on the expectation that established assessments such as the IUCN Red List need revising or superseding in light of the threat that climate change brings. However, although previous studies have identified ecological and life history attributes that characterize declining species or those listed as threatened, no study so far has undertaken a quantitative analysis of the attributes that cause species to be at high risk of extinction specifically due to climate change. We developed a simulation approach based on generic life history types to show here that extinction risk due to climate change can be predicted using a mixture of spatial and demographic variables that can be measured in the present day without the need for complex forecasting models. Most of the variables we found to be important for predicting extinction risk, including occupied area and population size, are already used in species conservation assessments, indicating that present systems may be better able to identify species vulnerable to climate change than previously thought. Therefore, although climate change brings many new conservation challenges, we find that it may not be fundamentally different from other threats in terms of assessing extinction risks.
77 FR 26954 - 1-Naphthaleneacetic acid; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
... for which there is reliable information.'' This includes exposure through drinking water and in... exposure from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for NAA in drinking water. These simulation models take into account data on...
78 FR 29049 - Streptomycin; Pesticide Tolerances for Emergency Exemptions
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-17
... exposures for which there is reliable information.'' This includes exposure through drinking water and in... commodities. 2. Dietary exposure from drinking water. The Agency used screening level water exposure models in the dietary exposure analysis and risk assessment for streptomycin in drinking water. These simulation...
United States Marine Corps Motor Transport Mechanic-to-Equipment Ratio
time motor transport equipment remains in maintenance at the organizational command level. This thesis uses a discrete event simulation model of the...applied to a single experiment that allows for assessment of risk of not achieving the objective. Inter-arrival time, processing time, work schedule
Estimates of radiological risk from depleted uranium weapons in war scenarios.
Durante, Marco; Pugliese, Mariagabriella
2002-01-01
Several weapons used during the recent conflict in Yugoslavia contain depleted uranium, including missiles and armor-piercing incendiary rounds. Health concern is related to the use of these weapons, because of the heavy-metal toxicity and radioactivity of uranium. Although chemical toxicity is considered the more important source of health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict, and uranium munitions are a possible source of contamination in the environment. Actual measurements of radioactive contamination are needed to assess the risk. In this paper, a computer simulation is proposed to estimate radiological risk related to different exposure scenarios. Doses caused by inhalation of radioactive aerosols and by ground contamination induced by Tomahawk missile impacts are simulated using a Gaussian plume model (HOTSPOT code). Environmental contamination and committed dose to the population resident in contaminated areas are predicted by a food-web model (RESRAD code). Small values of committed effective dose equivalent appear to be associated with missile impacts (50-y CEDE < 5 mSv), or population exposure by water-independent pathways (50-y CEDE < 80 mSv). The greatest hazard is related to water contamination in conditions of effective leaching of uranium into the groundwater (50-y CEDE < 400 mSv). Even in this worst-case scenario, the chemical toxicity largely predominates over radiological risk. These computer simulations suggest that little radiological risk is associated with the use of depleted uranium weapons.
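For orientation, a ground-level Gaussian plume concentration and a crude inhalation dose can be sketched as below; the release rate, Briggs class-D dispersion fits, breathing rate, and dose coefficient are illustrative assumptions, not the HOTSPOT or RESRAD configurations used in the study.

```python
import numpy as np

def sigma_y(x):
    """Briggs rural horizontal dispersion fit, Pasquill class D (approximate)."""
    return 0.08 * x / np.sqrt(1 + 0.0001 * x)

def sigma_z(x):
    """Briggs rural vertical dispersion fit, Pasquill class D (approximate)."""
    return 0.06 * x / np.sqrt(1 + 0.0015 * x)

def plume_conc(x, Q, u, H):
    """Ground-level centerline concentration (Bq/m^3) with ground reflection."""
    sy, sz = sigma_y(x), sigma_z(x)
    return Q / (np.pi * u * sy * sz) * np.exp(-H**2 / (2 * sz**2))

Q = 1.0e9      # Bq/s released (assumed source term)
u = 3.0        # wind speed, m/s
H = 2.0        # effective release height, m
x = np.array([100.0, 500.0, 1000.0, 5000.0])   # downwind distances, m

conc = plume_conc(x, Q, u, H)
breathing_rate = 3.3e-4          # m^3/s (adult, light activity)
exposure_time = 600.0            # s spent in the plume (assumed)
dcf = 5.0e-6                     # Sv per Bq inhaled, illustrative dose coefficient

committed_dose = conc * breathing_rate * exposure_time * dcf   # Sv
for d, c, dose in zip(x, conc, committed_dose):
    print(f"x={d:>6.0f} m  C={c:.3e} Bq/m^3  dose={dose*1e3:.3f} mSv")
```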
Simulated ward round: reducing costs, not outcomes.
Ford, Helen; Cleland, Jennifer; Thomas, Ian
2017-02-01
Distractions and interruptions on the ward pose substantial patient safety risks, but medical students receive little training on their management. Although there is some evidence that medical students can be taught how to manage distractions and interruptions in a simulated ward environment, the only model to date is based on individual feedback, which is resource-expensive, hindering curricular integration. Our aim was to assess the educational utility of a cost-efficient approach to a patient safety-focused simulated ward round. Twenty-three of 55 final-year medical students took part in a cost-reduced simulated ward round. Costs were minimised by providing group rather than individualised feedback, thereby shortening the duration of each simulation and reducing the number of interruptions. The utility of the simulation was assessed via student evaluation and performance on a patient safety station of an objective structured clinical examination (OSCE). The direct costs of the simulation were more than 50 per cent lower per student compared with the original study, mostly as a result of a reduction in the time that faculty members took to give feedback. Students managed distractions better and received higher scores in the OSCE station than those who had not undergone the ward round. Group feedback was evaluated positively by most participants: 94 per cent of those who provided feedback agreed or strongly agreed that the simulation would make them a safer doctor and would improve their handling of distractions. The costs of a simulated ward round can be significantly reduced whilst maintaining educational utility. These findings should encourage medical schools to integrate ward simulation into curricula. © 2016 John Wiley & Sons Ltd.
Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
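The core of such a model, Monte Carlo sampling of uncertain criteria mapped through utility curves to a weighted score, can be sketched as follows; the two hypothetical storage sites, criteria, weights, and triangular ranges are invented for illustration and do not come from the published CCS scenario.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 10_000

# Hypothetical CCS siting example: two storage locations scored on three criteria.
# Each uncertain criterion is sampled from a triangular distribution (min, mode, max)
# and mapped to a 0-1 utility (higher is better).
criteria = {
    "cost":         {"weight": 0.40, "A": (40, 55, 80),    "B": (35, 60, 95),    "lower_is_better": True},
    "leakage_risk": {"weight": 0.35, "A": (0.1, 0.3, 0.6), "B": (0.05, 0.2, 0.7), "lower_is_better": True},
    "capacity":     {"weight": 0.25, "A": (20, 35, 50),    "B": (25, 40, 70),    "lower_is_better": False},
}

def utility(samples, lo, hi, lower_is_better):
    u = (samples - lo) / (hi - lo)          # linear utility curve (illustrative)
    return np.clip(1 - u if lower_is_better else u, 0, 1)

scores = {"A": np.zeros(n_sim), "B": np.zeros(n_sim)}
for c in criteria.values():
    lo = min(c["A"][0], c["B"][0])
    hi = max(c["A"][2], c["B"][2])
    for site in ("A", "B"):
        s = rng.triangular(*c[site], size=n_sim)
        scores[site] += c["weight"] * utility(s, lo, hi, c["lower_is_better"])

for site, sc in scores.items():
    print(f"site {site}: mean score {sc.mean():.3f}, "
          f"5th-95th pct [{np.percentile(sc, 5):.3f}, {np.percentile(sc, 95):.3f}]")
print("P(A preferred to B) =", (scores["A"] > scores["B"]).mean())
```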
Carbon emissions risk map from deforestation in the tropical Amazon
NASA Astrophysics Data System (ADS)
Ometto, J.; Soler, L. S.; Assis, T. D.; Oliveira, P. V.; Aguiar, A. P.
2011-12-01
This work aims to estimate the carbon emissions from tropical deforestation in the Brazilian Amazon associated with the risk assessment of future land use change. The emissions are estimated by incorporating temporal deforestation dynamics, accounting for the biophysical and socioeconomic heterogeneity in the region, as well as secondary forest growth dynamics in abandoned areas. The land cover change model that supported the risk assessment of deforestation was run based on linear regressions. This method takes into account the spatial heterogeneity of deforestation, as the spatial variables adopted to fit the final regression model comprise environmental aspects, economic attractiveness, accessibility and land tenure structure. After fitting a suitable regression model for each land cover category, the potential of each cell to be deforested (at 25x25 km and 5x5 km resolution) in the near future was used to calculate the risk of land cover change. The carbon emissions model combines high-resolution new forest clear-cut mapping and four alternative sources of spatial information on biomass distribution for different vegetation types. The risk assessment map of CO2 emissions was obtained by crossing the simulation results of the historical land cover changes with a map of aboveground biomass contained in the remaining forest. This final map represents the risk of CO2 emissions at 25x25 km and 5x5 km resolution until 2020, under a carbon emission reduction target scenario.
Pouillot, Régis; Delignette-Muller, Marie Laure
2010-09-01
Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
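The two-dimensional Monte Carlo idea, an outer loop over uncertain parameters and an inner loop over unit-to-unit variability, can be sketched in plain Python/numpy; the contamination, serving-size, and dose-response values below are invented and are not those of the E. coli O157:H7 assessment, nor do they reproduce the mc2d package API.

```python
import numpy as np

rng = np.random.default_rng(7)
n_unc, n_var = 200, 5_000     # outer (uncertainty) and inner (variability) dimensions

# Outer dimension: uncertain parameters, one draw per uncertainty iteration
r = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_unc)   # uncertain dose-response parameter

risk_quantiles = np.empty((n_unc, 3))
for i in range(n_unc):
    # Inner dimension: variability between servings (contamination level and serving size)
    log10_conc = rng.normal(-1.0, 1.0, size=n_var)                # log10 CFU/g across servings
    serving_g = rng.triangular(20, 50, 150, size=n_var)           # g per serving
    dose = 10 ** log10_conc * serving_g                           # CFU ingested
    p_ill = 1 - np.exp(-r[i] * dose)                              # exponential dose-response
    risk_quantiles[i] = np.percentile(p_ill, [50, 95, 99])        # variability summary for this draw

# Uncertainty about each variability quantile
for k, q in enumerate((50, 95, 99)):
    lo, med, hi = np.percentile(risk_quantiles[:, k], [2.5, 50, 97.5])
    print(f"P(illness) at {q}th variability percentile: median {med:.2e} (95% CI {lo:.2e} - {hi:.2e})")
```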
Comparing predictions of extinction risk using models and subjective judgement
NASA Astrophysics Data System (ADS)
McCarthy, Michael A.; Keith, David; Tietjen, Justine; Burgman, Mark A.; Maunder, Mark; Master, Larry; Brook, Barry W.; Mace, Georgina; Possingham, Hugh P.; Medellin, Rodrigo; Andelman, Sandy; Regan, Helen; Regan, Tracey; Ruckelshaus, Mary
2004-10-01
Models of population dynamics are commonly used to predict risks in ecology, particularly risks of population decline. There is often considerable uncertainty associated with these predictions. However, alternatives to predictions based on population models have not been assessed. We used simulation models of hypothetical species to generate the kinds of data that might typically be available to ecologists and then invited other researchers to predict risks of population declines using these data. The accuracy of the predictions was assessed by comparison with the forecasts of the original model. The researchers used either population models or subjective judgement to make their predictions. Predictions made using models were only slightly more accurate than subjective judgements of risk. However, predictions using models tended to be unbiased, while subjective judgements were biased towards over-estimation. Psychology literature suggests that the bias of subjective judgements is likely to vary somewhat unpredictably among people, depending on their stake in the outcome. This will make subjective predictions more uncertain and less transparent than those based on models.
Geng, Menghan; Qi, Hongjuan; Liu, Xuelin; Gao, Bo; Yang, Zhan; Lu, Wei; Sun, Rubao
2016-05-01
The potential contaminations of 16 trace elements (Cr, Mn, Ni, Cu, Zn, As, Cd, Sb, Ba, Pb, Co, Be, V, Ti, Tl, Al) in drinking water collected in two remote areas in China were analyzed. The average levels of the trace elements were lower than the allowable concentrations set by national agencies, except for several elements (As, Sb, Mn, and Be) in individual samples. A health risk assessment model was conducted and carcinogenic and non-carcinogenic risks were evaluated separately. The results indicated that the total carcinogenic risks were higher than the maximum allowed risk level set by most organizations (1 × 10⁻⁶). Residents in both study areas were at risk of carcinogenic effects from exposure to Cr, which accounted for 80-90% of the total carcinogenic risks. The non-carcinogenic risks (Cu, Zn, Ni) were lower than the maximum allowance levels. Among the four population groups, infants incurred the highest health risks and required special attention. Correlation analysis revealed significant positive associations among most trace elements, indicating the likelihood of a common source. The results of probabilistic health risk assessment of Cr based on Monte-Carlo simulation revealed that the uncertainty of system parameters does not affect the decision making of pollution prevention and control. Sensitivity analysis revealed that ingestion rate of water and concentration of Cr showed relatively high sensitivity to the health risks.
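A compact sketch of the standard ingestion-pathway calculation behind such assessments (chronic daily intake, cancer risk, and hazard quotient) with Monte Carlo sampling of exposure inputs; the concentration distribution, ingestion rate, slope factor, and reference dose below are assumptions for illustration, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Illustrative exposure parameters for drinking-water ingestion by an adult;
# the distributions and toxicity values are assumed, not taken from the study.
conc_cr = rng.lognormal(np.log(0.01), 0.6, n)   # Cr concentration, mg/L
ir = rng.normal(2.0, 0.5, n).clip(0.5, 4.0)     # ingestion rate, L/day
ef, ed, bw = 365, 30, 70                        # days/yr, years, kg
at_cancer = 70 * 365                            # averaging time for carcinogens, days
sf_cr = 0.5                                     # oral slope factor, (mg/kg-day)^-1 (assumed)
rfd_cr = 3e-3                                   # oral reference dose, mg/kg-day (assumed)

cdi = conc_cr * ir * ef * ed / (bw * at_cancer)     # chronic daily intake, mg/kg-day
cancer_risk = cdi * sf_cr                            # incremental lifetime cancer risk
hq = (conc_cr * ir / bw) / rfd_cr                    # non-carcinogenic hazard quotient (EF = 365 d/yr)

print(f"mean cancer risk: {cancer_risk.mean():.2e}")
print(f"P(risk > 1e-6):  {(cancer_risk > 1e-6).mean():.2%}")
print(f"P(HQ > 1):       {(hq > 1).mean():.2%}")
```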
Toxicological Risks During Human Space Exploration
NASA Technical Reports Server (NTRS)
James, John T.; Limero, T. F.; Lam, C. W.; Billica, Roger (Technical Monitor)
2000-01-01
The goal of toxicological risk assessment of human space flight is to identify and quantify significant risks to astronaut health from air pollution inside the vehicle or habitat, and to develop a strategy for control of those risks. The approach to completing a toxicological risk assessment involves data and experience on the frequency and severity of toxicological incidents that have occurred during space flight. Control of these incidents depends on being able to understand their cause from in-flight and ground-based analysis of air samples, crew reports of air quality, and known failures in containment of toxic chemicals. Toxicological risk assessment in exploration missions must be based on an evaluation of the unique toxic hazards presented by the habitat location. For example, lunar and Martian dust must be toxicologically evaluated to determine the appropriate control measures for exploration missions. Experience with near-earth flights has shown that the toxic products from fires present the highest risk to crew health from air pollution. Systems and payload leaks also present a significant hazard. The health risk from toxicity associated with materials offgassing or accumulation of human metabolites is generally well controlled. Early tests of lunar and Martian dust simulants have shown that each possesses the potential to cause fibrosis in the lung in a murine model. Toxicological risks from air pollutants in space habitats originate from many sources. A number of risks have been identified through near-earth operations; however, the evaluation of additional new risks present during exploration missions will be a challenge.
Pouillot, Régis; Gallagher, Daniel; Tang, Jia; Hoelzer, Karin; Kause, Janell; Dennis, Sherri B
2015-01-01
The Interagency Risk Assessment-Listeria monocytogenes (Lm) in Retail Delicatessens provides a scientific assessment of the risk of listeriosis associated with the consumption of ready-to-eat (RTE) foods commonly prepared and sold in the delicatessen (deli) of a retail food store. The quantitative risk assessment (QRA) model simulates the behavior of retail employees in a deli department and tracks the Lm potentially present in this environment and in the food. Bacterial growth, bacterial inactivation (following washing and sanitizing actions), and cross-contamination (from object to object, from food to object, or from object to food) are evaluated through a discrete event modeling approach. The QRA evaluates the risk per serving of deli-prepared RTE food for the susceptible and general population, using a dose-response model from the literature. This QRA considers six separate retail baseline conditions and provides information on the predicted risk of listeriosis for each. Among the baseline conditions considered, the model predicts that (i) retail delis without an environmental source of Lm (such as niches), retail delis without niches that do apply temperature control, and retail delis with niches that do apply temperature control lead to lower predicted risk of listeriosis relative to retail delis with niches and (ii) retail delis with incoming RTE foods that are contaminated with Lm lead to higher predicted risk of listeriosis, directly or through cross-contamination, whether the contaminated incoming product supports growth or not. The risk assessment predicts that listeriosis cases associated with retail delicatessens result from a sequence of key events: (i) the contaminated RTE food supports Lm growth; (ii) improper retail and/or consumer storage temperature or handling results in the growth of Lm on the RTE food; and (iii) the consumer of this RTE food is susceptible to listeriosis. The risk assessment model, therefore, predicts that cross-contamination with Lm at retail predominantly results in sporadic cases.
Flores-Alsina, Xavier; Comas, Joaquim; Rodriguez-Roda, Ignasi; Gernaey, Krist V; Rosen, Christian
2009-10-01
The main objective of this paper is to demonstrate how including the occurrence of filamentous bulking sludge in a secondary clarifier model will affect the predicted process performance during the simulation of WWTPs. The IWA Benchmark Simulation Model No. 2 (BSM2) is hereby used as a simulation case study. Practically, the proposed approach includes a risk assessment model based on a knowledge-based decision tree to detect favourable conditions for the development of filamentous bulking sludge. Once such conditions are detected, the settling characteristics of the secondary clarifier model are automatically changed during the simulation by modifying the settling model parameters to mimic the effect of growth of filamentous bacteria. The simulation results demonstrate that including effects of filamentous bulking in the secondary clarifier model results in a more realistic plant performance. Particularly, during the periods when the conditions for the development of filamentous bulking sludge are favourable (leading to poor activated sludge compaction, low return and waste TSS concentrations, and difficulties in maintaining the biomass in the aeration basins), a subsequent reduction in overall pollution removal efficiency is observed. Also, a scenario analysis is conducted to examine (i) the influence of sludge retention time (SRT), the external recirculation flow rate (Qr) and the air flow rate in the bioreactor (modelled as kLa) as factors promoting bulking sludge, and (ii) the effect on the model predictions when the settling properties are changed due to a possible proliferation of filamentous microorganisms. Finally, the potentially adverse effects of certain operational procedures are highlighted, since such effects are normally not considered by state-of-the-art models that do not include microbiology-related solids separation problems.
Simulated distribution and ecotoxicity-based assessment of chemically-dispersed oil in Tokyo Bay.
Koyama, Jiro; Imakado, Chie; Uno, Seiichi; Kuroda, Takako; Hara, Shouichi; Majima, Takahiro; Shirota, Hideyuki; Añasco, Nathaniel C
2014-08-30
To assess the risks of chemically-dispersed oil to marine organisms, oil concentrations in the water were simulated for a hypothetical spill accident in Tokyo Bay. Simulated oil concentrations were then compared with the short-term no-observed effect concentration (NOEC), 0.01 mg/L, obtained through toxicity tests using marine diatoms, amphipods and fish. Areas with oil concentrations higher than the NOEC were compared with respect to use and non-use of dispersant. Results of the simulation show relatively faster dispersion near the mouth of the bay compared to its inner sections, which is basically related to the stronger water currents there. Interestingly, in the inner bay, a large area of chemically-dispersed oil has concentrations higher than the NOEC. It appears that emulsifying the oil with dispersant increases oil concentrations, which could lead to higher toxicity to aquatic organisms. When stronger winds occur, however, the difference in toxic areas between use and non-use of dispersant is quite small. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lien, F. S.; Ji, H.; Yee, E.
Early experimental work, conducted at Defence R&D Canada — Suffield, measured and characterized the personal and environmental contamination associated with the simulated opening of anthrax-tainted letters under a number of different scenarios. A better understanding of the physical and biological processes involved is of considerable significance for detecting, assessing, and formulating potential mitigation strategies to manage these risks. These preliminary experimental investigations have been extended to simulate the contamination from the opening of anthrax-tainted letters in an open-office environment using Computational Fluid Dynamics (CFD). Bacillus globigii (BG) was used as a biological simulant for anthrax, with 0.1 gram of the simulant released from opened letters in the experiments conducted. The accuracy of the model for prediction of the spatial distribution of BG spores in the office is first assessed quantitatively by comparison with measured SF6 concentrations (the baseline experiment), and then qualitatively by comparison with measured BG concentrations obtained under a number of scenarios, some involving people moving within various offices.
Objective assessment of the effects of texting while driving: a simulator study.
Bendak, Salaheddine
2015-01-01
Recent advances in electronic communication technology have led to many drivers opting to send and receive text messages while driving. This, inevitably, has the potential to distract drivers, impair driving performance and lead to crashes. This study aims to assess the risk involved in texting while driving by assessing the distraction caused and determining the change in key driving performance indicators. Twenty-one paid young male volunteers were recruited to participate in this study. Each participant drove a driving simulator in four different scenarios involving driving while texting and without texting on highways and town roads. Results showed that texting while driving led, on average, to five times more crashes than driving without texting. Because of this distraction, participants also crossed lane and road boundaries unnecessarily more often while texting than when driving without texting. Moreover, texting led participants to take their eyes off the road, on average, 15 more times per session than when driving without texting. Results demonstrated a high level of distraction and clear impairment in drivers' ability to drive safely when texting. Based on the results, practical recommendations to combat this phenomenon are given.
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Huang, Guo H.
2011-12-01
Groundwater pollution has attracted increasing attention over the past decades. Assessing groundwater contamination risk is desirable to provide a sound basis for risk-based management decisions. Therefore, the objective of this study is to develop an integrated fuzzy stochastic approach to evaluate risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and a fuzzy-sets approach to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site within a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to provide support in identifying proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Astrophysics Data System (ADS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
2010-01-01
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
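The occurrence part of such a model can be illustrated with a thinning (Lewis-Shedler) simulation of a non-homogeneous Poisson process; the intensity function below is a made-up stand-in for the solar-cycle-dependent rate fitted to the historical proton database.

```python
import numpy as np

rng = np.random.default_rng(11)

def intensity(t_years):
    """Assumed SPE rate (events/year) varying over an 11-year solar cycle."""
    return 2.0 + 8.0 * np.sin(np.pi * (t_years % 11.0) / 11.0) ** 2

def simulate_nhpp(t_start, t_end, lam, lam_max):
    """Thinning (Lewis-Shedler) simulation of a non-homogeneous Poisson process."""
    times, t = [], t_start
    while True:
        t += rng.exponential(1.0 / lam_max)   # candidate event from a homogeneous process
        if t > t_end:
            return np.array(times)
        if rng.random() < lam(t) / lam_max:    # accept with probability lam(t)/lam_max
            times.append(t)

# Distribution of SPE counts for a hypothetical 3-year mission starting at year 2 of the cycle
counts = [len(simulate_nhpp(2.0, 5.0, intensity, lam_max=10.0)) for _ in range(5_000)]
print(f"mean SPEs per mission: {np.mean(counts):.1f}, "
      f"5th-95th percentile: {np.percentile(counts, [5, 95])}")
```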
Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.
2012-01-01
Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680
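The individual-based exposure logic can be sketched as follows; the prey and sediment PAH concentrations, intake rates, body masses, and the toxicity reference value are placeholders, not the PWS data or the USEPA-derived reference values used in the assessment.

```python
import numpy as np

rng = np.random.default_rng(5)
n_birds = 500_000   # simulated individuals (far more than the real population)

# Assumed inputs, for illustration only
prey_pah = rng.lognormal(np.log(5.0), 1.0, n_birds)            # ng PAH/g wet weight in prey
sed_pah = rng.lognormal(np.log(50.0), 1.2, n_birds)            # ng PAH/g in ingested sediment
food_intake = rng.normal(120.0, 20.0, n_birds).clip(60, 200)   # g prey/day
sed_intake = 0.02 * food_intake                                 # incidental sediment ingestion, g/day
body_mass = rng.normal(600.0, 60.0, n_birds).clip(400, 800)    # g

dose = (prey_pah * food_intake + sed_pah * sed_intake) / body_mass   # ng PAH / g bw / day
trv = 20_000.0   # chronic toxicity reference value, ng/g bw/day (assumed)

hq = dose / trv
print(f"maximum individual dose: {dose.max():.1f} ng/g-day "
      f"({trv / dose.max():.0f}x below the TRV)")
print(f"99.9th percentile HQ: {np.percentile(hq, 99.9):.2e}")
```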
Wheelchair pushing and turning: lumbar spine and shoulder loads and recommended limits.
Weston, Eric B; Khan, Safdar N; Marras, William S
2017-12-01
The objective of this study was to determine how simulated manual wheelchair pushing influences biomechanical loading to the lumbar spine and shoulders. Sixty-two subjects performed simulated wheelchair pushing and turning in a laboratory. An electromyography-assisted biomechanical model was used to estimate spinal loads. Moments at the shoulder joint, external hand forces and net turning torque were also assessed. Multiple linear regression techniques were employed to develop biomechanically based wheelchair pushing guidelines relating resultant hand force or net torque to spinal load. Male subjects experienced significantly greater spinal loading (p < 0.01), and spine loads were also increased for wheelchair turning compared to straight wheelchair pushing (p < 0.001). Biomechanically determined maximum acceptable resultant hand forces were 17-18% lower than psychophysically determined limits. We conclude that manual wheelchair pushing and turning can pose biomechanical risk to the lumbar spine and shoulders. Psychophysically determined maximum acceptable push forces do not appear to be protective enough of this biomechanical risk. Practitioner Summary: This laboratory study investigated biomechanical risk to the low back and shoulders during simulated wheelchair pushing. Manual wheelchair pushing posed biomechanical risk to the lumbar spine (in compression and A/P shear) and to the shoulders. Biomechanically determined wheelchair pushing thresholds are presented and are more protective than the closest psychophysically determined equivalents.
Monte Carlo simulation of the risk of contamination of apples with Escherichia coli O157:H7.
Duffy, Siobain; Schaffner, Donald W
2002-10-25
Quantitative descriptions of the frequency and extent of contamination of apple cider with pathogenic bacteria were obtained using literature data and computer simulation. Probability distributions were chosen to describe the risk of apple contamination by each suspected pathway. Tree-picked apples may be contaminated by birds infected with Escherichia coli O157:H7 when orchards were located near a sewage source (ocean or landfill). Dropped apples could become contaminated from either infected animal droppings or from contaminated manure if used as fertilizer. A risk assessment model was created in Analytica. The results of worst-case simulations revealed that 6-9 log CFU E. coli O157:H7 might be found on a harvest of 1000 dropped apples, while 3-4 log CFU contamination could be present on 1000 tree-picked apples. This model confirms that practices such as using dropped apples and using animal waste as fertilizer increase risk in the production of apple cider, and that pasteurization may not eliminate all contamination in juice from heavily contaminated fruit. Recently published FDA regulations for juices requiring a 5-log CFU/ml reduction of pathogenic bacteria in fresh juices should be a fail-safe measure for apples harvested in all but the worst-case scenarios.
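A stripped-down version of such a contamination simulation might look like the sketch below; the prevalence and per-apple contamination distributions are invented for illustration and are not the literature-derived inputs of the Analytica model.

```python
import numpy as np

rng = np.random.default_rng(9)
n_sim, n_apples = 10_000, 1_000

# Assumed inputs: prevalence of contaminated dropped apples and the contamination
# level per contaminated apple (log10 CFU)
prevalence = rng.beta(2, 50, size=n_sim)                      # fraction of apples contaminated
total_cfu = np.zeros(n_sim)
for i in range(n_sim):
    n_contam = rng.binomial(n_apples, prevalence[i])
    if n_contam:
        log10_cfu = rng.normal(2.0, 1.0, size=n_contam)       # CFU per contaminated apple
        total_cfu[i] = (10 ** log10_cfu).sum()

log_load = np.log10(total_cfu[total_cfu > 0])
print(f"harvest load (log10 CFU per 1000 apples): median {np.median(log_load):.1f}, "
      f"95th percentile {np.percentile(log_load, 95):.1f}")
```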
Toth, Linda A; Trammell, Rita A; Liberati, Teresa; Verhulst, Steve; Hart, Marcia L; Moskowitz, Jacob E; Franklin, Craig
2017-01-01
Shift work (SW) is viewed as a risk factor for the development of many serious health conditions, yet prospective studies that document such risks are rare. The current study addressed this void by testing the hypothesis that long-term exposure to repeated diurnal phase shifts, mimicking SW, will accelerate disease onset or death in inbred mice with genetic risk of developing cancer, diabetes, or autoimmune disease. The data indicate that 1) life-long exposure to simulated SW accelerates death in female cancer-prone AKR/J mice; 2) a significant proportion of male NON/ShiLtJ mice, which have impaired glucose tolerance but do not normally progress to type 2 diabetes, develop hyperglycemia, consistent with diabetes (that is, blood glucose 250 mg/dL or greater) after exposure to simulated SW for 8 wk; and 3) MRL/MpJ mice, which are prone to develop autoimmune disease, showed sex-related acceleration of disease development when exposed to SW as compared with mice maintained on a stable photocycle. Thus, long-term exposure to diurnal phase shifts that mimic SW reduces health or longevity in a wide variety of disease models. Our approach provides a simple way to assess the effect of chronic diurnal disruption in disease development in at-risk genotypes. PMID:28381312
Mid-term financial impact of animal welfare improvements in Dutch broiler production.
Gocsik, E; Lansink, A G J M Oude; Saatkamp, H W
2013-12-01
This study used a stochastic bioeconomic simulation model to simulate the business and financial risk of different broiler production systems over a 5-yr period. Simulation analysis was conducted using the @Risk add-in in MS Excel. To compare the impact of different production systems on economic feasibility, 2 cases were considered. The first case focused on the economic feasibility of a completely new system, whereas the second examined economic feasibilities when a farm switches from a conventional to an animal welfare-improving production system. A sensitivity analysis was conducted to assess the key drivers of economic feasibility and to reveal systematic differences across production systems. The study shows that economic feasibility of systems with improved animal welfare predominantly depends on the price that farmers receive. Moreover, the study demonstrates the importance of the level and variation of the price premium for improved welfare, particularly in the first 5 yr after conversion. The economic feasibility of the production system increases with the level of welfare improvements for a sufficiently high price level for broiler meat and low volatility in producer prices. If this is not the case, however, risk attitudes of farmers become important as well as the use of potential risk management instruments.
Contribution of future urbanisation expansion to flood risk changes
NASA Astrophysics Data System (ADS)
Bruwier, Martin; Mustafa, Ahmed; Archambeau, Pierre; Erpicum, Sébastien; Pirotton, Michel; Teller, Jacques; Dewals, Benjamin
2016-04-01
The flood risk is expected to increase in the future due to climate change and urban development. Climate change modifies flood hazard and urban development influences exposure and vulnerability to floods. While the influence of climate change on flood risk has been studied widely, the impact of urban development also needs to be considered in a sustainable flood risk management approach. The main goal of this study is the determination of the sensitivity of future flood risk to different urban development scenarios at a relatively short time horizon in the River Meuse basin in Wallonia (Belgium). From the different scenarios, the expected impact of urban development on flood risk is assessed. Three urban expansion scenarios are developed up to 2030 based on a coupled cellular automata (CA) and agent-based (AB) urban expansion model: (i) business-as-usual, (ii) restrictive and (iii) extreme expansion scenarios. The main factor controlling these scenarios is the future urban land demand. Each urban expansion scenario is developed either with or without high and/or medium flood hazard zones as a constraint for urban development. To assess the model's performance, it is calibrated for the Meuse River valley (Belgium) to simulate urban expansion between 1990 and 2000. Calibration results are then assessed by comparing the 2000 simulated land-use map and the actual 2000 land-use map. The flood damage estimation for each urban expansion scenario is determined for five flood discharges by overlaying the inundation map resulting from a hydraulic computation and the urban expansion map and by using damage curves and specific prices. The hydraulic model Wolf2D has been extensively validated by comparisons between observations and computational results during flood events. This study focuses only on movable and immovable property values for urban land, which are associated with the most severe damage caused by floods along the River Meuse. The findings of this study offer tools to steer urban expansion under different policy visions in order to mitigate future flood risk along the Meuse River. In particular, we assess the impacts on future flood risk of the prohibition of urban development in high and/or medium flood hazard zones. Acknowledgements: The research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation.
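The damage-estimation step, overlaying an inundation depth grid with an urban land-use grid and applying a depth-damage curve and unit prices, can be sketched as below; the grids, curve, and unit value are synthetic placeholders rather than Wolf2D outputs or Meuse-specific damage functions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic inundation depth (m) per grid cell and urbanised cells for one scenario
depth = np.clip(rng.normal(0.5, 0.8, size=(100, 100)), 0, None)
urban = rng.random((100, 100)) < 0.2

# Illustrative depth-damage curve: damage fraction vs water depth (m)
curve_depth = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
curve_fraction = np.array([0.0, 0.15, 0.4, 0.75, 1.0])
unit_value = 1.5e6   # EUR of exposed property value per urban cell (assumed)

damage_fraction = np.interp(depth, curve_depth, curve_fraction)
total_damage = (damage_fraction * unit_value * urban).sum()
print(f"estimated flood damage for this scenario: {total_damage / 1e6:.1f} MEUR")
```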
Multi-hazard risk analysis using the FP7 RASOR Platform
NASA Astrophysics Data System (ADS)
Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew
2014-10-01
Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.
NASA Astrophysics Data System (ADS)
Jing, Wenjun; Zhao, Yan
2018-02-01
Stability is an important part of geotechnical engineering research. Operating experience with underground storage caverns in salt rock around the world shows that cavern stability is the key problem for safe operation. Currently, a combination of theoretical analysis and numerical simulation is the most commonly adopted method for cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, the definition and classification of cavern instability risk are proposed, and the damage mechanism is analyzed from a mechanical perspective. Then the main stability evaluation indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of cavern instability risk after 30 years of operation in a proposed storage cavern group in the Huai’an salt mine. This research can provide a useful theoretical basis for the safe operation and management of underground storage caverns in salt rock.
Decompression scenarios in a new underground transportation system.
Vernez, D
2000-10-01
The risks of a public exposure to a sudden decompression, until now, have been related to civil aviation and, at a lesser extent, to diving activities. However, engineers are currently planning the use of low pressure environments for underground transportation. This method has been proposed for the future Swissmetro, a high-speed underground train designed for inter-urban linking in Switzerland. The use of a low pressure environment in an underground public transportation system must be considered carefully regarding the decompression risks. Indeed, due to the enclosed environment, both decompression kinetics and safety measures may differ from aviation decompression cases. A theoretical study of decompression risks has been conducted at an early stage of the Swissmetro project. A three-compartment theoretical model, based on the physics of fluids, has been implemented with flow processing software (Ithink 5.0). Simulations have been conducted in order to analyze "decompression scenarios" for a wide range of parameters, relevant in the context of the Swissmetro main study. Simulation results cover a wide range from slow to explosive decompression, depending on the simulation parameters. Not surprisingly, the leaking orifice area has a tremendous impact on barotraumatic effects, while the tunnel pressure may significantly affect both hypoxic and barotraumatic effects. Calculations have also shown that reducing the free space around the vehicle may mitigate significantly an accidental decompression. Numeric simulations are relevant to assess decompression risks in the future Swissmetro system. The decompression model has proven to be useful in assisting both design choices and safety management.
Designs for Risk Evaluation and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy's National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user's manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
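The simulated-annealing search over monitoring schemes can be illustrated with a toy version in which scenario-by-location detection times stand in for the user-provided leakage simulations; the problem size, objective cap, and cooling schedule are arbitrary and do not reflect DREAM's actual inputs or implementation.

```python
import math
import random

random.seed(0)

# Toy problem: detection_time[s][w] is the time at which leak scenario s would be
# detected by a sensor at candidate location w (inf if never detected). In DREAM
# these values come from subsurface leakage simulations; here they are random.
n_scenarios, n_locations, budget = 50, 30, 4
detection_time = [[random.uniform(1, 100) if random.random() < 0.4 else math.inf
                   for _ in range(n_locations)] for _ in range(n_scenarios)]

def objective(scheme):
    """Mean time to first detection over all leak scenarios (undetected capped at 200)."""
    return sum(min(min(detection_time[s][w] for w in scheme), 200.0)
               for s in range(n_scenarios)) / n_scenarios

current = random.sample(range(n_locations), budget)
best, best_val = list(current), objective(current)
temperature = 50.0
for step in range(5_000):
    # Mutate the scheme by swapping one monitoring location for an unused one
    candidate = list(current)
    candidate[random.randrange(budget)] = random.choice(
        [w for w in range(n_locations) if w not in current])
    delta = objective(candidate) - objective(current)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if objective(current) < best_val:
            best, best_val = list(current), objective(current)
    temperature *= 0.999   # geometric cooling

print("best monitoring locations:", sorted(best),
      "| mean time to first detection:", round(best_val, 1))
```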
Zhang, Ying; Liu, Yuanyuan; Niu, Zhiguang; Jin, Shaopei
2017-05-01
To estimate the ecological risk of a toxic organic pollutant (formaldehyde) and heavy metals (mercury (Hg), arsenic (As), cadmium (Cd), and chromium (Cr)) in water and sediment from a landscape lake in Tianjin City, an ecological risk assessment was performed. The risk quotient (RQ) method and the AQUATOX model were used to assess the ecological risk of formaldehyde in landscape water. Meanwhile, the RQ method and the potential ecological risk index method were used to assess the ecological risk of four heavy metals in water and sediment from the studied landscape lake, respectively. The results revealed that the maximum concentration of formaldehyde in landscape water was lower than the environmental quality standards of surface water in China. The maximum simulated concentrations of formaldehyde in phytoplankton and invertebrates were 3.15 and 22.91 μg/L, respectively, which were far less than its toxicity data values (1000 and 510 μg/L, respectively), suggesting that formaldehyde in landscape water was at a safe level for aquatic organisms. The RQ model indicated that the risks posed by Hg and Cd in landscape water were higher for phytoplankton and invertebrates than for fish, and the risks from As and Cr were acceptable for all test organisms. Cd is the most important pollution factor among all heavy metals in sediment from the studied landscape lake, and the pollution factor sequence of heavy metals was Hg > As > Cr > Cd. The risk index (RI) values for the four heavy metals in samples a and b were 43.48 and 72.66, respectively, which were much lower than the threshold value (150), suggesting that the ecological risk posed by heavy metals in sediment was negligible.
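The potential ecological risk index referred to here follows the usual Hakanson formulation (single-metal risk factor = toxic response factor times the ratio of measured to background concentration, summed into RI); the sketch below uses the commonly cited toxic response factors but invented concentrations and backgrounds, not the Tianjin samples.

```python
# Hakanson potential ecological risk index for heavy metals in sediment.
# Measured and background concentrations are placeholders; the toxic response
# factors (Tr) are the commonly used Hakanson values.
toxic_response = {"Hg": 40, "Cd": 30, "As": 10, "Cr": 2}
background = {"Hg": 0.05, "Cd": 0.3, "As": 10.0, "Cr": 60.0}     # mg/kg (assumed)
measured = {"Hg": 0.12, "Cd": 0.5, "As": 12.0, "Cr": 75.0}       # mg/kg (assumed sample)

# Single-metal potential ecological risk factor E_r = Tr * (C_measured / C_background)
er = {m: toxic_response[m] * measured[m] / background[m] for m in measured}
ri = sum(er.values())   # RI is the sum over metals

for m, e in er.items():
    print(f"E_r({m}) = {e:.1f}")
print(f"RI = {ri:.1f}  ->  {'low ecological risk' if ri < 150 else 'elevated ecological risk'}")
```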
Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions
NASA Astrophysics Data System (ADS)
Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.
2015-07-01
The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is providing opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-variable shoreline risk levels from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area (the Portuguese continental shelf) with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the amount of virtually spilled oil reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time. Shoreline risks can be computed in real-time or from previously obtained data. Results show that the proposed methodology estimates risk in a way that is properly sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic + oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances the maritime situational awareness and the decision support model, providing a more realistic approach in the assessment of shoreline impacts. Risk assessment from historical data can help identify typical risk patterns and "hot spots" or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used to prioritize individual ships, geographical areas, strategic tug positioning and the implementation of dynamic risk-based vessel traffic monitoring.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallegos, A.F.; Gonzales, G.J.; Bennett, K.D.
The Record of Decision on the Dual Axis Radiographic Hydrodynamic Test Facility at the Los Alamos National Laboratory requires that the Department of Energy take special precautions to protect the Mexican Spotted Owl (Strix occidentalis lucida). In order to do so, risk to the owl presented by radiological and nonradiological contaminants must be estimated. A preliminary risk assessment on the Mexican Spotted Owl in two Ecological Exposure Units (EEUs) was performed using a modified Environmental Protection Agency Quotient method, the FORTRAN model ECORSK4, and a geographic information system. Estimated doses to the owl under a spatially-weighted foraging regime were compared against toxicological reference doses, generating hazard indices (HIs) and hazard quotients (HQs) for three risk source types. The average HI was 0.20 for EEU-21 and 0.0015 for EEU-40. Under the risk parameter assumptions made, hazard quotient results indicated no unacceptable risk to the owl, including a measure of cumulative effects from multiple contaminants that assumes a linear additive toxicity type. An HI of 1.0 was used as the evaluative criterion for determining the acceptability of risk. This value was exceeded (1.06) in only one of 200 simulated potential nest sites. Cesium-137, Ni, 239Pu, Al and 234U were among the constituents with the highest partial HQs. Improving model realism by weighting simulated owl foraging based on distance from potential nest sites decreased the estimated risk by 72% (0.5 HI units) for EEU-21 and by 97.6% (6.3E-02 HI units) for EEU-40. Information on risk by specific geographical location was generated, which can be used to manage contaminated areas, owl habitat, facility siting, and/or facility operations in order to maintain risk from contaminants at acceptably low levels.
The U.S. EPA must consider thousands of chemicals when allocating resources to assess risk in human populations and the environment. High-throughput screening assays to characterize biological activity in vitro are being implemented in the ToxCast™ program to rapidly characteri...
Population structure and life history strategies are determinants of how populations respond to stressor-induced impairments in organism-level responses, but a consistent and holistic analysis has not been reported. Effects on population growth rate were modeled using seven theor...
SIMULATING METABOLISM OF XENOBIOTIC CHEMICALS AS A PREDICTOR OF TOXICITY
EPA is faced with long lists of chemicals that need to be assessed for hazard. A major gap in evaluating chemical risk is accounting for metabolic activation resulting in increased toxicity. The goals of this project are to develop a capability to forecast the metabolism of xenob...
Simulation of Longitudinal Exposure Data with Variance-Covariance Structures Based on Mixed Models
Longitudinal data are important in exposure and risk assessments, especially for pollutants with long half-lives in the human body and where chronic exposures to current levels in the environment raise concerns for human health effects. It is usually difficult and expensive to ob...
Air quality (AQ) simulation models provide a basis for implementing the National Ambient Air Quality Standards (NAAQS) and are a tool for performing risk-based assessments and for developing environmental management strategies. Fine particulate matter (PM 2.5), its constituent...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh; Veeraraghavan, Swetha; Bolisetti, Chandrakanth
MASTODON has the capability to model stochastic nonlinear soil-structure interaction (NLSSI) in a dynamic probabilistic risk assessment framework. The NLSSI simulations include structural dynamics, time integration, dynamic porous media flow, nonlinear hysteretic soil constitutive models, and geometric nonlinearities (gapping, sliding, and uplift). MASTODON is also the MOOSE-based master application for dynamic PRA of external hazards.
NASA Astrophysics Data System (ADS)
Kloss, Sebastian; Schuetze, Niels; Schmitz, Gerd H.
2010-05-01
The strong competition for fresh water to meet the increased demand for food worldwide has led to a renewed interest in techniques to improve water use efficiency (WUE), such as controlled deficit irrigation. Furthermore, as the implementation of crop models into complex decision support systems becomes more and more common, it is imperative to reliably predict the WUE as the ratio of yield to water consumption. The objective of this paper is to assess the problems that crop models - FAO-33, DAISY, and APSIM in this study - face when maximizing WUE. We applied these crop models to calculate the risk of yield reduction under different sources of uncertainty (e.g., climate), employing a stochastic framework for decision support in the planning of water supply for irrigation. The stochastic framework consists of: (i) a weather generator for simulating regional impacts of climate change; (ii) a new tailor-made evolutionary optimization algorithm for optimal irrigation scheduling with limited water supply; and (iii) the above-mentioned models for simulating water transport and crop growth in a sound manner. The results present stochastic crop water production functions (SCWPF) for different crops, which can be used as basic tools for assessing the impact of climate variability on the risk to potential yield. Case studies from India, Oman, Malawi, and France are presented to assess the differences in modeling water stress and yield response for the different crop models.
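A stochastic crop water production function can be pictured, in much simplified form, as repeated yield simulations under sampled weather for a range of seasonal water supplies. The sketch below uses a toy water-limited yield response in place of FAO-33/DAISY/APSIM and a trivial random rainfall term, purely to show the shape of the output; all parameters are assumptions.

```python
# Toy stochastic crop water production function (SCWPF): for each irrigation amount,
# simulate yield under many sampled weather realisations and report quantiles.
# The yield response and weather model are deliberate simplifications.
import random

def water_limited_yield(water_mm, rain_mm, y_max=10.0, water_req_mm=600.0):
    """Very simple water-limited yield response (t/ha); not a calibrated crop model."""
    supply = water_mm + rain_mm
    return y_max * min(1.0, supply / water_req_mm)

def scwpf(water_levels_mm, n_realisations=1000, seed=42):
    rng = random.Random(seed)
    curves = {}
    for w in water_levels_mm:
        yields = []
        for _ in range(n_realisations):
            rain = max(0.0, rng.gauss(200.0, 80.0))  # sampled seasonal rainfall (mm)
            yields.append(water_limited_yield(w, rain))
        yields.sort()
        curves[w] = (yields[int(0.1 * n_realisations)],   # 10th percentile (low-yield risk)
                     yields[n_realisations // 2])          # median
    return curves

for w, (p10, p50) in scwpf([100, 200, 300, 400]).items():
    print(f"irrigation {w} mm: 10th pct {p10:.1f} t/ha, median {p50:.1f} t/ha")
```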
Landslide-Generated Tsunami Model for Quick Hazard Assessment
NASA Astrophysics Data System (ADS)
Franz, M.; Rudaz, B.; Locat, J.; Jaboyedoff, M.; Podladchikov, Y.
2015-12-01
Alpine regions are likely to be at risk from landslide-induced tsunamis, because of the proximity between lakes and potential instabilities and because of the concentration of the population in valleys and on lake shores. In particular, dam lakes are often surrounded by steep slopes and frequently affect the stability of their banks. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a 2.5D numerical model which aims to simulate the propagation of the landslide, the generation and propagation of the wave and, finally, the spread on the shores or the associated downstream flow. The process involves three steps. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The proper behavior of our model is demonstrated by (1) numerical tests from Toro (2001), and (2) comparison with a real event where the horizontal run-up distance is known (Nicolet landslide, Quebec, Canada). The model is of particular interest due to its ability to perform quickly the 2.5D geometric model of the landslide, the tsunami simulation and, consequently, the hazard assessment.
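To make the wave-propagation step concrete, a one-dimensional shallow water solver using the Lax-Friedrichs scheme can be written in a few lines. The dam-break setup below is an assumed toy case on a flat, frictionless bed, not the authors' 2.5D model.

```python
# Minimal 1D shallow water equations solved with the Lax-Friedrichs scheme,
# illustrating the wave-propagation step only (toy dam-break, flat frictionless bed).
import numpy as np

g = 9.81
nx, dx, dt, nsteps = 200, 1.0, 0.05, 200

h = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)   # water depth: dam-break initial state (m)
hu = np.zeros(nx)                                  # discharge per unit width (m^2/s)

def flux(h, hu):
    u = hu / h
    return hu, hu * u + 0.5 * g * h**2

for _ in range(nsteps):
    f1, f2 = flux(h, hu)
    # Lax-Friedrichs update on interior points
    h_new = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (f1[2:] - f1[:-2])
    hu_new = 0.5 * (hu[2:] + hu[:-2]) - dt / (2 * dx) * (f2[2:] - f2[:-2])
    h[1:-1], hu[1:-1] = h_new, hu_new
    h[0], hu[0] = h[1], hu[1]        # simple transmissive boundaries
    h[-1], hu[-1] = h[-2], hu[-2]

print(f"max depth {h.max():.2f} m, min depth {h.min():.2f} m after {nsteps * dt:.0f} s")
```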
Assessing landscape scale wildfire exposure for highly valued resources in a Mediterranean area.
Alcasena, Fermín J; Salis, Michele; Ager, Alan A; Arca, Bachisio; Molina, Domingo; Spano, Donatella
2015-05-01
We used a fire simulation modeling approach to assess landscape scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km² located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. Then, we linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVRs. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus subject to higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and historically based ignition patterns were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning and public safety in the vicinity of HVRs.
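Conceptually, the burn probability for a pixel is the number of simulated fires that burn it divided by the total number of simulated fires, and conditional intensity statistics are taken over the fires that actually reach the pixel. The sketch below assumes the fire footprints are already available as boolean rasters and synthesizes them randomly; it is a post-processing illustration, not FlamMap or the minimum travel time spread model itself.

```python
# Conceptual burn probability / conditional intensity from a stack of simulated fires.
# The "simulation outputs" below are random stand-ins, used only to show the aggregation.
import numpy as np

rng = np.random.default_rng(0)
n_fires, ny, nx = 1000, 50, 50

burned = rng.random((n_fires, ny, nx)) < 0.05          # True where a fire burned the pixel
flame_len = rng.gamma(2.0, 1.0, (n_fires, ny, nx))     # flame length (m), synthetic

burn_probability = burned.mean(axis=0)                  # fraction of fires burning each pixel

# Conditional flame length: mean over the fires that actually burned the pixel
cond_flame = np.where(burned, flame_len, np.nan)
conditional_flame_length = np.nanmean(cond_flame, axis=0)

print("max burn probability:", burn_probability.max())
print("mean conditional flame length (m):", round(float(np.nanmean(conditional_flame_length)), 2))
```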
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
Regional Scale Simulations of Nitrate Leaching through Agricultural Soils of California
NASA Astrophysics Data System (ADS)
Diamantopoulos, E.; Walkinshaw, M.; O'Geen, A. T.; Harter, T.
2016-12-01
Nitrate is recognized as one of California's most widespread groundwater contaminants. As opposed to point sources, which are relatively easily identifiable sources of contamination, non-point sources of nitrate are diffuse and linked with the widespread use of fertilizers in agricultural soils. California's agricultural regions have a remarkable diversity of soils encompassing a wide range of properties. This complicates studies dealing with nitrate risk assessment, since important biological and physicochemical processes occur in the first meters of the vadose zone. The objective of this study is to evaluate all agricultural soils in California according to their potential for nitrate leaching based on numerical simulations using the Richards equation. We conducted simulations for 6,000 unique soil profiles (over 22,000 soil horizons) taking into account the effects of climate, crop type, irrigation and fertilization management scenarios. The final goal of this study is to evaluate simple management methods in terms of reduced nitrate leaching. We estimated drainage rates of water under the root zone and nitrate concentrations in the drain water at the regional scale. We present maps for all agricultural soils in California which can be used for risk assessment studies. Finally, our results indicate that adoption of simple irrigation and fertilization methods may significantly reduce nitrate leaching in vulnerable regions.
PTSD, Acute Stress, Performance and Decision-Making in Emergency Service Workers.
Regehr, Cheryl; LeBlanc, Vicki R
2017-06-01
Despite research identifying high levels of stress and traumatic stress symptoms among those in the emergency services, the impact of these symptoms on performance and hence public safety remains uncertain. This review paper discusses a program of research that has examined the effects of prior critical incident exposure, acute stress, and current post-traumatic symptoms on the performance and decision-making during an acutely stressful event among police officers, police communicators, paramedics and child protection workers. Four studies, using simulation methods involving video simulators, human-patient simulators, and/or standardized patients, examined the performance of emergency workers in typical workplace situations related to their individual profession. Results varied according to level of acuity of stress and the nature of performance and decision-making. There was no evidence that PTSD had a direct impact on global performance on tasks for which emergency responders are highly trained. However, PTSD was associated with assessment of risk in situations that required professional judgement. Further, individuals experiencing PTSD symptoms reported higher levels of acute stress when faced with high acuity situations. Acute stress in these studies was associated with performance deficits on complex cognitive tasks, verbal memory impairment and heightened assessment of risk. © 2017 American Academy of Psychiatry and the Law.
Pisani, J.M.; Grant, W.E.; Mora, M.A.
2008-01-01
We present a simulation model for risk assessment of the impact of insecticide inhibitors of cholinesterase (ChE) applied in irrigated agricultural fields on non-target wildlife. The model, which we developed as a compartment model based on difference equations (Δt = 1 h), consists of six submodels describing the dynamics of (1) insecticide application, (2) insecticide movement into floodable soil, (3) irrigation and rain, (4) insecticide dissolution in water, (5) foraging and insecticide intake from water, and (6) ChE inhibition and recovery. To demonstrate application of the model, we simulated historical and "worst-case" scenarios of the impact of ChE-inhibiting insecticides on white-winged doves (Zenaida asiatica) inhabiting natural brushland adjacent to cotton and sugarcane fields in the Lower Rio Grande Valley of Texas, USA. Only when a rain event occurred just after insecticide application did predicted levels of ChE inhibition surpass the diagnostic level of 20% exposure. The present model should aid in assessing the effect of ChE-inhibiting insecticides on ChE activity of different species that drink contaminated water from irrigated agricultural fields, and in identifying specific situations in which the juxtaposition of environmental conditions and management schemes could result in a high risk to non-target wildlife. © 2007 Elsevier B.V. All rights reserved.
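The inhibition-and-recovery submodel can, in stripped-down form, be written as an hourly difference equation in which ChE activity is depressed in proportion to the insecticide dose ingested that hour and recovers toward baseline at a first-order rate. The rate constants and exposure values below are invented for illustration and are not the published parameterisation.

```python
# Stripped-down hourly difference-equation sketch of ChE inhibition and recovery
# (one of the six submodels, greatly simplified). Rate constants and doses are invented.

def simulate_che(doses_mg_per_h, k_inhibit=0.8, k_recover=0.02, hours=None):
    """Return relative ChE activity (1.0 = baseline) for each hour.

    activity[t+1] = activity[t] + k_recover*(1 - activity[t]) - k_inhibit*dose[t]*activity[t]
    """
    hours = hours or len(doses_mg_per_h)
    activity = [1.0]
    for t in range(hours):
        dose = doses_mg_per_h[t] if t < len(doses_mg_per_h) else 0.0
        a = activity[-1]
        a_next = a + k_recover * (1.0 - a) - k_inhibit * dose * a
        activity.append(max(0.0, a_next))
    return activity

# Example: a dove drinks contaminated water for 3 hours after a rain event (hypothetical doses)
exposure = [0.05, 0.10, 0.05] + [0.0] * 69
act = simulate_che(exposure, hours=72)
worst = min(act)
print(f"max ChE inhibition: {(1 - worst) * 100:.1f}%",
      "(exceeds 20% diagnostic level)" if worst < 0.8 else "(below 20% diagnostic level)")
```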
NASA Astrophysics Data System (ADS)
Rybarski, S.; Pohll, G.; Pohlmann, K.; Plume, R.
2014-12-01
In recent years, hydraulic fracturing (fracking) has become an increasingly popular method for extraction of oil and natural gas from tight formations. Concerns have been raised over a number of environmental risks associated with fracking, including contamination of groundwater by fracking fluids, upwelling of deep subsurface brines, and methane migration. Given the potentially long time scale for contaminant transport associated with hydraulic fracturing, numerical modeling remains the best practice for risk assessment. Oil shale in the Humboldt basin of northeastern Nevada has now become a target for hydraulic fracturing operations. Analysis of regional and shallow groundwater flow is used to assess several potential migration pathways specific to the geology and hydrogeology of this basin. The model domain in all simulations is defined by the geologic structure of the basin as determined by deep oil and gas well bores and formation outcrops. Vertical transport of gaseous methane along a density gradient is simulated in TOUGH2, while fluid transport along faults and/or hydraulic fractures and lateral flow through more permeable units adjacent to the targeted shale are modeled in FEFLOW. Sensitivity analysis considers basin, fault, and hydraulic fracturing parameters, and results highlight key processes that control fracking fluid and methane migration and time scales under which it might occur.
Farnan, Jeanne M; Gaffney, Sean; Poston, Jason T; Slawinski, Kris; Cappaert, Melissa; Kamin, Barry; Arora, Vineet M
2016-03-01
Patient safety curricula in undergraduate medical education (UME) often use a didactic format with little focus on skills training. Despite recent focus on safety, practical training in residency education is also lacking. Assessments of safety skills in UME and graduate medical education (GME) are generally knowledge-focused rather than application-focused. We aimed to develop and pilot a safety-focused simulation with medical students and interns to assess knowledge regarding hazards of hospitalisation. A simulation demonstrating common hospital-based safety threats was designed. A case scenario was created including salient patient information and simulated safety threats such as the use of upper-extremity restraints and medication errors. After entering the room and reviewing the mock chart, learners were timed and asked to identify and document as many safety hazards as possible. Learner satisfaction was assessed using constructed-response evaluation. Descriptive statistics, including per cent correct and mean correct hazards, were performed. All 86 third-year medical students completed the encounter. Some hazards were identified by a majority of students (fall risk, 83% of students) while others were rarely identified (absence of deep venous thrombosis prophylaxis, 13% of students). Only 5% of students correctly identified pressure ulcer risk. 128 of 131 interns representing 49 medical schools participated in the GME implementation. Incoming interns were able to identify a mean of 5.1 hazards out of the 9 displayed (SD 1.4), with 40% identifying restraints as a hazard, and 20% identifying the inappropriate urinary catheter as a hazard. A simulation showcasing safety hazards was a feasible and effective way to introduce trainees to safety-focused content. Both students and interns had difficulty identifying common hazards of hospitalisation. Despite poor performance, learners appreciated the interactive experience and its clinical utility. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.
2016-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs has shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess the risk of meeting or exceeding identified management targets or thresholds. Forecasted risk is evaluated against established risk tolerances to determine reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated by the Sonoma County Water Agency for water supply. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg, approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast produced by the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end-of-water-year (September 30) storage levels over current operations. Model results show no increase in occurrence of flood damages for points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
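In skeletal form, a risk-based release decision compares the forecast probability of exceeding a storage threshold, estimated from an ensemble hindcast, against a fixed risk tolerance, and releases water only when that tolerance is exceeded. The thresholds, tolerances, and synthetic ensemble below are placeholders, not Lake Mendocino's actual rule curve or operating values.

```python
# Skeleton of a risk-based (FIRO-style) release decision from an ensemble forecast.
# Thresholds, tolerances, and the ensemble itself are placeholders for illustration.
import numpy as np

def exceedance_risk(ensemble_peak_storage_af, storage_threshold_af):
    """Fraction of ensemble members that meet or exceed the storage threshold."""
    ensemble = np.asarray(ensemble_peak_storage_af)
    return float((ensemble >= storage_threshold_af).mean())

def release_decision(ensemble_peak_storage_af, storage_threshold_af=111_000,
                     risk_tolerance=0.10, max_release_af_per_day=5_000):
    """Release only as much as needed, up to a maximum, when forecast risk exceeds tolerance."""
    risk = exceedance_risk(ensemble_peak_storage_af, storage_threshold_af)
    if risk <= risk_tolerance:
        return 0.0, risk
    overshoot = np.percentile(ensemble_peak_storage_af, 90) - storage_threshold_af
    return float(min(max(overshoot, 0.0), max_release_af_per_day)), risk

# 61-member synthetic hindcast of forecast peak storage (acre-feet), hypothetical values
rng = np.random.default_rng(1)
ensemble = rng.normal(108_000, 6_000, 61)
release, risk = release_decision(ensemble)
print(f"forecast exceedance risk {risk:.0%}, scheduled release {release:,.0f} acre-ft/day")
```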
Quantitative assessment of changes in landslide risk using a regional scale run-out model
NASA Astrophysics Data System (ADS)
Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone
2015-04-01
The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g., flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based FLO-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depths to a total of 10 separate classes at the regional scale. Debris flow vulnerability curves from the literature and one curve derived specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and number of floors. The risk was calculated by multiplying the vulnerability by the spatial probability and the building values. Changes in landslide risk were assessed using the loss estimation of four different periods: (1) pre-August 2003 disaster, (2) the August 2003 event, (3) post-August 2003 to 2011 and (4) smaller frequent events occurring over the entire 1996-2011 period. One of the major findings of our work was the calculation of a significant decrease in landslide risk after the 2003 disaster compared to the pre-disaster risk period. This indicates the importance of re-estimating risk a few years after a major event in order to avoid overestimation or exaggeration of future losses.
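The core loss calculation described above, risk as the product of spatial (run-out) probability, vulnerability at the modelled intensity, and element-at-risk value, can be summarised per building as in the sketch below. The vulnerability curve, flow depths, and building values are illustrative stand-ins, not the case-study data.

```python
# Per-building debris-flow loss: risk = spatial probability x vulnerability(intensity) x value.
# The vulnerability curve, flow depths, and building values below are illustrative only.

def vulnerability(flow_depth_m):
    """Toy vulnerability curve: damage fraction increasing with flow depth, capped at 1."""
    return min(1.0, 0.25 * flow_depth_m)

def building_value(unit_value_eur_per_m2, footprint_m2, floors):
    """Market value from land-use unit value x footprint x number of floors."""
    return unit_value_eur_per_m2 * footprint_m2 * floors

def expected_loss(spatial_probability, flow_depth_m, value_eur):
    return spatial_probability * vulnerability(flow_depth_m) * value_eur

buildings = [
    # (spatial probability of being reached, modelled flow depth m, EUR/m^2, footprint m^2, floors)
    (0.30, 1.5, 1200, 150, 2),
    (0.05, 0.5, 1500, 200, 3),
]
total = sum(expected_loss(p, d, building_value(v, a, f)) for p, d, v, a, f in buildings)
print(f"total expected loss: {total:,.0f} EUR")
```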
Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie
2016-01-01
This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow seasonal dynamics to be simulated. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated with the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also linked to the exposure scenarios, and 2) quantitative protection goals be set to facilitate the interpretation of model results for risk assessment. © 2015 SETAC.
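The TK-TD coupling can be illustrated with a pair of difference equations: a one-compartment uptake/elimination model for internal concentration, and a logistic concentration-response term that scales biomass growth. All rate constants and parameters below are invented and do not correspond to the calibrated Lemna or Myriophyllum models.

```python
# Illustrative TK-TD coupling: one-compartment toxicokinetics (uptake/elimination)
# driving growth inhibition of biomass. All parameters are invented placeholders.

def tktd_simulation(exposure_ug_per_l, days, dt=0.1,
                    k_uptake=0.5, k_elim=0.2,       # TK rates (1/d)
                    ec50_internal=10.0, slope=2.0,  # TD concentration-response
                    growth_rate=0.2, capacity=100.0):
    c_int, biomass, t = 0.0, 1.0, 0.0
    while t < days:
        c_ext = exposure_ug_per_l(t)
        # toxicokinetics: dC_int/dt = k_uptake*C_ext - k_elim*C_int
        c_int += dt * (k_uptake * c_ext - k_elim * c_int)
        # toxicodynamics: Hill-type inhibition of the relative growth rate
        inhibition = 1.0 / (1.0 + (ec50_internal / max(c_int, 1e-9)) ** slope)
        # logistic biomass growth scaled by (1 - inhibition)
        biomass += dt * growth_rate * (1.0 - inhibition) * biomass * (1.0 - biomass / capacity)
        t += dt
    return c_int, biomass

pulse = lambda t: 20.0 if 10.0 <= t < 15.0 else 0.0   # 5-day exposure pulse (ug/L), hypothetical
_, b_exposed = tktd_simulation(pulse, days=60)
_, b_control = tktd_simulation(lambda t: 0.0, days=60)
print(f"biomass relative to control after 60 d: {b_exposed / b_control:.2f}")
```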
Assessment of active methods for removal of LEO debris
NASA Astrophysics Data System (ADS)
Hakima, Houman; Emami, M. Reza
2018-03-01
This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission-level criteria are then utilized to assess the performance of each redirection method in light of the results obtained from a Monte Carlo simulation. The simulation provides insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytical Hierarchy Process and a utility-based approach. A third assessment technique, namely potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
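A utility-based comparison of removal methods can be reduced to a weighted sum of normalised attribute scores. The attributes, weights, and scores below are placeholders chosen only to show the mechanics, not values derived from the paper's Monte Carlo results.

```python
# Weighted-utility comparison of debris-removal methods. Attribute scores (0-1, higher
# is better) and criterion weights are placeholders, not the paper's derived values.

weights = {"removal_time": 0.30, "robustness": 0.25, "propellant_mass": 0.20, "risk": 0.25}

methods = {
    "net":                   {"removal_time": 0.7, "robustness": 0.6, "propellant_mass": 0.8, "risk": 0.5},
    "laser":                 {"removal_time": 0.4, "robustness": 0.5, "propellant_mass": 0.9, "risk": 0.6},
    "electrodynamic_tether": {"removal_time": 0.5, "robustness": 0.4, "propellant_mass": 0.9, "risk": 0.4},
    "ion_beam_shepherd":     {"removal_time": 0.6, "robustness": 0.7, "propellant_mass": 0.5, "risk": 0.7},
    "robotic_arm":           {"removal_time": 0.8, "robustness": 0.6, "propellant_mass": 0.4, "risk": 0.5},
}

def utility(scores, weights):
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

for name, scores in sorted(methods.items(), key=lambda kv: utility(kv[1], weights), reverse=True):
    print(f"{name:22s} utility = {utility(scores, weights):.2f}")
```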
Gong, Jian; Yang, Jianxin; Tang, Wenwu
2015-11-09
Land use and land cover change is driven by multiple influential factors from environmental and social dimensions in a land system. Land use practices of human decision-makers modify the landscape of the land system, possibly leading to landscape fragmentation, biodiversity loss, or environmental pollution - severe environmental or ecological impacts. While landscape-level ecological risk assessment supports the evaluation of these impacts, investigations on how these ecological risks induced by land use practices change over space and time in response to alternative policy intervention remain inadequate. In this article, we conducted spatially explicit landscape ecological risk analysis in Ezhou City, China. Our study area is a national ecologically representative region experiencing drastic land use and land cover change, and is regulated by multiple policies represented by farmland protection, ecological conservation, and urban development. We employed landscape metrics to consider the influence of potential landscape-level disturbance for the evaluation of landscape ecological risks. Using spatiotemporal simulation, we designed scenarios to examine spatiotemporal patterns in landscape ecological risks in response to policy intervention. Our study demonstrated that spatially explicit landscape ecological risk analysis combined with simulation-driven scenario analysis is of particular importance for guiding the sustainable development of ecologically vulnerable land systems.
Update on simulation-based surgical training and assessment in ophthalmology: a systematic review.
Thomsen, Ann Sofia S; Subhi, Yousif; Kiilgaard, Jens Folke; la Cour, Morten; Konge, Lars
2015-06-01
This study reviews the evidence behind simulation-based surgical training of ophthalmologists to determine (1) the validity of the reported models and (2) the ability to transfer skills to the operating room. Simulation-based training is established widely within ophthalmology, although it often lacks a scientific basis for implementation. We conducted a systematic review of trials involving simulation-based training or assessment of ophthalmic surgical skills among health professionals. The search included 5 databases (PubMed, EMBASE, PsycINFO, Cochrane Library, and Web of Science) and was completed on March 1, 2014. Overall, the included trials were divided into animal, cadaver, inanimate, and virtual-reality models. Risk of bias was assessed using the Cochrane Collaboration's tool. Validity evidence was evaluated using a modern validity framework (Messick's). We screened 1368 reports for eligibility and included 118 trials. The most common surgery simulated was cataract surgery. Most validity trials investigated only 1 or 2 of 5 sources of validity (87%). Only 2 trials (48 participants) investigated transfer of skills to the operating room; 4 trials (65 participants) evaluated the effect of simulation-based training on patient-related outcomes. Because of heterogeneity of the studies, it was not possible to conduct a quantitative analysis. The methodologic rigor of trials investigating simulation-based surgical training in ophthalmology is inadequate. To ensure effective implementation of training models, evidence-based knowledge of validity and efficacy is needed. We provide a useful tool for implementation and evaluation of research in simulation-based training. Copyright © 2015 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Tatham, Andrew J; Boer, Erwin R; Gracitelli, Carolina P B; Rosen, Peter N; Medeiros, Felipe A
2015-05-01
To examine the relationship between Motor Vehicle Collisions (MVCs) in drivers with glaucoma and standard automated perimetry (SAP), Useful Field of View (UFOV), and driving simulator assessment of divided attention. A cross-sectional study of 153 drivers from the Diagnostic Innovations in Glaucoma Study. All subjects had SAP, and divided attention was assessed using UFOV and driving simulation with low-, medium-, and high-contrast peripheral stimuli presented during curve negotiation and car following tasks. Self-reported history of MVCs and average mileage driven were recorded. Eighteen of 153 subjects (11.8%) reported an MVC. There was no difference in visual acuity, but the MVC group was older, drove fewer miles, and had worse binocular SAP sensitivity, contrast sensitivity, and ability to divide attention (UFOV and driving simulation). Low-contrast driving simulator tasks were the best discriminators of MVC (AUC 0.80 for curve negotiation versus 0.69 for binocular SAP and 0.59 for UFOV). Adjusting for confounding factors, longer reaction times to driving simulator divided attention tasks provided additional value compared with SAP and UFOV, with a 1 standard deviation (SD) increase in reaction time (approximately 0.75 s) associated with almost two-fold increased odds of MVC. Reaction times to low-contrast divided attention tasks during driving simulation were significantly associated with history of MVC, performing better than conventional perimetric tests and UFOV. The association between conventional tests of visual function and MVCs in drivers with glaucoma is weak; however, tests of divided attention, particularly using driving simulation, may improve risk assessment.
Skjevrak, Ingun; Brede, Cato; Steffensen, Inger-Lise; Mikalsen, Arne; Alexander, Jan; Fjeldal, Per; Herikstad, Hallgeir
2005-10-01
A procedure used by the Norwegian Food Safety Authority for surveillance of contaminants from plastic food contact materials (polyolefin drinking bottles, water boilers, polyamide cooking utensils and plastic multi-layer materials) is described. It is based on gas chromatographic-mass spectrometric (GC/MS) analysis of food simulants exposed to plastic materials. Most migrants were substances not intentionally added to the plastic (degradation products, impurities) or originated from non-plastic components, such as printing inks, adhesives, non-listed additives, solvents and coatings. Hence, the majority of the identified migrants were regulated by the general statements of the EU Framework Regulation, which specify neither limits nor requirements for risk assessment, rather than by specific migration controls. Risk assessment has been carried out for selected non-authorized substances. The analysis and management of these substances and materials with respect to safety represent a challenge to the food authorities.
A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology
Huang, Y.-N.; Whittaker, A.S.; Luco, N.
2011-01-01
A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
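One way to picture the use of response-based fragility curves with Monte Carlo sampling: draw a component demand consistent with the response-history results, evaluate a lognormal fragility at that demand, and sample the damage state. The demand distribution, fragility median, and dispersion in the sketch are invented placeholders, not values from the proposed procedure.

```python
# Illustrative Monte Carlo damage assessment with a lognormal response-based fragility.
# Demand distribution, fragility median, and dispersion are invented placeholders.
import numpy as np
from math import log
from statistics import NormalDist

def fragility_probability(demand, median_capacity, beta):
    """Lognormal fragility: P(damage | demand)."""
    return NormalDist().cdf(log(demand / median_capacity) / beta)

rng = np.random.default_rng(7)
n_sims = 20_000

# Component demand (e.g. floor spectral acceleration, g): here approximated by a
# lognormal distribution assumed to be fitted to response-history results.
demands = rng.lognormal(mean=np.log(0.6), sigma=0.4, size=n_sims)

# Damage occurs when a uniform draw falls below the fragility at the sampled demand
p_damage_given_demand = np.array([fragility_probability(d, median_capacity=1.0, beta=0.35)
                                  for d in demands])
damaged = rng.random(n_sims) < p_damage_given_demand

print(f"estimated component damage probability: {damaged.mean():.3%}")
```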
Mesh-Based Entry Vehicle and Explosive Debris Re-Contact Probability Modeling
NASA Technical Reports Server (NTRS)
McPherson, Mark A.; Mendeck, Gavin F.
2011-01-01
Quantification of the risk to a crewed vehicle arising from potential re-contact with fragments from an explosive breakup of any jettisoned spacecraft segments during entry has long been sought. However, great difficulty lies in efficiently capturing the potential locations of each fragment and their collective threat to the vehicle. The method presented in this paper addresses this problem by using a stochastic approach that discretizes simulated debris pieces into volumetric cells, and then assesses strike probabilities accordingly. Combining spatial debris density and relative velocity between the debris and the entry vehicle, the strike probability can be calculated from the integral of the debris flux inside each cell over time. Using this technique it is possible to assess the risk to an entry vehicle along an entire trajectory as it separates from the jettisoned segment. By decoupling the fragment trajectories from that of the entry vehicle, multiple potential separation maneuvers can then be evaluated rapidly to provide an assessment of the best strategy to mitigate the re-contact risk.
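The flux integral described above can be approximated per cell as debris number density times relative speed times exposed vehicle area times the time step, summed along the trajectory, with the strike probability taken from a Poisson expectation. Everything in the sketch (cell values, densities, speeds, vehicle area) is invented for illustration and is not the paper's mesh or data.

```python
# Illustrative strike-probability estimate: expected strikes = time-integrated flux
# (density x relative speed x exposed area) summed over the cells the vehicle traverses;
# P(strike) = 1 - exp(-expected strikes). All values below are invented.
import math

def expected_strikes(cell_samples, vehicle_area_m2, dt_s):
    """cell_samples: iterable of (debris_density_per_m3, relative_speed_m_per_s) per time step."""
    return sum(density * speed * vehicle_area_m2 * dt_s
               for density, speed in cell_samples)

def strike_probability(cell_samples, vehicle_area_m2, dt_s):
    return 1.0 - math.exp(-expected_strikes(cell_samples, vehicle_area_m2, dt_s))

# Hypothetical trajectory: 120 one-second steps through cells of varying debris density
trajectory = [(1e-9 * (1 + 0.5 * math.sin(t / 10.0)), 300.0 + 2.0 * t) for t in range(120)]
p = strike_probability(trajectory, vehicle_area_m2=12.0, dt_s=1.0)
print(f"re-contact probability along trajectory: {p:.2e}")
```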
Carman, Margaret; Xu, Shu; Rushton, Sharron; Smallheer, Benjamin A; Williams, Denise; Amarasekara, Sathya; Oermann, Marilyn H
Acute care nurse practitioner (ACNP) programs that use high-fidelity simulation as a teaching tool need to consider innovative strategies to provide distance-based students with learning experiences that are comparable to those in a simulation laboratory. The purpose of this article is to describe the use of virtual simulations in a distance-based ACNP program and student performance in the simulations. Virtual simulations using iSimulate were integrated into the ACNP course to promote the translation of content into a clinical context and enable students to develop their knowledge and decision-making skills. With these simulations, students worked as a team, even though they were at different sites from each other and from the faculty, to manage care of an acutely ill patient. The students were assigned to simulation groups of 4 students each. One week before the simulation, they reviewed past medical records. The virtual simulation sessions were recorded and then evaluated. The evaluation tools assessed 8 areas of performance and included key behaviors in each of these areas to be performed by students in the simulation. More than 80% of the student groups performed the key behaviors. Virtual simulations provide a learning platform that allows live interaction between students and faculty, at a distance, and application of content to clinical situations. With simulation, learners have an opportunity to practice assessment and decision-making in emergency and high-risk situations. Simulations not only are valuable for student learning but also provide a nonthreatening environment for staff to practice, receive feedback on their skills, and improve their confidence.
A review of simulation platforms in surgery of the temporal bone.
Bhutta, M F
2016-10-01
Surgery of the temporal bone is a high-risk activity in an anatomically complex area. Simulation enables rehearsal of such surgery. The traditional simulation platform is the cadaveric temporal bone, but in recent years other simulation platforms have been created, including plastic and virtual reality platforms. To undertake a review of simulation platforms for temporal bone surgery, specifically assessing their educational value in terms of validity and in enabling transition to surgery. Systematic qualitative review. Search of the Pubmed, CINAHL, BEI and ERIC databases. Assessment of reported outcomes in terms of educational value. A total of 49 articles were included, covering cadaveric, animal, plastic and virtual simulation platforms. Cadaveric simulation is highly rated as an educational tool, but there may be a ceiling effect on educational outcomes after drilling 8-10 temporal bones. Animal models show significant anatomical variation from man. Plastic temporal bone models offer much potential, but at present lack sufficient anatomical or haptic validity. Similarly, virtual reality platforms lack sufficient anatomical or haptic validity, but with technological improvements they are advancing rapidly. At present, cadaveric simulation remains the best platform for training in temporal bone surgery. Technological advances enabling improved materials or modelling mean that in the future plastic or virtual platforms may become comparable to cadaveric platforms, and also offer additional functionality including patient-specific simulation from CT data. © 2015 John Wiley & Sons Ltd.
A Simulated Learning Environment for Teaching Medicine Dispensing Skills
Styles, Kim; Sewell, Keith; Trinder, Peta; Marriott, Jennifer; Maher, Sheryl; Naidu, Som
2016-01-01
Objective. To develop an authentic simulation of the professional practice dispensary context for students to develop their dispensing skills in a risk-free environment. Design. A development team used an Agile software development method to create MyDispense, a web-based simulation. Modeled on elements of virtual learning environments, the software employed widely available standards-based technologies to create a virtual community pharmacy environment. Assessment. First-year pharmacy students who used the software in their tutorials were surveyed, at the end of the second semester, on their prior dispensing experience and their perceptions of MyDispense as a tool to learn dispensing skills. Conclusion. The dispensary simulation is an effective tool for helping students develop dispensing competency and knowledge in a safe environment. PMID:26941437
NASA Astrophysics Data System (ADS)
Goodrich, D. C.; Clifford, T. J.; Guertin, D. P.; Sheppard, B. S.; Barlow, J. E.; Korgaonkar, Y.; Burns, I. S.; Unkrich, C. C.
2016-12-01
Wildfire disasters are common throughout the western US. While many feel fire suppression is the largest cost of wildfires, case studies note rehabilitation costs often equal or greatly exceed suppression costs. Using geospatial data sets and post-fire burn severity products, coupled with the Automated Geospatial Watershed Assessment tool (AGWA - www.tucson.ars.ag.gov/agwa), the Department of the Interior's Burned Area Emergency Response (BAER) teams can rapidly analyze and identify at-risk areas to target rehabilitation efforts. AGWA employs nationally available geospatial elevation, soils, and land cover data to parameterize the KINEROS2 hydrology and erosion model. A pre-fire watershed simulation can be done prior to BAER deployment using design storms. As soon as the satellite-derived Burned Area Reflectance Classification (BARC) map is obtained, a post-fire watershed simulation using the same storm is conducted. The pre- and post-fire simulations can be spatially differenced in the GIS for rapid identification of areas at high risk of erosion or flooding. This difference map is used by BAER teams to prioritize field observations and in turn produce a final burn severity map that is used in AGWA/KINEROS2 simulations to provide report-ready results. The 2013 Elk Wildfire Complex, which burned over 52,600 ha east of Boise, Idaho, provides a tangible example of how BAER experts combined AGWA and geospatial data to achieve substantial rehabilitation cost savings. The BAER team initially identified approximately 6,500 burned ha for rehabilitation. The team then used the AGWA pre- and post-fire watershed simulation results, accessibility constraints, and land slope conditions in an interactive process to locate burned areas that posed the greatest threat to downstream values-at-risk. The group combined the treatable area, field observations, and the spatial results from AGWA to target seed and mulch treatments that most effectively reduced the threats. Using this process, the BAER team reduced the treatable area from the original 6,500 ha (approximately 16,000 acres) to between 800 and 1,600 ha depending on the selected alternative. The final awarded contract amounted to about $1,480/ha; therefore, a total savings of $7.2-8.4 million was realized for mulch treatment alone.
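The pre/post-fire differencing step is, at its core, a per-pixel subtraction of two simulated output rasters followed by thresholding into priority classes. The NumPy sketch below assumes the simulation outputs are already available as arrays (synthetic here) and is not AGWA or KINEROS2 itself; the class thresholds are illustrative.

```python
# Per-pixel differencing of pre- and post-fire simulation outputs (e.g. peak runoff or
# sediment yield) and classification into relative-change priority classes. The rasters
# below are synthetic stand-ins for AGWA/KINEROS2 outputs.
import numpy as np

rng = np.random.default_rng(3)
pre_fire = rng.gamma(2.0, 5.0, (100, 100))                # pre-fire simulated response
burn_effect = 1.0 + rng.random((100, 100)) * 3.0          # post-fire amplification (synthetic)
post_fire = pre_fire * burn_effect

percent_change = 100.0 * (post_fire - pre_fire) / np.maximum(pre_fire, 1e-6)

# Classify relative change into priority classes (thresholds are illustrative)
classes = np.digitize(percent_change, bins=[50.0, 150.0, 250.0])   # 0 = low ... 3 = very high
for label, value in zip(["low", "moderate", "high", "very high"], range(4)):
    share = (classes == value).mean() * 100.0
    print(f"{label:9s}: {share:5.1f}% of watershed pixels")
```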
Sanaa, Moez; Coroller, Louis; Cerf, Olivier
2004-04-01
This article reports a quantitative risk assessment of human listeriosis linked to the consumption of soft cheeses made from raw milk. Risk assessment was based on data purposely acquired over the period 2000-2001 for two French cheeses, namely Camembert of Normandy and Brie of Meaux. Estimated Listeria monocytogenes concentration in raw milk was on average 0.8 and 0.3 cells/L in the Normandy and Brie regions, respectively. A Monte Carlo simulation was used to account for the time-temperature history of the milk and cheeses from farm to table. It was assumed that cell progeny did not spread within the solid cheese matrix (as they would be free to do in liquid broth). Interaction between pH and temperature was accounted for in the growth model. The simulated proportion of servings with no L. monocytogenes cell was 88% for Brie and 82% for Camembert. The 99th percentile of L. monocytogenes cell numbers in servings of 27 g of cheese was 131 for Brie and 77 for Camembert at the time of consumption, corresponding to approximately five and three cells of L. monocytogenes per gram, respectively. The expected number of severe listeriosis cases would be ≤10⁻³ and ≤2.5 × 10⁻³ per year for 17 million servings of Brie of Meaux and 480 million servings of Camembert of Normandy, respectively.
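A bare-bones version of such a Monte Carlo exposure assessment samples an initial contamination per serving, applies growth over a sampled time-temperature history, and tallies cells per serving. The growth-rate model and all parameter values below are invented simplifications and do not reproduce the published farm-to-table model or its pH-temperature interaction.

```python
# Bare-bones Monte Carlo exposure sketch for L. monocytogenes in a 27 g cheese serving.
# Initial contamination, growth-rate model, and storage history are invented simplifications.
import numpy as np

rng = np.random.default_rng(11)
n_servings, serving_g = 100_000, 27.0

# Initial cells per serving from a low mean contamination level (Poisson assumption)
initial_cells = rng.poisson(0.02, n_servings)

# Growth during ripening and domestic storage: rate depends on sampled temperature
temps_c = rng.normal(7.0, 2.0, n_servings)                    # storage temperature (deg C)
days = rng.uniform(10.0, 40.0, n_servings)                    # storage time (days)
growth_rate = np.clip(0.02 * (temps_c - 2.0), 0.0, None)      # ln-units/day, toy model
final_cells = initial_cells * np.exp(growth_rate * days)

p_zero = (final_cells == 0).mean()
p99 = np.percentile(final_cells, 99)
print(f"servings with no cells: {p_zero:.0%}")
print(f"99th percentile: {p99:.1f} cells per serving ({p99 / serving_g:.2f} cells/g)")
```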