Sample records for existing risk simulations

  1. Integrating pixel- and polygon-based approaches to wildfire risk assessment: Application to a high-value watershed on the Pike and San Isabel National Forests, Colorado, USA

    Treesearch

    Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott

    2015-01-01

    We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...

  2. Ergonomics and simulation-based approach in improving facility layout

    NASA Astrophysics Data System (ADS)

    Abad, Jocelyn D.

    2018-02-01

    Simulation-based techniques have been a popular choice for facility layout in industry due to their convenience and efficient generation of results. Nevertheless, the solutions generated are not capable of addressing delays due to workers' health and safety, which significantly impact overall operational efficiency. It is, therefore, critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into a Promodel simulation to improve the facility layout of a garment manufacturing plant. To test the effectiveness of the method, existing and improved facility designs were measured using comprehensive risk level, efficiency, and productivity. Results indicated that the improved facility layout generated a decrease in comprehensive risk level and rapid upper limb assessment score, a 78% increase in efficiency, and a 194% increase in productivity compared to the existing design, demonstrating that the approach is effective in attaining overall facility design improvement.

  3. Modelling the economic impact of three lameness causing diseases using herd and cow level evidence.

    PubMed

    Ettema, Jehan; Østergaard, Søren; Kristensen, Anders Ringgaard

    2010-06-01

    Diseases of the cow's hoof, interdigital skin and legs are highly prevalent and have a large economic impact in modern dairy farming. In order to support farmers' decisions on preventing and treating lameness and its underlying causes, decision support models can be used to predict the economic profitability of such actions. An existing approach of modelling lameness as one health disorder in a dynamic, stochastic and mechanistic simulation model has been improved in two ways. First, three underlying diseases causing lameness were modelled: digital dermatitis, interdigital hyperplasia and claw horn diseases. Second, the existing simulation model was set up so that it uses hyper-distributions describing the disease risk of the three lameness-causing diseases. By combining information on herd-level risk factors with the prevalence of lameness or the prevalence of underlying diseases among cows, marginal posterior probability distributions for disease prevalence in the specific herd are created in a Bayesian network. Random draws from these distributions are used by the simulation model to describe disease risk. In this way, field data on prevalence are used systematically and uncertainty around herd-specific risk is represented. Besides the fact that the estimated profitability of halving disease risk depended on the hyper-distributions used, the estimates differed for herds with different levels of disease risk and reproductive efficiency. (c) 2010 Elsevier B.V. All rights reserved.
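
    A minimal sketch of the herd-specific risk idea described above, assuming a simple conjugate Beta representation of the posterior prevalence distribution (the paper builds these distributions in a Bayesian network); the prior parameters, herd data, and function names below are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def herd_prevalence_posterior(prior_a, prior_b, lame_cows, cows_scored):
    """Conjugate Beta posterior for within-herd disease prevalence.

    prior_a, prior_b encode herd-level risk-factor information;
    the herd's own scoring data (lame_cows out of cows_scored) updates it.
    """
    return prior_a + lame_cows, prior_b + cows_scored - lame_cows

# Hypothetical herd: prior centred near 15% prevalence, 40 of 200 cows scored lame.
a, b = herd_prevalence_posterior(3.0, 17.0, lame_cows=40, cows_scored=200)

# The simulation model draws a herd-specific risk for each replicate
# instead of using one fixed prevalence, so uncertainty is propagated.
for i in range(10):
    disease_risk = rng.beta(a, b)
    print(f"replicate {i}: simulated herd prevalence = {disease_risk:.3f}")
```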

  4. Fish attraction to artificial reefs not always harmful: a simulation study.

    PubMed

    Smith, James A; Lowry, Michael B; Suthers, Iain M

    2015-10-01

    The debate on whether artificial reefs produce new fish or simply attract existing fish biomass continues due to the difficulty in distinguishing these processes, and there remains considerable doubt as to whether artificial reefs are a harmful form of habitat modification. The harm typically associated with attraction is that fish will be easier to harvest due to the existing biomass aggregating at a newly deployed reef. This outcome of fish attraction has not progressed past an anecdotal form, however, and is always perceived as a harmful process. We present a numerical model that simulates the effect that a redistributed fish biomass, due to an artificial reef, has on fishing catch per unit effort (CPUE). This model can be used to identify the scenarios (in terms of reef, fish, and harvest characteristics) that pose the most risk of exploitation due to fish attraction. The properties of this model were compared to the long-standing predictions by Bohnsack (1989) on the factors that increase the risk or the harm of attraction. Simulations revealed that attraction is not always harmful because it does not always increase maximum fish density. Rather, attraction sometimes disperses existing fish biomass, making the fish harder to catch. Some attraction can be ideal, with CPUE lowest when attraction leads to an equal distribution of biomass between natural and artificial reefs. Simulations also showed that the outcomes from attraction depend on the characteristics of the target fish species, such that transient or pelagic species are often at more risk of harmful attraction than resident species. Our findings generally agree with Bohnsack's predictions, although we recommend distinguishing "mobility" and "fidelity" when identifying species most at risk from attraction, as these traits had great influence on patterns of harvest of attracted fish biomass.

  5. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    PubMed

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    Process simulators can contribute to quantitative risk assessment (QRA) by reducing the expert time and the large volume of data required, and their use is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation and QRA to investigate the acceptability of a new methanol production plant technology in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and social risks. Compared to existing tolerance levels, the risks were considered tolerable in nominal conditions of operation of the plant. The use of the simulator in association with the QRA also allowed the risk to be tested under new operating conditions in order to delimit safe regions for the plant. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Simulating dispersal of reintroduced species within heterogeneous landscapes

    Treesearch

    Robert H. Gardner; Eric J. Gustafson

    2004-01-01

    This paper describes the development and application of a spatially explicit, individual based model of animal dispersal (J-walk) to determine the relative effects of landscape heterogeneity, prey availability, predation risk, and the energy requirements and behavior of dispersing organisms on dispersal success. Significant unknowns exist for the simulation of complex...

  7. SIMULATING URBAN AIR TOXICS OVER CONTINENTAL AND URBAN SCALES

    EPA Science Inventory

    The US EPA is evaluating a version of the CMAQ model to support risk assessment for the exposure to Hazardous Air Pollutants (HAPs). The model uses a variant of the CB4 chemical mechanism to simulate ambient concentrations of twenty HAPs that exist primarily as gaseous compounds...

  8. A Risk-Based Framework for Assessing the Effectiveness of Stratospheric Aerosol Geoengineering

    PubMed Central

    Ferraro, Angus J.; Charlton-Perez, Andrew J.; Highwood, Eleanor J.

    2014-01-01

    Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth's surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth's surface area, containing 36% of its population and 60% of its gross domestic product. PMID:24533155

  9. The effects of recall errors and of selection bias in epidemiologic studies of mobile phone use and cancer risk.

    PubMed

    Vrijheid, Martine; Deltour, Isabelle; Krewski, Daniel; Sanchez, Marie; Cardis, Elisabeth

    2006-07-01

    This paper examines the effects of systematic and random errors in recall and of selection bias in case-control studies of mobile phone use and cancer. These sensitivity analyses are based on Monte-Carlo computer simulations and were carried out within the INTERPHONE Study, an international collaborative case-control study in 13 countries. Recall error scenarios simulated plausible values of random and systematic, non-differential and differential recall errors in the amount of mobile phone use reported by study subjects. Plausible values for the recall error were obtained from validation studies. Selection bias scenarios assumed varying selection probabilities for cases and controls, mobile phone users, and non-users. Where possible, these selection probabilities were based on existing information from non-respondents in INTERPHONE. Simulations used exposure distributions based on existing INTERPHONE data and assumed varying levels of the true risk of brain cancer related to mobile phone use. Results suggest that random recall errors of plausible levels can lead to a large underestimation in the risk of brain cancer associated with mobile phone use. Random errors were found to have a larger impact than plausible systematic errors. Differential errors in recall had very little additional impact in the presence of large random errors. Selection bias resulting from underselection of unexposed controls led to J-shaped exposure-response patterns, with risk apparently decreasing at low to moderate exposure levels. The present results, in conjunction with those of the validation studies conducted within the INTERPHONE study, will play an important role in the interpretation of existing and future case-control studies of mobile phone use and cancer risk, including the INTERPHONE study.
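
    The attenuating effect of non-differential random recall error can be reproduced with a very small Monte Carlo experiment in the spirit of the INTERPHONE sensitivity analyses; the exposure distribution, error magnitude, and exposure-response relation below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500_000

# True cumulative phone use (hours), lognormal across the simulated population.
true_use = rng.lognormal(mean=5.0, sigma=1.0, size=n)

# Assumed true exposure-response: risk rises with log(use).
baseline = 0.001
beta = 0.5
p = baseline * np.exp(beta * (np.log(true_use) - 5.0))
cases = rng.random(n) < p

# Non-differential random recall error: reported use = true use * lognormal noise.
recalled_use = true_use * rng.lognormal(mean=0.0, sigma=1.0, size=n)

def rr_top_vs_bottom_tertile(exposure, cases):
    lo, hi = np.quantile(exposure, [1 / 3, 2 / 3])
    return cases[exposure >= hi].mean() / cases[exposure <= lo].mean()

print("RR using true exposure    :", round(rr_top_vs_bottom_tertile(true_use, cases), 2))
print("RR using recalled exposure:", round(rr_top_vs_bottom_tertile(recalled_use, cases), 2))
# Random recall error blurs the exposure categories, so the second estimate is
# attenuated towards 1 even though the underlying effect is unchanged.
```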

  10. 3D Simulation of External Flooding Events for the RISMC Pathway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad

    2015-09-01

    Incorporating 3D simulations as part of the Risk-Informed Safety Margin Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated physics simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using a method called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding to validate the analysis of flooding.

  11. Existence and Optimality Conditions for Risk-Averse PDE-Constrained Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouri, Drew Philip; Surowiec, Thomas M.

    Uncertainty is ubiquitous in virtually all engineering applications, and, for such problems, it is inadequate to simulate the underlying physics without quantifying the uncertainty in unknown or random inputs, boundary and initial conditions, and modeling assumptions. In this paper, we introduce a general framework for analyzing risk-averse optimization problems constrained by partial differential equations (PDEs). In particular, we postulate conditions on the random variable objective function as well as the PDE solution that guarantee existence of minimizers. Furthermore, we derive optimality conditions and apply our results to the control of an environmental contaminant. Lastly, we introduce a new risk measure, called the conditional entropic risk, that fuses desirable properties from both the conditional value-at-risk and the entropic risk measures.

  12. Existence and Optimality Conditions for Risk-Averse PDE-Constrained Optimization

    DOE PAGES

    Kouri, Drew Philip; Surowiec, Thomas M.

    2018-06-05

    Uncertainty is ubiquitous in virtually all engineering applications, and, for such problems, it is inadequate to simulate the underlying physics without quantifying the uncertainty in unknown or random inputs, boundary and initial conditions, and modeling assumptions. In this paper, we introduce a general framework for analyzing risk-averse optimization problems constrained by partial differential equations (PDEs). In particular, we postulate conditions on the random variable objective function as well as the PDE solution that guarantee existence of minimizers. Furthermore, we derive optimality conditions and apply our results to the control of an environmental contaminant. Lastly, we introduce a new risk measure, called the conditional entropic risk, that fuses desirable properties from both the conditional value-at-risk and the entropic risk measures.
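
    For readers unfamiliar with the two risk measures the new conditional entropic risk draws on, the standard sample-based forms of conditional value-at-risk and the entropic risk measure can be sketched as follows; the loss distribution below is a stand-in, not a PDE-constrained objective, and the sketch does not reproduce the paper's fused measure.

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Conditional value-at-risk: mean loss in the worst (1 - alpha) tail."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def entropic_risk(losses, theta=1.0):
    """Entropic risk measure: (1/theta) * log E[exp(theta * L)]."""
    return np.log(np.mean(np.exp(theta * losses))) / theta

rng = np.random.default_rng(0)
losses = rng.normal(loc=1.0, scale=0.5, size=100_000)  # stand-in for a random objective

print("mean loss     :", losses.mean())
print("CVaR (95%)    :", cvar(losses, 0.95))
print("entropic risk :", entropic_risk(losses, theta=2.0))
```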

  13. Risk assessment of logistics outsourcing based on BP neural network

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofeng; Tian, Zi-you

    The purpose of this article is to evaluate the risk of enterprises' logistics outsourcing. To this end, the paper first analyses the main risks existing in logistics outsourcing and sets up a risk evaluation index system for logistics outsourcing; it then applies a BP neural network to logistics outsourcing risk evaluation and uses MATLAB for the simulation. The simulation shows that the network error is small and the method is highly practicable, so it can be used by enterprises to evaluate the risks of logistics outsourcing.
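
    A rough illustration of the BP (back-propagation) neural network approach: a small feed-forward network trained by back-propagation to map normalized risk-index values to an overall risk score. The index structure, network size, and training data below are synthetic assumptions; the paper's own evaluation used MATLAB and its specific index system.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a logistics-outsourcing risk index system:
# 6 normalized risk indexes per case, overall risk score in (0, 1).
X = rng.random((300, 6))
y = (0.4 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.3 * X[:, 2] * X[:, 3]).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by error back-propagation (the "BP" in BP network).
W1 = rng.normal(0, 0.5, (6, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.5

for epoch in range(5000):
    H = sigmoid(X @ W1 + b1)          # hidden layer
    out = sigmoid(H @ W2 + b2)        # predicted risk score
    err = out - y

    d_out = err * out * (1 - out)     # gradient at the output layer
    d_H = (d_out @ W2.T) * H * (1 - H)

    W2 -= lr * H.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_H / len(X);   b1 -= lr * d_H.mean(axis=0, keepdims=True)

print("final mean absolute error:", np.abs(out - y).mean())
new_case = rng.random((1, 6))
print("predicted risk for a new case:", sigmoid(sigmoid(new_case @ W1 + b1) @ W2 + b2)[0, 0])
```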

  14. Not Getting Burned: The Importance of Fire

    Treesearch

    Gregory S. Amacher; Arun S. Malik; Robert G. Haight

    2005-01-01

    We extend existing stand-level models of forest landowner behavior in the presence of fire risk to include the level and timing of fuel management activities. These activities reduce losses if a stand ignites. Based on simulations, we find the standard result that fire risk reduces the optimal rotation age does not hold when landowners use fuel management. Instead,...

  15. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  16. Arrhythmic risk biomarkers for the assessment of drug cardiotoxicity: from experiments to computer simulations

    PubMed Central

    Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.

    2010-01-01

    In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918

  17. Simulating Runoff from a Grid Based Mercury Model: Flow Comparisons

    EPA Science Inventory

    Several mercury cycling models, including general mass balance approaches, mixed-batch reactors in streams or lakes, or regional process-based models, exist to assess the ecological exposure risks associated with anthropogenically increased atmospheric mercury (Hg) deposition, so...

  18. The impact of vehicle moving violations and freeway traffic flow on crash risk: An application of plugin development for microsimulation.

    PubMed

    Wang, Junhua; Kong, Yumeng; Fu, Ting; Stipancic, Joshua

    2017-01-01

    This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun's API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones.

  19. The impact of vehicle moving violations and freeway traffic flow on crash risk: An application of plugin development for microsimulation

    PubMed Central

    Kong, Yumeng; Stipancic, Joshua

    2017-01-01

    This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun’s API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones. PMID:28886141
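
    A self-contained sketch of the fuzzy C-means step used to group road segments into risk states; the per-segment features and cluster count are invented for illustration and do not correspond to the paper's measured variables.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy C-means: returns cluster centres and a membership matrix U
    (rows = samples, columns = clusters, each row sums to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances of every sample to every centre
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Hypothetical per-segment features: [speed variance, conflict rate, volume]
rng = np.random.default_rng(3)
low_risk = rng.normal([2.0, 0.5, 800], [0.5, 0.2, 100], size=(40, 3))
high_risk = rng.normal([6.0, 3.0, 1500], [0.8, 0.5, 150], size=(40, 3))
X = np.vstack([low_risk, high_risk])

centres, U = fuzzy_c_means(X, n_clusters=2)
labels = U.argmax(axis=1)
print("cluster centres:\n", centres.round(2))
print("segments assigned to each cluster:", np.bincount(labels))
```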

  20. Endogenous network of firms and systemic risk

    NASA Astrophysics Data System (ADS)

    Ma, Qianting; He, Jianmin; Li, Shouwei

    2018-02-01

    We construct an endogenous network characterized by commercial credit relationships connecting upstream and downstream firms. Simulation results indicate that the endogenous network model displays the scale-free property observed in real-world firm systems. In terms of network structure, systemic risk increases significantly as the number of network nodes grows, while the heterogeneity of network nodes has no effect on systemic risk. As for firm micro-behaviors, increases in the selection range of trading partners, actual output, labor requirements, intermediate product prices, and employee salaries all lead to higher systemic risk.

  1. The Number of Patients and Events Required to Limit the Risk of Overestimation of Intervention Effects in Meta-Analysis—A Simulation Study

    PubMed Central

    Thorlund, Kristian; Imberger, Georgina; Walsh, Michael; Chu, Rong; Gluud, Christian; Wetterslev, Jørn; Guyatt, Gordon; Devereaux, Philip J.; Thabane, Lehana

    2011-01-01

    Background Meta-analyses including a limited number of patients and events are prone to yield overestimated intervention effect estimates. While many assume bias is the cause of overestimation, theoretical considerations suggest that random error may be an equal or more frequent cause. The independent impact of random error on meta-analyzed intervention effects has not previously been explored. It has been suggested that surpassing the optimal information size (i.e., the required meta-analysis sample size) provides sufficient protection against overestimation due to random error, but this claim has not yet been validated. Methods We simulated a comprehensive array of meta-analysis scenarios where no intervention effect existed (i.e., relative risk reduction (RRR) = 0%) or where a small but possibly unimportant effect existed (RRR = 10%). We constructed different scenarios by varying the control group risk, the degree of heterogeneity, and the distribution of trial sample sizes. For each scenario, we calculated the probability of observing overestimates of RRR>20% and RRR>30% for each cumulative 500 patients and 50 events. We calculated the cumulative number of patients and events required to reduce the probability of overestimation of intervention effect to 10%, 5%, and 1%. We calculated the optimal information size for each of the simulated scenarios and explored whether meta-analyses that surpassed their optimal information size had sufficient protection against overestimation of intervention effects due to random error. Results The risk of overestimation of intervention effects was usually high when the number of patients and events was small and this risk decreased exponentially over time as the number of patients and events increased. The number of patients and events required to limit the risk of overestimation depended considerably on the underlying simulation settings. Surpassing the optimal information size generally provided sufficient protection against overestimation. Conclusions Random errors are a frequent cause of overestimation of intervention effects in meta-analyses. Surpassing the optimal information size will provide sufficient protection against overestimation. PMID:22028777
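
    The core simulation idea (spurious overestimation of the relative risk reduction when few patients and events have accrued) can be sketched with a crude cumulative meta-analysis Monte Carlo; the trial sizes, control-group risk, and pooling method below are simplified assumptions rather than the study's full scenario grid.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_cumulative_meta(n_trials=20, patients_per_arm=50, control_risk=0.1, true_rrr=0.0):
    """Simulate one cumulative meta-analysis of 2-arm trials and return the
    pooled relative-risk-reduction estimate after each trial is added."""
    p_ctrl = control_risk
    p_trt = control_risk * (1 - true_rrr)
    e_ctrl = rng.binomial(patients_per_arm, p_ctrl, size=n_trials)
    e_trt = rng.binomial(patients_per_arm, p_trt, size=n_trials)
    cum_ctrl = np.cumsum(e_ctrl)
    cum_trt = np.cumsum(e_trt)
    n = patients_per_arm * np.arange(1, n_trials + 1)
    rr = (cum_trt / n) / np.maximum(cum_ctrl / n, 1e-9)   # crude pooled relative risk
    return 1.0 - rr                                        # pooled RRR estimate

n_sims = 5000
rrr_paths = np.array([simulate_cumulative_meta() for _ in range(n_sims)])

# Probability of observing a spurious RRR > 20% when the true RRR is 0,
# as a function of how many trials (patients) have accumulated.
prob_overestimate = (rrr_paths > 0.20).mean(axis=0)
for k in [1, 3, 5, 10, 20]:
    print(f"after {k:2d} trials ({k * 100} patients): "
          f"P(RRR_hat > 20%) = {prob_overestimate[k - 1]:.3f}")
```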

  2. Microbial contamination in poultry chillers estimated by Monte Carlo simulations

    USDA-ARS?s Scientific Manuscript database

    Recent bacterial outbreaks in fresh and processed foods have increased awareness of food safety among consumers, regulatory agencies, and the food industry. The risk of contamination exists in meat processing facilities where bacteria that are normally associated with the animal are transferred to t...

  3. POLICY ISSUES ASSOCIATED WITH USING SIMULATION TO ASSESS ENVIRONMENTAL IMPACTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchitel, Kirsten; Tanana, Heather

    This report examines the relationship between simulation-based science and judicial assessments of simulations or models supporting evaluations of environmental harms or risks, considering both how it exists currently and how it might be shaped in the future. This report considers the legal standards relevant to judicial assessments of simulation-based science and provides examples of the judicial application of those legal standards. Next, this report discusses the factors that inform whether there is a correlation between the sophistication of a challenged simulation and judicial support for that simulation. Finally, this report examines legal analysis of the broader issues that must be addressed for simulation-based science to be better understood and utilized in the context of judicial challenge and evaluation.

  4. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.
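
    A toy version of the compliance-cost logic described above: simulate source-water arsenic concentrations, let each system choose the least-cost option that meets the MCL, and aggregate costs. The option list, removal efficiencies, costs, and occurrence distribution are hypothetical placeholders, not the calibrated national model.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Hypothetical compliance options: (name, removal fraction, annual cost in $1000s),
# ordered from cheapest to most expensive.
OPTIONS = [("do nothing", 0.00, 0.0),
           ("optimize existing coagulation", 0.50, 20.0),
           ("ion exchange", 0.90, 80.0),
           ("activated alumina", 0.95, 120.0)]

def compliance_cost(raw_ug_per_L, mcl_ug_per_L):
    """Pick the cheapest option whose removal brings finished water below the MCL."""
    for name, removal, cost in OPTIONS:
        if raw_ug_per_L * (1 - removal) <= mcl_ug_per_L:
            return name, cost
    return OPTIONS[-1][0], OPTIONS[-1][2]   # best available even if still above the MCL

# Simulated population of systems: lognormal source-water arsenic (illustrative parameters).
n_systems = 50_000
raw = rng.lognormal(mean=np.log(3.0), sigma=1.0, size=n_systems)   # µg/L

for mcl in [10.0, 5.0, 3.0]:
    results = [compliance_cost(c, mcl) for c in raw]
    total_cost = sum(cost for _, cost in results)
    n_acting = sum(1 for name, _ in results if name != "do nothing")
    print(f"MCL = {mcl:4.1f} µg/L: {n_acting:6d} systems act, "
          f"aggregate cost ≈ ${total_cost / 1000:,.1f} M per year")
```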

  5. High-dose-rate prostate brachytherapy inverse planning on dose-volume criteria by simulated annealing.

    PubMed

    Deist, T M; Gorissen, B L

    2016-02-07

    High-dose-rate brachytherapy is a tumor treatment method where a highly radioactive source is brought in close proximity to the tumor. In this paper we develop a simulated annealing algorithm to optimize the dwell times at preselected dwell positions to maximize tumor coverage under dose-volume constraints on the organs at risk. Compared to existing algorithms, our algorithm has advantages in terms of speed and objective value and does not require an expensive general purpose solver. Its success mainly depends on exploiting the efficiency of matrix multiplication and a careful selection of the neighboring states. In this paper we outline its details and make an in-depth comparison with existing methods using real patient data.
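
    A bare-bones simulated annealing sketch for the dwell-time problem: maximize target coverage with a penalty for organ-at-risk dose limits, perturbing one dwell time per step under a geometric cooling schedule. The dose-rate matrices, prescription, limits, and cooling parameters are invented for illustration; the published algorithm's neighbor selection and dose-volume handling are more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical dose-rate matrices (Gy per second of dwell time):
# rows = calculation points, columns = dwell positions.
n_dwell = 30
D_tumor = rng.uniform(0.5, 2.0, (200, n_dwell))    # points inside the target
D_oar = rng.uniform(0.1, 0.8, (100, n_dwell))      # points in an organ at risk

PRESCRIPTION = 10.0   # Gy, target dose
OAR_LIMIT = 8.0       # Gy, per-point limit standing in for a dose-volume constraint

def objective(t):
    """Target coverage (fraction of points reaching the prescription),
    penalized by the fraction of organ-at-risk points above their limit."""
    coverage = np.mean(D_tumor @ t >= PRESCRIPTION)
    violation = np.mean(D_oar @ t > OAR_LIMIT)
    return coverage - 10.0 * violation

t = np.full(n_dwell, 1.0)          # initial dwell times (seconds)
best_t, best_val = t.copy(), objective(t)
temp = 1.0

for step in range(20_000):
    # Neighbor state: perturb one randomly chosen dwell time (kept non-negative).
    cand = t.copy()
    i = rng.integers(n_dwell)
    cand[i] = max(0.0, cand[i] + rng.normal(0, 0.5))
    delta = objective(cand) - objective(t)
    if delta >= 0 or rng.random() < np.exp(delta / temp):
        t = cand
        if objective(t) > best_val:
            best_t, best_val = t.copy(), objective(t)
    temp *= 0.9997                  # geometric cooling schedule

print("best objective:", round(best_val, 3))
print("target coverage:", round(float(np.mean(D_tumor @ best_t >= PRESCRIPTION)), 3))
print("OAR points above limit:", int(np.sum(D_oar @ best_t > OAR_LIMIT)))
```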

  6. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  7. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
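
    One piece of the missile simulation chain (solution of the flight equations under a sampled wind field) can be sketched as a drag-plus-gravity integration inside a Monte Carlo loop; the wind, mass, and area distributions, the drag coefficient, and the release height below are illustrative assumptions, not the study's hazard models.

```python
import numpy as np

rng = np.random.default_rng(99)

def missile_flight(wind_speed, mass, area, cd=1.0, rho=1.2, dt=0.01, h0=30.0):
    """Integrate a simple 2D flight of a wind-borne missile released at height h0,
    driven by aerodynamic drag toward the horizontal wind speed and by gravity."""
    v = np.array([0.0, 0.0])             # missile velocity (x horizontal, z vertical)
    pos = np.array([0.0, h0])
    g = np.array([0.0, -9.81])
    while pos[1] > 0.0:
        v_rel = np.array([wind_speed, 0.0]) - v           # wind relative to missile
        drag = 0.5 * rho * cd * area * np.linalg.norm(v_rel) * v_rel / mass
        v = v + (drag + g) * dt
        pos = pos + v * dt
    return pos[0], np.linalg.norm(v)      # horizontal range and impact speed

# Monte Carlo over wind speed and missile properties (illustrative distributions).
ranges, speeds = [], []
for _ in range(2000):
    wind = rng.weibull(2.0) * 60.0                 # m/s
    mass = rng.uniform(2.0, 50.0)                  # kg
    area = rng.uniform(0.01, 0.2)                  # m^2
    r, s = missile_flight(wind, mass, area)
    ranges.append(r); speeds.append(s)

print("median range          :", round(float(np.median(ranges)), 1), "m")
print("95th percentile range :", round(float(np.percentile(ranges, 95)), 1), "m")
print("95th percentile impact:", round(float(np.percentile(speeds, 95)), 1), "m/s")
```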

  8. Nuclear Power Plant Cyber Security Discrete Dynamic Event Tree Analysis (LDRD 17-0958) FY17 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, Timothy A.; Denman, Matthew R.; Williams, R. A.

    Instrumentation and control of nuclear power is transforming from analog to modern digital assets. These control systems perform key safety and security functions. This transformation is occurring in new plant designs as well as in the existing fleet of plants as the operation of those plants is extended to 60 years. This transformation introduces new and unknown issues involving both digital-asset-induced safety issues and security issues. Traditional nuclear power risk assessment tools and cyber security assessment methods have not been modified or developed to address the unique nature of cyber failure modes and of cyber security threat vulnerabilities. This Lab-Directed Research and Development project has developed a dynamic cyber-risk-informed tool to facilitate the analysis of unique cyber failure modes and the time sequencing of cyber faults, both malicious and non-malicious, and impose those cyber exploits and cyber faults onto a nuclear power plant accident sequence simulator code to assess how cyber exploits and cyber faults could interact with a plant's digital instrumentation and control (DI&C) system and defeat or circumvent a plant's cyber security controls. This was achieved by coupling an existing Sandia National Laboratories nuclear accident dynamic simulator code with a cyber emulytics code to demonstrate real-time simulation of cyber exploits and their impact on automatic DI&C responses. Studying such potential time-sequenced cyber-attacks and their risks (i.e., the associated impact and the associated degree of difficulty to achieve the attack vector) on accident management establishes a technical risk-informed framework for developing effective cyber security controls for nuclear power.

  9. Comparing biomarkers as principal surrogate endpoints.

    PubMed

    Huang, Ying; Gilbert, Peter B

    2011-12-01

    Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.

  10. The dynamic influence of human resources on evidence-based intervention sustainability and population outcomes: an agent-based modeling approach.

    PubMed

    McKay, Virginia R; Hoffer, Lee D; Combs, Todd B; Margaret Dolcini, M

    2018-06-05

    Sustaining evidence-based interventions (EBIs) is an ongoing challenge for dissemination and implementation science in public health and social services. Characterizing the relationship among human resource capacity within an agency and subsequent population outcomes is an important step to improving our understanding of how EBIs are sustained. Although human resource capacity and population outcomes are theoretically related, examining them over time within real-world experiments is difficult. Simulation approaches, especially agent-based models, offer advantages that complement existing methods. We used an agent-based model to examine the relationships among human resources, EBI delivery, and population outcomes by simulating provision of an EBI through a hypothetical agency and its staff. We used data from existing studies examining a widely implemented HIV prevention intervention to inform simulation design, calibration, and validity. Once we developed a baseline model, we used the model as a simulated laboratory by systematically varying three human resource variables: the number of staff positions, the staff turnover rate, and timing in training. We tracked the subsequent influence on EBI delivery and the level of population risk over time to describe the overall and dynamic relationships among these variables. Higher overall levels of human resource capacity at an agency (more positions) led to more extensive EBI delivery over time and lowered population risk earlier in time. In simulations representing the typical human resource investments, substantial influences on population risk were visible after approximately 2 years and peaked around 4 years. Human resources, especially staff positions, have an important impact on EBI sustainability and ultimately population health. A minimum level of human resources based on the context (e.g., size of the initial population and characteristics of the EBI) is likely needed for an EBI to have a meaningful impact on population outcomes. Furthermore, this model demonstrates how ABMs may be leveraged to inform research design and assess the impact of EBI sustainability in practice.
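
    A stripped-down agent-based sketch of the mechanism studied above: staff positions with turnover and a training delay determine how many intervention deliveries occur each month, and deliveries gradually lower a normalized population risk. All parameter values and class names are invented for illustration; the published model is calibrated to data on a widely implemented HIV prevention intervention.

```python
import random

random.seed(4)

class Staffer:
    def __init__(self, month_hired, training_delay=3):
        self.trained_at = month_hired + training_delay   # months before first delivery

class Agency:
    def __init__(self, n_positions=5, monthly_turnover=0.02, training_delay=3):
        self.n_positions = n_positions
        self.turnover = monthly_turnover
        self.training_delay = training_delay
        self.staff = [Staffer(0, 0) for _ in range(n_positions)]   # start fully trained

    def step(self, month):
        # Turnover: each staffer may leave; vacant positions are refilled immediately,
        # but the replacement cannot deliver until trained.
        self.staff = [s for s in self.staff if random.random() > self.turnover]
        while len(self.staff) < self.n_positions:
            self.staff.append(Staffer(month, self.training_delay))
        # Each trained staffer delivers the intervention to a fixed number of clients.
        return sum(10 for s in self.staff if month >= s.trained_at)

def run(n_positions, months=72):
    agency = Agency(n_positions=n_positions)
    population_risk = 1.0            # normalized; each delivery shaves a little off
    trajectory = []
    for m in range(months):
        deliveries = agency.step(m)
        population_risk *= (1 - 0.0004) ** deliveries
        trajectory.append(population_risk)
    return trajectory

for positions in (2, 5, 10):
    traj = run(positions)
    print(f"{positions:2d} positions: risk after 2 yr = {traj[23]:.3f}, after 6 yr = {traj[-1]:.3f}")
```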

  11. A Short-Term Population Model of the Suicide Risk: The Case of Spain.

    PubMed

    De la Poza, Elena; Jódar, Lucas

    2018-06-14

    A relevant proportion of deaths by suicide are attributed to other causes, which means that part of the true number of suicides remains hidden. The existence of this hidden number of cases is explained by the nature of the problem: problems of this kind involve violence and produce fear and social shame in victims' families, which favours a considerable number of suicides being recorded as accidents or natural deaths. This paper proposes a short-term discrete compartmental mathematical model to measure suicide risk for the case of Spain. The compartmental model classifies and quantifies the Spanish population within the age interval (16, 78) by degree of suicide risk and tracks changes over time. Intercompartmental transits are due to the combination of quantitative and qualitative factors. Results are computed and simulations are performed to analyze the sensitivity of the model under uncertain coefficients.
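
    The discrete compartmental mechanics can be illustrated with a small transition-matrix iteration; the compartment labels, transition probabilities, and initial populations below are hypothetical and not the paper's estimated coefficients.

```python
import numpy as np

# Compartments (hypothetical labels): N = no risk, L = low risk, H = high risk, S = suicide.
# Monthly transition probabilities between compartments (rows sum to 1).
P = np.array([
    #  N      L      H      S
    [0.980, 0.018, 0.002, 0.000],   # from N
    [0.050, 0.910, 0.039, 0.001],   # from L
    [0.010, 0.080, 0.905, 0.005],   # from H
    [0.000, 0.000, 0.000, 1.000],   # S is absorbing
])

# Initial population (ages 16-78), in thousands: mostly no-risk.
x = np.array([30_000.0, 800.0, 60.0, 0.0])

for month in range(36):
    x = x @ P                        # one discrete time step

print("population after 3 years (thousands):")
for name, value in zip(["no risk", "low risk", "high risk", "cumulative suicides"], x):
    print(f"  {name:20s} {value:10.1f}")
```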

  12. Coastal Tsunami and Risk Assessment for Eastern Mediterranean Countries

    NASA Astrophysics Data System (ADS)

    Kentel, E.; Yavuz, C.

    2017-12-01

    Tsunamis are rarely experienced events that have enormous potential to cause large economic destruction of critical infrastructure and facilities, social devastation due to mass casualties, and adverse environmental effects such as erosion, accumulation and inundation. Over the past two decades in particular, nations have encountered devastating tsunami events. The aim of this study is to investigate risks along the Mediterranean coastline due to probable tsunamis based on simulations using reliable historical data. To do this, 50 Critical Regions, CRs, (i.e. city centers, agricultural areas and summer villages) and 43 Critical Infrastructures, CIs, (i.e. airports, ports & marinas and industrial structures) are determined to perform people-centered risk assessment along the Eastern Mediterranean region covering 7 countries. These countries are Turkey, Syria, Lebanon, Israel, Egypt, Cyprus, and Libya. Bathymetry of the region is given in Figure 1. In this study, NAMI-DANCE is used to carry out tsunami simulations. The source of a sample tsunami simulation and the maximum wave propagation in the study area for this sample tsunami are given in Figures 2 and 3, respectively. Richter magnitude, focal depth, time of occurrence in a day and season are considered as the independent parameters of the earthquake. Historical earthquakes are used to generate reliable probability distributions for these parameters. Monte Carlo (MC) simulations are carried out to evaluate overall risks at the coastline. Inundation level, population density, number of passengers or employees, literacy rate, annual income level and human presence are used in risk estimations. Within each MC simulation and for each grid in the study area, the people-centered tsunami risk is calculated for each of the following elements at risk: i. city centers, ii. agricultural areas, iii. summer villages, iv. ports and marinas, v. airports, vi. industrial structures. Risk levels at each grid along the shoreline are calculated based on the factors given above, grouped into low, medium and high risk, and used in generating the risk map. The risk map will be useful in prioritizing areas that require development of tsunami mitigation measures.

  13. A Bayesian method for using simulator data to enhance human error probabilities assigned by existing HRA methods

    DOE PAGES

    Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.

    2014-04-05

    In the past several years, international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. These data provide a valuable opportunity to improve human reliability analysis (HRA), but such improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but Bayesian methods have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
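
    The essence of the approach (treating an existing-method HEP as a prior and updating it with simulator observations) can be sketched with a conjugate Beta-Binomial update; the prior strength, HEP value, and simulator counts below are illustrative, and the paper's full methodology is more elaborate than this.

```python
import numpy as np

def beta_prior_from_hep(hep, strength=20.0):
    """Turn a point HEP from an existing HRA method (e.g. SPAR-H) into a Beta
    prior with the same mean; `strength` is the equivalent number of prior trials."""
    return hep * strength, (1.0 - hep) * strength

# Prior: an existing-method HEP of 0.01 for a particular task context.
a0, b0 = beta_prior_from_hep(0.01, strength=20.0)

# Simulator evidence (hypothetical): 3 operator errors observed in 120 scenario runs.
errors, trials = 3, 120

# Conjugate Beta-Binomial update.
a_post, b_post = a0 + errors, b0 + trials - errors
posterior_mean = a_post / (a_post + b_post)

print(f"prior HEP mean     : {a0 / (a0 + b0):.4f}")
print(f"posterior HEP mean : {posterior_mean:.4f}")

# A credible interval from posterior samples conveys the remaining uncertainty.
samples = np.random.default_rng(0).beta(a_post, b_post, size=100_000)
print("90% credible interval:", np.round(np.percentile(samples, [5, 95]), 4))
```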

  14. Impact of task design on task performance and injury risk: case study of a simulated drilling task.

    PubMed

    Alabdulkarim, Saad; Nussbaum, Maury A; Rashedi, Ehsan; Kim, Sunwook; Agnew, Michael; Gardner, Richard

    2017-06-01

    Existing evidence is limited regarding the influence of task design on performance and ergonomic risk, or the association between these two outcomes. In a controlled experiment, we constructed a mock fuselage to simulate a drilling task common in aircraft manufacturing, and examined the effect of three levels of workstation adjustability on performance as measured by productivity (e.g. fuselage completion time) and quality (e.g. fuselage defective holes), and ergonomic risk as quantified using two common methods (rapid upper limb assessment and the strain index). The primary finding was that both productivity and quality significantly improved with increased adjustability, yet this occurred only when that adjustability succeeded in reducing ergonomic risk. Supporting the inverse association between ergonomic risk and performance, the condition with highest adjustability created the lowest ergonomic risk and the best performance while there was not a substantial difference in ergonomic risk between the other two conditions, in which performance was also comparable. Practitioner Summary: Findings of this study supported a causal relationship between task design and both ergonomic risk and performance, and that ergonomic risk and performance are inversely associated. While future work is needed under more realistic conditions and a broader population, these results may be useful for task (re)design and to help cost-justify some ergonomic interventions.

  15. Boomerang: A method for recursive reclassification.

    PubMed

    Devlin, Sean M; Ostrovnaya, Irina; Gönen, Mithat

    2016-09-01

    While there are many validated prognostic classifiers used in practice, often their accuracy is modest and heterogeneity in clinical outcomes exists in one or more risk subgroups. Newly available markers, such as genomic mutations, may be used to improve the accuracy of an existing classifier by reclassifying patients from a heterogenous group into a higher or lower risk category. The statistical tools typically applied to develop the initial classifiers are not easily adapted toward this reclassification goal. In this article, we develop a new method designed to refine an existing prognostic classifier by incorporating new markers. The two-stage algorithm called Boomerang first searches for modifications of the existing classifier that increase the overall predictive accuracy and then merges to a prespecified number of risk groups. Resampling techniques are proposed to assess the improvement in predictive accuracy when an independent validation data set is not available. The performance of the algorithm is assessed under various simulation scenarios where the marker frequency, degree of censoring, and total sample size are varied. The results suggest that the method selects few false positive markers and is able to improve the predictive accuracy of the classifier in many settings. Lastly, the method is illustrated on an acute myeloid leukemia data set where a new refined classifier incorporates four new mutations into the existing three category classifier and is validated on an independent data set. © 2016, The International Biometric Society.

  16. Boomerang: A Method for Recursive Reclassification

    PubMed Central

    Devlin, Sean M.; Ostrovnaya, Irina; Gönen, Mithat

    2016-01-01

    Summary While there are many validated prognostic classifiers used in practice, often their accuracy is modest and heterogeneity in clinical outcomes exists in one or more risk subgroups. Newly available markers, such as genomic mutations, may be used to improve the accuracy of an existing classifier by reclassifying patients from a heterogenous group into a higher or lower risk category. The statistical tools typically applied to develop the initial classifiers are not easily adapted towards this reclassification goal. In this paper, we develop a new method designed to refine an existing prognostic classifier by incorporating new markers. The two-stage algorithm called Boomerang first searches for modifications of the existing classifier that increase the overall predictive accuracy and then merges to a pre-specified number of risk groups. Resampling techniques are proposed to assess the improvement in predictive accuracy when an independent validation data set is not available. The performance of the algorithm is assessed under various simulation scenarios where the marker frequency, degree of censoring, and total sample size are varied. The results suggest that the method selects few false positive markers and is able to improve the predictive accuracy of the classifier in many settings. Lastly, the method is illustrated on an acute myeloid leukemia dataset where a new refined classifier incorporates four new mutations into the existing three category classifier and is validated on an independent dataset. PMID:26754051

  17. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    USGS Publications Warehouse

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  18. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks.

    PubMed

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V

    2017-03-21

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario as the service provider's server contains a lot of valuable resources. LoBSs' users are very diverse as they may come from a wide range of locations with vastly different characteristics. The cost of joining could be low and, in many cases, intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs' risk dynamically based on both frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs' risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for assessing the risk of LoBSs for infrastructure of ESNs dynamically involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing.

  19. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks

    PubMed Central

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario as the service provider’s server contains a lot of valuable resources. LoBSs’ users are very diverse as they may come from a wide range of locations with vastly different characteristics. The cost of joining could be low and, in many cases, intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs’ risk dynamically based on both frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs’ risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for assessing the risk of LoBSs for infrastructure of ESNs dynamically involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing. PMID:28335569
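
    The historical simulation method of value at risk that QRAM adapts amounts to taking an empirical quantile of an observed risk or loss history; a minimal sketch follows, with the frequency and threat-degree distributions invented for illustration.

```python
import numpy as np

def historical_var(losses, confidence=0.95):
    """Historical-simulation value at risk: the loss level that the observed
    history exceeded only (1 - confidence) of the time."""
    return np.quantile(losses, confidence)

# Hypothetical per-interval risk scores built from intrusion frequency x threat degree.
rng = np.random.default_rng(8)
frequency = rng.poisson(lam=2.0, size=1000)            # malicious operations per interval
threat_degree = rng.gamma(shape=2.0, scale=0.5, size=1000)
risk_history = frequency * threat_degree

var95 = historical_var(risk_history, 0.95)
print("95% VaR of the risk score:", round(float(var95), 2))
print("intervals breaching VaR  :", int(np.sum(risk_history > var95)), "of", len(risk_history))
```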

  20. Bioaccessibility of Pb from ammunition in game meat is affected by cooking treatment.

    PubMed

    Mateo, Rafael; Baos, Ana R; Vidal, Dolors; Camarero, Pablo R; Martinez-Haro, Monica; Taggart, Mark A

    2011-01-14

    The presence of lead (Pb) ammunition residues in game meat has been widely documented, yet little information exists regarding the bioaccessibility of this Pb contamination. We study how cooking treatment (recipe) can affect Pb bioaccessibility in meat of animals hunted with Pb ammunition. We used an in vitro gastrointestinal simulation to study bioaccessibility. The simulation was applied to meat from red-legged partridge (Alectoris rufa) hunted with Pb shot pellets and cooked using various traditional Spanish game recipes involving wine or vinegar. Total Pb concentrations in the meat were higher in samples with visible Pb ammunition by X-ray (mean±SE: 3.29±1.12 µg/g w.w.) than in samples without this evidence (1.28±0.61 µg/g). The percentage of Pb that was bioaccessible within the simulated intestine phase was far higher in meat cooked with vinegar (6.75%) and wine (4.51%) than in uncooked meat (0.7%). Risk assessment simulations using our results transformed to bioavailability and the Integrated Exposure Uptake Biokinetic model (IEUBK; US EPA) show that the use of wine instead of vinegar in cooking recipes may reduce the percentage of children that would be expected to have >10 µg/dl of Pb in blood from 2.08% to 0.26% when game meat represents 50% of the meat in diet. Lead from ammunition in game meat is more bioaccessible after cooking, especially when using highly acidic recipes. These results are important because existing theoretical models regarding Pb uptake and subsequent risk in humans should take such factors into account.
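
    A worked example of what the reported bioaccessibility fractions imply for an ingested portion; the portion size is an assumption, the meat Pb concentration and the percentages are taken from the abstract, and this is not the IEUBK blood-lead calculation itself.

```python
# Bioaccessibility fractions reported in the abstract:
# 0.7% uncooked, 4.51% wine recipe, 6.75% vinegar recipe.

MEAT_PB_UG_PER_G = 3.29        # µg/g wet weight, meat with visible ammunition residue
PORTION_G = 100.0              # assumed portion of game meat

bioaccessible_fraction = {"uncooked": 0.007, "wine recipe": 0.0451, "vinegar recipe": 0.0675}

for recipe, frac in bioaccessible_fraction.items():
    total_pb = MEAT_PB_UG_PER_G * PORTION_G          # µg Pb ingested with the portion
    accessible = total_pb * frac                      # µg Pb available for absorption
    print(f"{recipe:15s}: {accessible:6.1f} µg bioaccessible Pb per {PORTION_G:.0f} g portion")
```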

  1. Predictive study on the risk of malaria spreading due to global warming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ono, Masaji

    Global warming will bring about a temperature elevation, and the habitat of vectors of infectious diseases, such as malaria and dengue fever, will spread into subtropical or temperate zones. The purpose of this study is to simulate the spreading of these diseases through reexamination of existing data and collection of some additional information by field survey. From these data, the author will establish the relationship between meteorological conditions, vector density and malaria occurrence, and then simulate and predict malaria epidemics under temperature elevation in Southeast Asia and Japan.

  2. MACHETE: Environment for Space Networking Evaluation

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.; Woo, Simon

    2010-01-01

    Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated and evaluated carefully to support different mission requirements. As actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and serve as an initial test environment as one moves towards emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), its use cases in supporting architecture trade studies and protocol performance evaluation, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces such that users may select the set of tools tailored to the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions as well as in evaluating the performance of existing networks where non-determinism exists in data traffic and/or link conditions.

  3. Assessment of the Impacts of ACLS on the ISS Life Support System Using Dynamic Simulations in V-HAB

    NASA Technical Reports Server (NTRS)

    Putz, Daniel; Olthoff, Claas; Ewert, Michael; Anderson, Molly

    2016-01-01

    The Advanced Closed Loop System (ACLS) is currently under development by Airbus Defense and Space and is slated for launch to the International Space Station (ISS) in 2017. The addition of new hardware into an already complex system such as the ISS life support system (LSS) always poses operational risks. It is therefore important to understand the impacts ACLS will have on the existing systems to ensure smooth operations for the ISS. This analysis can be done using dynamic computer simulations, and one possible tool for such a simulation is the Virtual Habitat (V-HAB). Based on MATLAB, V-HAB has been under development at the Institute of Astronautics of the Technical University of Munich (TUM) since 2004 and has been successfully used in the past to simulate the ISS life support systems. The existing V-HAB ISS simulation model treated the interior volume of the space station as one large, ideally stirred container. This model was improved to allow the calculation of the atmospheric composition inside individual modules of the ISS by splitting it into twelve distinct volumes. The virtual volumes are connected by a simulation of the inter-module ventilation flows. This allows for a combined simulation of the LSS hardware and the atmospheric composition aboard the ISS. A dynamic model of ACLS is added to the ISS simulation, several different operating modes for both ACLS and the existing ISS life support systems are studied, and the impacts of ACLS on the rest of the system are determined. The results suggest that the US, Russian and ACLS CO2 systems can operate at the same time without impeding each other. Furthermore, based on the results of this analysis, the US and ACLS Sabatier systems can be operated in parallel as well to achieve a very low CO2 concentration in the cabin atmosphere.
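
    V-HAB itself is not shown in the abstract, but the modeling idea described (modules treated as well-mixed volumes coupled by inter-module ventilation) can be sketched as a small set of coupled mass balances. The sketch below uses three hypothetical modules with made-up volumes, ventilation rates, CO2 source and scrubber terms; it only illustrates the bookkeeping, not actual ISS or ACLS values.

      import numpy as np

      # Hypothetical 3-module cabin (all values illustrative only).
      volumes = np.array([100.0, 80.0, 60.0])            # module free volumes, m^3
      flow = np.array([[0.0, 0.2, 0.0],                  # flow[i, j]: air moved i -> j, m^3/s
                       [0.2, 0.0, 0.1],
                       [0.0, 0.1, 0.0]])
      source = np.array([1.0e-5, 0.0, 0.0])              # CO2 source (kg/s) in module 0
      scrub = np.array([0.0, 0.0, 2.0e-1])               # scrubber volumetric rate (m^3/s) in module 2

      c = np.full(3, 7.0e-4)                             # CO2 density per module, kg/m^3
      dt, t_end = 1.0, 6 * 3600                          # 1 s steps over six hours

      for _ in range(int(t_end / dt)):
          exchange = flow.T @ c - flow.sum(axis=1) * c   # CO2 carried in minus carried out
          dcdt = (exchange + source - scrub * c) / volumes
          c = c + dt * dcdt                              # explicit Euler step

      print("CO2 density per module (kg/m^3):", np.round(c, 6))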

  4. Assessment of the Impacts of ACLS on the ISS Life Support System using Dynamic Simulations in V-HAB

    NASA Technical Reports Server (NTRS)

    Puetz, Daniel; Olthoff, Claas; Ewert, Michael K.; Anderson, Molly S.

    2016-01-01

    The Advanced Closed Loop System (ACLS) is currently under development by Airbus Defense and Space and is slated for launch to the International Space Station (ISS) in 2017. The addition of new hardware into an already complex system such as the ISS life support system (LSS) always poses operational risks. It is therefore important to understand the impacts ACLS will have on the existing systems to ensure smooth operations for the ISS. This analysis can be done using dynamic computer simulations, and one possible tool for such a simulation is Virtual Habitat (V-HAB). Based on Matlab (Registered Trademark), V-HAB has been under development at the Institute of Astronautics of the Technical University Munich (TUM) since 2006 and has been successfully used in the past to simulate the ISS life support systems. The existing V-HAB ISS simulation model treated the interior volume of the space station as one large, ideally stirred container. This model was improved to allow the calculation of the atmospheric composition inside the individual modules of the ISS by splitting it into ten distinct volumes. The virtual volumes are connected by a simulation of the inter-module ventilation flows. This allows for a combined simulation of the LSS hardware and the atmospheric composition aboard the ISS. A dynamic model of ACLS is added to the ISS simulation, and different operating modes for both ACLS and the existing ISS life support systems are studied to determine the impacts of ACLS on the rest of the system. The results suggest that the US, Russian and ACLS CO2 systems can operate at the same time without impeding each other. Furthermore, based on the results of this analysis, the US and ACLS Sabatier systems can be operated in parallel as well to achieve the highest possible CO2 recycling together with a low CO2 concentration.

  5. Driving-forces model on individual behavior in scenarios considering moving threat agents

    NASA Astrophysics Data System (ADS)

    Li, Shuying; Zhuang, Jun; Shen, Shifei; Wang, Jia

    2017-09-01

    The individual behavior model is a contributory factor in improving the accuracy of agent-based simulation in different scenarios. However, few studies have considered moving threat agents, which often occur in terrorist attacks carried out by attackers with close-range weapons (e.g., a sword or stick). At the same time, many existing behavior models lack validation from cases or experiments. This paper builds a new individual behavior model based on seven behavioral hypotheses. The driving-forces model is an extension of the classical social force model, adapted to scenarios that include moving threat agents. An experiment was conducted to validate the key components of the model. The model is then compared with an advanced Elliptical Specification II social force model by calculating the fitting errors between the simulated and experimental trajectories, and by applying it to simulate a specific circumstance. Our results show that the driving-forces model reduced the fitting error by an average of 33.9% and the standard deviation by an average of 44.5%, which indicates the accuracy and stability of the model in the studied situation. The new driving-forces model could be used to simulate individual behavior when analyzing the risk of specific scenarios using agent-based simulation methods, such as risk analysis of close-range terrorist attacks in public places.
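
    The paper's seven behavioral hypotheses and calibrated forces are not given in the abstract; the sketch below only illustrates the general social-force structure that the driving-forces model extends: a relaxation force toward a desired walking direction plus an exponential repulsion from a moving threat agent. The scenario and all parameter values are hypothetical.

      import numpy as np

      def step(pos, vel, goal, threat_pos, dt=0.05,
               v_desired=1.5, tau=0.5, A=5.0, B=1.0):
          """One explicit-Euler step of a social-force-style model with an extra
          repulsive force from a threat agent (illustrative parameters)."""
          # Driving force toward the exit/goal.
          direction = (goal - pos) / (np.linalg.norm(goal - pos) + 1e-9)
          f_drive = (v_desired * direction - vel) / tau
          # Exponential repulsion from the threat agent.
          away = pos - threat_pos
          dist = np.linalg.norm(away) + 1e-9
          f_threat = A * np.exp(-dist / B) * away / dist
          vel = vel + dt * (f_drive + f_threat)
          return pos + dt * vel, vel

      pos, vel = np.array([0.0, 0.0]), np.zeros(2)
      goal, threat = np.array([10.0, 0.0]), np.array([2.0, 0.5])
      for _ in range(200):                               # 10 s of simulated time
          threat = threat + np.array([0.03, 0.0])        # the threat agent moves too
          pos, vel = step(pos, vel, goal, threat)
      print("pedestrian position after 10 s:", np.round(pos, 2))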

  6. Using simulation for interventional radiology training

    PubMed Central

    Gould, D

    2010-01-01

    Debate on the existence of innate skills has all but evaporated in the light of evidence that it is only the hours spent in deliberate practice that correlate with even the most elite levels of expertise. A range of simple to advanced technologies stands to address some of the many challenges to effective training of 21st century, procedural medicine. Simulation could train and assess behaviours remotely from patients, in complete safety, reducing the risks of inexperienced trainees learning critical tasks in patients while contributing to certification and revalidation. Understanding the strengths and limitations of these devices, determining and improving their effectiveness and identifying their roles, as well as those of individuals and teams, represents a cornerstone of successful adoption into the interventional radiology curriculum. This requires a simulation strategy that includes standards for simulator documentation. PMID:20603407

  7. Bioaccessibility of Pb from Ammunition in Game Meat Is Affected by Cooking Treatment

    PubMed Central

    Mateo, Rafael; Baos, Ana R.; Vidal, Dolors; Camarero, Pablo R.; Martinez-Haro, Monica; Taggart, Mark A.

    2011-01-01

    Background The presence of lead (Pb) ammunition residues in game meat has been widely documented, yet little information exists regarding the bioaccessibility of this Pb contamination. We study how cooking treatment (recipe) can affect Pb bioaccessibility in meat of animals hunted with Pb ammunition. Methodology/Principal Findings We used an in vitro gastrointestinal simulation to study bioaccessibility. The simulation was applied to meat from red-legged partridge (Alectoris rufa) hunted with Pb shot pellets and cooked using various traditional Spanish game recipes involving wine or vinegar. Total Pb concentrations in the meat were higher in samples with visible Pb ammunition by X-ray (mean±SE: 3.29±1.12 µg/g w.w.) than in samples without this evidence (1.28±0.61 µg/g). The percentage of Pb that was bioaccessible within the simulated intestine phase was far higher in meat cooked with vinegar (6.75%) and wine (4.51%) than in uncooked meat (0.7%). Risk assessment simulations using our results transformed to bioavailability and the Integrated Exposure Uptake Biokinetic model (IEUBK; US EPA) show that the use of wine instead of vinegar in cooking recipes may reduce the percentage of children that would be expected to have >10 µg/dl of Pb in blood from 2.08% to 0.26% when game meat represents 50% of the meat in diet. Conclusions/Significance Lead from ammunition in game meat is more bioaccessible after cooking, especially when using highly acidic recipes. These results are important because existing theoretical models regarding Pb uptake and subsequent risk in humans should take such factors into account. PMID:21264290

  8. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    PubMed

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
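
    A minimal sketch of the two estimators being compared, assuming the Python statsmodels package is available (the record does not specify software, and in some statsmodels versions the log link class is spelled links.log() rather than links.Log()). The data-generating model below is illustrative and is not one of the study's contamination scenarios.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 5000
      x = rng.normal(size=n)
      X = sm.add_constant(x)
      true_rr = np.exp(0.3)                                  # relative risk per unit of x
      p = np.clip(0.2 * true_rr ** x, 0, 0.95)               # outcome risk on the log scale
      y = rng.binomial(1, p)

      # Robust ("modified") Poisson: Poisson family with a sandwich covariance.
      poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")

      # Log-binomial: binomial family with a log link (may fail to converge near p = 1).
      logbin = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Log())).fit()

      print("robust Poisson RR:", np.exp(poisson.params[1]))
      print("log-binomial RR:  ", np.exp(logbin.params[1]))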

  9. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: an enhanced framework must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk; validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also shown that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of the equivalent limit state.

  10. Lean Development with the Morpheus Simulation Software

    NASA Technical Reports Server (NTRS)

    Brogley, Aaron C.

    2013-01-01

    The Morpheus project is an autonomous robotic testbed currently in development at NASA's Johnson Space Center (JSC) with support from other centers. Its primary objectives are to test new 'green' fuel propulsion systems and to demonstrate the capability of the Autonomous Lander Hazard Avoidance Technology (ALHAT) sensor, provided by the Jet Propulsion Laboratory (JPL) on a lunar landing trajectory. If successful, these technologies and lessons learned from the Morpheus testing cycle may be incorporated into a landing descent vehicle used on the moon, an asteroid, or Mars. In an effort to reduce development costs and cycle time, the project employs lean development engineering practices in its development of flight and simulation software. The Morpheus simulation makes use of existing software packages where possible to reduce the development time. The development and testing of flight software occurs primarily through the frequent test operation of the vehicle and incrementally increasing the scope of the test. With rapid development cycles, risk of loss of the vehicle and loss of the mission are possible, but efficient progress in development would not be possible without that risk.

  11. Assessing influences on social vulnerability to wildfire using surveys, spatial data and wildfire simulations.

    PubMed

    Paveglio, Travis B; Edgeley, Catrin M; Stasiewicz, Amanda M

    2018-05-01

    A growing body of research focuses on identifying patterns among human populations most at risk from hazards such as wildfire and the factors that help explain performance of mitigations that can help reduce that risk. Emerging policy surrounding wildfire management emphasizes the need to better understand such social vulnerability-or human populations' potential exposure to and sensitivity from wildfire-related impacts, including their ability to reduce negative impacts from the hazard. Studies of social vulnerability to wildfire often pair secondary demographic data with a variety of vegetation and wildfire simulation models to map potential risk. However, many of the assumptions made by those researchers about the demographic, spatial or perceptual factors that influence social vulnerability to wildfire have not been fully evaluated or tested against objective measures of potential wildfire risk. The research presented here utilizes self-reported surveys, GIS data, and wildfire simulations to test the relationships between select perceptual, demographic, and property characteristics of property owners against empirically simulated metrics for potential wildfire related damages or exposure. We also evaluate how those characteristics relate to property owners' performance of mitigations or support for fire management. Our results suggest that parcel characteristics provide the most significant explanation of variability in wildfire exposure, sensitivity and overall wildfire risk, while the positive relationship between income or property values and components of social vulnerability stands in contrast to typical assumptions from existing literature. Respondents' views about agency or government management helped explain a significant amount of variance in wildfire sensitivity, while the importance of wildfire risk in selecting a residence was an important influence on mitigation action. We use these and other results from our effort to discuss updated considerations for determining social vulnerability to wildfire and articulate alternative means to collect such information. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Flight Validation of Mars Mission Technologies

    NASA Technical Reports Server (NTRS)

    Eberspeaker, P. J.

    2000-01-01

    Effective exploration and characterization of Mars will require the deployment of numerous surface probes, tethered balloon stations and free-flying balloon systems, as well as larger landers and orbiting satellite systems. Since launch opportunities exist approximately every two years, it is extremely critical that each and every mission maximize its potential for success. This will require significant testing of each system in an environment that simulates the actual operational environment as closely as possible. Analytical techniques and laboratory testing go a long way in mitigating the inherent risks associated with space exploration; however, they fall short of accurately simulating the unpredictable operational environment in which these systems must function.

  13. Efficient Maximum Likelihood Estimation for Pedigree Data with the Sum-Product Algorithm.

    PubMed

    Engelhardt, Alexander; Rieger, Anna; Tresch, Achim; Mansmann, Ulrich

    2016-01-01

    We analyze data sets consisting of pedigrees with age at onset of colorectal cancer (CRC) as phenotype. The occurrence of familial clusters of CRC suggests the existence of a latent, inheritable risk factor. We aimed to compute the probability of a family possessing this risk factor as well as the hazard rate increase for these risk factor carriers. Due to the inheritability of this risk factor, the estimation necessitates a costly marginalization of the likelihood. We propose an improved EM algorithm by applying factor graphs and the sum-product algorithm in the E-step. This reduces the computational complexity from exponential to linear in the number of family members. Our algorithm is as precise as a direct likelihood maximization in a simulation study and a real family study on CRC risk. For 250 simulated families of size 19 and 21, the runtime of our algorithm is faster by a factor of 4 and 29, respectively. On the largest family (23 members) in the real data, our algorithm is 6 times faster. We introduce a flexible and runtime-efficient tool for statistical inference in biomedical event data with latent variables that opens the door for advanced analyses of pedigree data. © 2017 S. Karger AG, Basel.

  14. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
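
    The record names R and WinBUGS; the sketch below only illustrates the central copula idea in Python: correlated standard normals are pushed through the normal CDF to uniforms and then through exponential inverse CDFs, so that two redundant components share a dependence structure. The correlation, failure rates and mission time are placeholders.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      rho = 0.6                                 # hypothetical dependence between two redundant pumps
      rate_a, rate_b = 1 / 5000.0, 1 / 7000.0   # failure rates per hour, illustrative

      # Gaussian copula: correlated normals -> uniforms -> exponential failure times.
      cov = [[1.0, rho], [rho, 1.0]]
      z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
      u = stats.norm.cdf(z)
      t_a = stats.expon.ppf(u[:, 0], scale=1 / rate_a)
      t_b = stats.expon.ppf(u[:, 1], scale=1 / rate_b)

      # Probability that both components fail within one year (8760 h) of operation.
      both_fail = np.mean((t_a < 8760) & (t_b < 8760))
      print(f"P(both fail within a year) = {both_fail:.3f}")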

  16. Using driving simulators to assess driving safety.

    PubMed

    Boyle, Linda Ng; Lee, John D

    2010-05-01

    Changes in drivers, vehicles, and roadways pose substantial challenges to the transportation safety community. Crash records and naturalistic driving data are useful for examining the influence of past or existing technology on drivers, and the associations between risk factors and crashes. However, they are limited because causation cannot be established and technology not yet installed in production vehicles cannot be assessed. Driving simulators have become an increasingly widespread tool to understand evolving and novel technologies. The ability to manipulate independent variables in a randomized, controlled setting also provides the added benefit of identifying causal links. This paper introduces a special issue on simulator-based safety studies. The special issue comprises 25 papers that demonstrate the use of driving simulators to address pressing transportation safety problems and includes topics as diverse as neurological dysfunction, work zone design, and driver distraction. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  17. Estimating the Economic Benefits of Eliminating Job Strain as a Risk Factor for Depression.

    PubMed

    Cocker, Fiona; Sanderson, Kristy; LaMontagne, Anthony D

    2017-01-01

    The aim of this study was to quantify the economic benefits of eliminating job strain as a risk factor for depression, using published population-attributable risk estimates of depression attributable to job strain (13.2% for men, 17.2% for women). Cohort simulation using state-transition Markov modeling estimated costs and health outcomes for employed persons who met criteria for lifetime DSM-IV major depression. A societal perspective over 1-year and lifetime time horizons was used. Among employed Australians, $890 million (5.8%) of the annual societal cost of depression was attributable to job strain. Employers bore the brunt of these costs, as they arose from lost productive time and increased risk of job turnover among employees experiencing depression. Proven, practicable means exist to reduce job strain. The findings demonstrate likely financial benefits to employers for expanding psychosocial risk management, providing a financial incentive to complement and reinforce legal and ethical directives.
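
    A minimal sketch of the state-transition (Markov) cohort mechanics described above. The three states, transition probabilities, annual costs, horizon and lack of discounting are placeholders for illustration only; they are not the calibrated Australian values used in the study.

      import numpy as np

      # States: 0 = well, 1 = depressed, 2 = recovered (placeholder values throughout).
      P = np.array([[0.92, 0.06, 0.02],
                    [0.00, 0.55, 0.45],
                    [0.05, 0.10, 0.85]])          # annual transition probabilities
      cost = np.array([0.0, 9000.0, 1500.0])      # annual societal cost per person

      cohort = np.array([1.0, 0.0, 0.0])          # start everyone in the "well" state
      total_cost = 0.0
      for year in range(10):                      # 10-year horizon, no discounting
          total_cost += cohort @ cost
          cohort = cohort @ P                     # advance the cohort one cycle

      print(f"expected 10-year cost per employee: ${total_cost:,.0f}")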

  18. Study and Simulation on Dynamics of a Risk-Averse Supply Chain Pricing Model with Dual-Channel and Incomplete Information

    NASA Astrophysics Data System (ADS)

    Sun, Lijian; Ma, Junhai

    Against the industrial background of dual-channel retailing and volatile consumer demand, we use bifurcation theory and numerical simulation tools to investigate the dynamic pricing game in a dual-channel supply chain with risk-averse behavior and incomplete information. Because consumer demand is volatile, we consider all players in the supply chain to be risk-averse. We assume that both a Bertrand game and a manufacturers’ Stackelberg game exist in the chain, which is closer to reality. The main objective of the paper is to investigate the complex influence of decision parameters such as wholesale price adjustment speed, risk preference and service value on the stability of the risk-averse supply chain and on the average utilities of all players. We place particular emphasis on the influence of chaos on the average utilities of all players, which has not been examined in previous studies. Dynamic phenomena such as bifurcation, chaos and sensitivity to initial values are analyzed using 2D bifurcation phase portraits, the double largest Lyapunov exponent, basins of attraction and so on. The study shows that manufacturers should slow down their wholesale price adjustment speed to obtain higher utilities; if manufacturers are willing to take on more risk, they will obtain higher profits, but they must keep their wholesale prices within a certain range in order to maintain market stability.
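
    The paper's dual-channel game cannot be reconstructed from the abstract, so the sketch below uses a generic bounded-rationality price adjustment map (a toy stand-in, essentially a rescaled logistic map) solely to show how scanning the adjustment speed reveals the period-doubling route to chaos that bifurcation diagrams and Lyapunov exponents are used to detect.

      import numpy as np

      # Toy wholesale-price adjustment: w_{t+1} = w_t + v * w_t * dPi/dw, with linear
      # demand so that dPi/dw = a + b*c - 2*b*w. Parameters are illustrative only.
      a, b, c = 4.0, 1.0, 1.0

      def long_run_prices(v, n_burn=500, n_keep=50, w0=1.2):
          w = w0
          for _ in range(n_burn):                      # discard the transient
              w = w + v * w * (a + b * c - 2 * b * w)
          kept = []
          for _ in range(n_keep):                      # sample the attractor
              w = w + v * w * (a + b * c - 2 * b * w)
              kept.append(w)
          return kept

      for v in (0.30, 0.45, 0.55):                     # price adjustment speeds
          distinct = np.unique(np.round(long_run_prices(v), 4))
          print(f"v = {v:.2f}: {len(distinct)} distinct long-run price(s)")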

  19. Improving risk classification of critical illness with biomarkers: a simulation study

    PubMed Central

    Seymour, Christopher W.; Cooke, Colin R.; Wang, Zheyu; Kerr, Kathleen F.; Yealy, Donald M.; Angus, Derek C.; Rea, Thomas D.; Kahn, Jeremy M.; Pepe, Margaret S.

    2012-01-01

    Purpose Optimal triage of patients at risk of critical illness requires accurate risk prediction, yet little data exists on the performance criteria required of a potential biomarker to be clinically useful. Materials and Methods We studied an adult cohort of non-arrest, non-trauma emergency medical services encounters transported to a hospital from 2002–2006. We simulated hypothetical biomarkers increasingly associated with critical illness during hospitalization, and determined the biomarker strength and sample size necessary to improve risk classification beyond a best clinical model. Results Of 57,647 encounters, 3,121 (5.4%) were hospitalized with critical illness and 54,526 (94.6%) without critical illness. The addition of a moderate-strength biomarker (odds ratio=3.0 for critical illness) to a clinical model improved discrimination (c-statistic 0.85 vs. 0.8, p<0.01) and reclassification (net reclassification improvement=0.15, 95%CI: 0.13,0.18), and increased the proportion of cases in the highest risk category by +8.6% (95%CI: 7.5,10.8%). Introducing correlation between the biomarker and physiological variables in the clinical risk score did not modify the results. Statistically significant changes in net reclassification required a sample size of at least 1000 subjects. Conclusions Clinical models for triage of critical illness could be significantly improved by incorporating biomarkers, yet substantial sample sizes and biomarker strength may be required. PMID:23566734
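
    A minimal sketch of the evaluation step described above: fit a clinical-only and a clinical-plus-biomarker risk model, then compare the c-statistic and a net reclassification improvement. The simulated cohort, effect sizes and the use of scikit-learn are assumptions for illustration, not the study's design (which used risk categories rather than the continuous NRI shown here).

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      n = 20_000
      clinical = rng.normal(size=n)                          # stand-in clinical risk score
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(3.0 - clinical)))   # ~5% event rate
      biomarker = rng.normal(size=n) + 1.1 * y               # "moderate strength" marker

      base = LogisticRegression().fit(clinical[:, None], y)
      full = LogisticRegression().fit(np.column_stack([clinical, biomarker]), y)
      p_base = base.predict_proba(clinical[:, None])[:, 1]
      p_full = full.predict_proba(np.column_stack([clinical, biomarker]))[:, 1]

      # Category-free (continuous) net reclassification improvement.
      nri_events = np.mean(p_full[y == 1] > p_base[y == 1]) - np.mean(p_full[y == 1] < p_base[y == 1])
      nri_nonevents = np.mean(p_full[y == 0] < p_base[y == 0]) - np.mean(p_full[y == 0] > p_base[y == 0])
      print("c-statistic:", roc_auc_score(y, p_base), "->", roc_auc_score(y, p_full))
      print("continuous NRI:", nri_events + nri_nonevents)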

  20. Score tests for independence in semiparametric competing risks models.

    PubMed

    Saïd, Mériem; Ghazzali, Nadia; Rivest, Louis-Paul

    2009-12-01

    A popular model for competing risks postulates the existence of a latent unobserved failure time for each risk. Assuming that these underlying failure times are independent is attractive since it allows standard statistical tools for right-censored lifetime data to be used in the analysis. This paper proposes simple independence score tests for the validity of this assumption when the individual risks are modeled using semiparametric proportional hazards regressions. It assumes that covariates are available, making the model identifiable. The score tests are derived for alternatives that specify that copulas are responsible for a possible dependency between the competing risks. The test statistics are constructed by adding to the partial likelihoods for the individual risks an explanatory variable for the dependency between the risks. A variance estimator is derived by writing the score function and the Fisher information matrix for the marginal models as stochastic integrals. Pitman efficiencies are used to compare test statistics. A simulation study and a numerical example illustrate the methodology proposed in this paper.

  1. High-Fidelity Multi-Rotor Unmanned Aircraft System Simulation Development for Trajectory Prediction Under Off-Nominal Flight Dynamics

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Hartman, David C.

    2017-01-01

    The NASA Unmanned Aircraft System (UAS) Traffic Management (UTM) project is conducting research to enable civilian low-altitude airspace and UAS operations. A goal of this project is to develop probabilistic methods to quantify risk during failures and off nominal flight conditions. An important part of this effort is the reliable prediction of feasible trajectories during off-nominal events such as control failure, atmospheric upsets, or navigation anomalies that can cause large deviations from the intended flight path or extreme vehicle upsets beyond the normal flight envelope. Few examples of high-fidelity modeling and prediction of off-nominal behavior for small UAS (sUAS) vehicles exist, and modeling requirements for accurately predicting flight dynamics for out-of-envelope or failure conditions are essentially undefined. In addition, the broad range of sUAS aircraft configurations already being fielded presents a significant modeling challenge, as these vehicles are often very different from one another and are likely to possess dramatically different flight dynamics and resultant trajectories and may require different modeling approaches to capture off-nominal behavior. NASA has undertaken an extensive research effort to define sUAS flight dynamics modeling requirements and develop preliminary high fidelity six degree-of-freedom (6-DOF) simulations capable of more closely predicting off-nominal flight dynamics and trajectories. This research has included a literature review of existing sUAS modeling and simulation work as well as development of experimental testing methods to measure and model key components of propulsion, airframe and control characteristics. The ultimate objective of these efforts is to develop tools to support UTM risk analyses and for the real-time prediction of off-nominal trajectories for use in the UTM Risk Assessment Framework (URAF). This paper focuses on modeling and simulation efforts for a generic quad-rotor configuration typical of many commercial vehicles in use today. An overview of relevant off-nominal multi-rotor behaviors will be presented to define modeling goals and to identify the prediction capability lacking in simplified models of multi-rotor performance. A description of recent NASA wind tunnel testing of multi-rotor propulsion and airframe components will be presented illustrating important experimental and data acquisition methods, and a description of preliminary propulsion and airframe models will be presented. Lastly, examples of predicted off-nominal flight dynamics and trajectories from the simulation will be presented.

  2. Percutaneous dilational tracheostomy (PDT) and prevention of blood aspiration with superimposed high-frequency jet ventilation (SHFJV) using the tracheotomy-endoscope (TED): results of numerical and experimental simulations.

    PubMed

    Nowak, Andreas; Langebach, Robin; Klemm, Eckart; Heller, Winfried

    2012-04-01

    We describe an innovative computer-based method for the analysis of gas flow using a modified airway management technique to perform percutaneous dilatational tracheotomy (PDT) with a rigid tracheotomy endoscope (TED). A test lung was connected via an artificial trachea with the tracheotomy endoscope and ventilated using superimposed high-frequency jet ventilation. Red packed cells were instilled during the puncture phase of a simulated percutaneous tracheotomy in a trachea model and migration of the red packed cells during breathing was continuously measured. Simultaneously, the calculation of the gas-flow within the endoscope was numerically simulated. In the experimental study, no backflow of blood occurred during the use of superimposed high-frequency jet ventilation (SHFJV) from the trachea into the endoscope nor did any transportation of blood into the lower respiratory tract occur. In parallel, the numerical simulations of the openings of TED show almost positive volume flows. Under the conditions investigated there is no risk of blood aspiration during PDT using the TED and simultaneous ventilation with SHFJV. In addition, no risk of impairment of endoscopic visibility exists through a backflow of blood into the TED. The method of numerical simulation offers excellent insight into the fluid flow even under highly transient conditions like jet ventilation.

  3. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194
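
    A minimal sketch of the Monte Carlo step described in (d) and (e): simulate exposure doses for many modeled individuals, extract the 99.9% quantile, and compare it to a chronic toxicity reference value. Every distribution, the encounter probability and the reference value below are placeholders, not the assessment's parameters.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 500_000                                   # number of modeled sea otters, as in the record

      # Placeholder exposure components (mg PAH per kg body weight per day).
      prey_dose = rng.lognormal(mean=-9.0, sigma=1.0, size=n)    # ingestion via prey
      pit_dose = rng.lognormal(mean=-10.0, sigma=1.2, size=n)    # incidental ingestion while digging pits
      encounter = rng.random(n) < 0.05                           # probability of encountering SSOR

      total_dose = prey_dose + np.where(encounter, pit_dose, 0.0)
      q999 = np.quantile(total_dose, 0.999)         # the 99.9% most-exposed individuals

      trv = 1.0e-3                                  # hypothetical chronic toxicity reference value
      print(f"99.9% quantile dose = {q999:.2e} mg/kg-day, hazard quotient = {q999 / trv:.2f}")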

  4. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    NASA Astrophysics Data System (ADS)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state-of-the-art in line with other scientific research efforts.

  5. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    PubMed

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating covariate-specific hazard function from population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze two real world biomedical study data where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate superiority of SVHM in distinguishing high risk versus low risk subjects.

  6. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe-segments, a Pipe Wrap Protection Model; and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
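
    The ARC model itself is not given in the abstract; the sketch below only illustrates how combining simple corrosion, pipe-wrap and stress submodels in a Monte Carlo simulation yields a distribution-based failure probability. All distributions, the wrap effectiveness factor and the 40-year horizon are hypothetical.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 200_000

      # Placeholder distributions for a buried pipe segment (not the ARC model's values).
      wall0 = rng.normal(7.0, 0.3, n)            # initial wall thickness, mm
      rate = rng.lognormal(np.log(0.08), 0.5, n) # corrosion rate, mm/year
      wrap_ok = rng.random(n) < 0.7              # pipe-wrap protection intact?
      rate = np.where(wrap_ok, 0.2 * rate, rate) # intact wrap slows corrosion (assumed factor)
      years = 40.0

      remaining = wall0 - rate * years
      required = rng.normal(3.0, 0.2, n)         # wall needed to carry the pressure stress, mm

      p_fail = np.mean(remaining < required)
      print(f"estimated 40-year failure probability: {p_fail:.4f}")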

  7. Numerical aerodynamic simulation facility feasibility study, executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    There were three major issues examined in the feasibility study. First, the ability of the proposed system architecture to support the anticipated workload was evaluated. Second, the throughput of the computational engine (the flow model processor) was studied using real application programs. Third, the availability, reliability, and maintainability of the system were modeled. The evaluations were based on the baseline systems. The results show that the implementation of the Numerical Aerodynamic Simulation Facility, in the form considered, would indeed be a feasible project with an acceptable level of risk. The technology required (both hardware and software) either already exists or, in the case of a few parts, is expected to be announced this year.

  8. A nonlocal spatial model for Lyme disease

    NASA Astrophysics Data System (ADS)

    Yu, Xiao; Zhao, Xiao-Qiang

    2016-07-01

    This paper is devoted to the study of a nonlocal and time-delayed reaction-diffusion model for Lyme disease with a spatially heterogeneous structure. In the case of a bounded domain, we first prove the existence of the positive steady state and a threshold type result for the disease-free system, and then establish the global dynamics for the model system in terms of the basic reproduction number. In the case of an unbound domain, we obtain the existence of the disease spreading speed and its coincidence with the minimal wave speed. At last, we use numerical simulations to verify our analytic results and investigate the influence of model parameters and spatial heterogeneity on the disease infection risk.

  9. Modeling of beam-induced damage of the LHC tertiary collimators

    NASA Astrophysics Data System (ADS)

    Quaranta, E.; Bertarelli, A.; Bruce, R.; Carra, F.; Cerutti, F.; Lechner, A.; Redaelli, S.; Skordis, E.; Gradassi, P.

    2017-09-01

    Modern hadron machines with high beam intensity may suffer from material damage in the case of large beam losses and even beam-intercepting devices, such as collimators, can be harmed. A systematic method to evaluate thresholds of damage owing to the impact of high energy particles is therefore crucial for safe operation and for predicting possible limitations in the overall machine performance. For this, a three-step simulation approach is presented, based on tracking simulations followed by calculations of energy deposited in the impacted material and hydrodynamic simulations to predict the thermomechanical effect of the impact. This approach is applied to metallic collimators at the CERN Large Hadron Collider (LHC), which in standard operation intercept halo protons, but risk to be damaged in the case of extraction kicker malfunction. In particular, tertiary collimators protect the aperture bottlenecks, their settings constrain the reach in β* and hence the achievable luminosity at the LHC experiments. Our calculated damage levels provide a very important input on how close to the beam these collimators can be operated without risk of damage. The results of this approach have been used already to push further the performance of the present machine. The risk of damage is even higher in the upgraded high-luminosity LHC with higher beam intensity, for which we quantify existing margins before equipment damage for the proposed baseline settings.

  10. Cyclic subway networks are less risky in metropolises

    NASA Astrophysics Data System (ADS)

    Xiao, Ying; Zhang, Hai-Tao; Xu, Bowen; Zhu, Tao; Chen, Guanrong; Chen, Duxin

    2018-02-01

    Subways are crucial in modern transportation systems of metropolises. To quantitatively evaluate the potential risks of subway networks suffered from natural disasters or deliberate attacks, real data from seven Chinese subway systems are collected and their population distributions and anti-risk capabilities are analyzed. Counterintuitively, it is found that transfer stations with large numbers of connections are not the most crucial, but the stations and lines with large betweenness centrality are essential, if subway networks are being attacked. It is also found that cycles reduce such correlations due to the existence of alternative paths. To simulate the data-based observations, a network model is proposed to characterize the dynamics of subway systems under various intensities of attacks on stations and lines. This study sheds some light onto risk assessment of subway networks in metropolitan cities.
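
    A minimal sketch, using networkx on a made-up toy network rather than the seven real subway systems, of the kind of analysis described: rank stations by betweenness centrality, remove the top-ranked one, and observe how a cycle providing an alternative path limits the loss of connectivity.

      import networkx as nx

      # Toy network: two lines crossing at transfer station "T", plus a short cycle
      # (A3-C1-B3) that offers an alternative path around T.
      G = nx.Graph()
      G.add_edges_from([("A1", "A2"), ("A2", "T"), ("T", "A3"), ("A3", "A4"),
                        ("B1", "B2"), ("B2", "T"), ("T", "B3"), ("B3", "B4"),
                        ("A3", "C1"), ("C1", "B3")])

      bc = nx.betweenness_centrality(G)
      target = max(bc, key=bc.get)                 # station most critical for shortest paths
      print("attacked station:", target)

      G.remove_node(target)
      largest = max(nx.connected_components(G), key=len)
      print(f"largest connected component: {len(largest)} of {G.number_of_nodes()} remaining stations")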

  11. Manufacture of Lunar Regolith Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, D. L.; Wilson, S. A.; Stoeser, D. B.; Weinstein, M. A.; Edmunson, J. E.

    2013-01-01

    The manufacture of lunar regolith simulants can use many technologies unfamiliar to the aerospace industry. Many of these technologies are extensively used in the mining industry. Rock crushing, grinding, process control as a function of particle size, as well as other essential concepts are explained here. Notes are provided on special considerations necessary, given the unusual nature of the desired final product. For example, wet grinding, which is an industry norm, can alter the behavior of simulant materials. As the geologic materials used for simulants can contain minerals such as quartz and pyrite, guidance is provided regarding concepts, risks, measurement, and handling. Extractive metallurgy can be used to produce high-grade components for subsequent manufacture, reducing the compromises inherent in using just rock. Several of the components needed in simulants such as glasses, agglutinates, and breccias are simply not available or not reasonably matched by existing terrestrial resources. Therefore, techniques to produce these in useful quantities were developed and used. Included in this list is the synthesis of specific minerals. The manufacture of two simulants, NU-LHT-1M and NU-LHT-2M, is covered in detail.

  12. Role of in-situ simulation for training in healthcare: opportunities and challenges.

    PubMed

    Kurup, Viji; Matei, Veronica; Ray, Jessica

    2017-12-01

    Simulation has now been acknowledged as an important part of training in healthcare, and most academic hospitals have a dedicated simulation center. In-situ simulation occurs in patient care units with scenarios involving healthcare professionals in their actual working environment. The purpose of this review is to describe the process of putting together the components of in-situ simulation for training programs and to review outcomes studied, and challenges with this approach. In-situ simulation has been used to 'test-drive' new centers, train personnel in new procedures in existing centers, for recertification training and to uncover latent threats in clinical care areas. It has also emerged as an attractive alternative to traditional simulations for institutions that do not have their own simulation center. In-situ simulation can be used to improve reliability and safety especially in areas of high risk, and in high-stress environments. It is also a reasonable and attractive alternative for programs that want to conduct interdisciplinary simulations for their trainees and faculty, and for those who do not have access to a fully functional simulation center. Further research needs to be done in assessing effectiveness of training using this method and the effect of such training on clinical outcomes.

  13. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include ventilation rate basis on area or volume, as well as a ceiling offset which seems ineffective at protecting against flammable gas concentrations. ACKNOWLEDGEMENTS The authors gratefully acknowledge Bill Houf (SNL -- Retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.

  14. An Investigation of the Electrical Short Circuit Characteristics of Tin Whiskers

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.

    2008-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. A FIB cross-section of the card guides verified that the tin finish was bright tin.
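
    The empirical model itself is not reproduced in the abstract; as one hedged illustration of how a probability-of-short-versus-voltage curve can be fitted, the sketch below runs a logistic regression on hypothetical bench observations (synthetic data, scikit-learn assumed available). The functional form may differ from the model actually developed in the experiment.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      # Hypothetical bench data: applied voltage (V) and whether a sustained short occurred.
      voltage = rng.uniform(0.0, 50.0, size=300)
      p_true = 1.0 / (1.0 + np.exp(-(voltage - 20.0) / 4.0))   # assumed underlying curve
      short = rng.binomial(1, p_true)

      model = LogisticRegression().fit(voltage[:, None], short)
      for v in (5.0, 15.0, 25.0, 40.0):
          p = model.predict_proba([[v]])[0, 1]
          print(f"P(short | {v:>4.1f} V) = {p:.2f}")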

  15. Evaluation of an avatar-based training program to promote suicide prevention awareness in a college setting.

    PubMed

    Rein, Benjamin A; McNeil, Daniel W; Hayes, Allison R; Hawkins, T Anne; Ng, H Mei; Yura, Catherine A

    2018-07-01

    Training programs exist that prepare college students, faculty, and staff to identify and support students potentially at risk for suicide. Kognito is an online program that trains users through simulated interactions with virtual humans. This study evaluated Kognito's effectiveness in preparing users to intervene with at-risk students. Training was completed by 2,727 university students, faculty, and staff from April, 2014 through September, 2015. Voluntary and mandatory participants at a land-grant university completed Kognito modules designed for higher education, along with pre- and post-assessments. All modules produced significant gains in reported Preparedness, Likelihood, and Self-Efficacy in intervening with troubled students. Despite initial disparities in reported abilities, after training participants reported being similarly capable of assisting at-risk students, including LGBTQ and veteran students. Kognito training appears to be effective, on a large scale, in educating users to act in a facilitative role for at-risk college students.

  16. Effects of protection forests on rockfall risks: implementation in the Swiss risk concept

    NASA Astrophysics Data System (ADS)

    Trappmann, Daniel; Moos, Christine; Fehlmann, Michael; Ernst, Jacqueline; Sandri, Arthur; Dorren, Luuk; Stoffel, Markus

    2016-04-01

    Forests growing on slopes below active rockfall cliffs can provide effective protection for human lives and infrastructures. The risk-based approach for natural hazards in Switzerland shall take such biological measures just like existing technical protective measures into account, provided that certain criteria regarding condition, maintenance and durability are met. This contribution describes a project in which we are investigating how the effects of protection forests can be considered in rockfall risk analyses in an appropriate way. In principle, protection forests reduce rockfall risks in three different ways: (i) reduction of the event magnitude (energy) due to collisions with tree stems; (ii) reduction of frequency of occurrence of a given scenario (block volume arriving at the damage potential); (iii) reduction of spatial probability of occurrence (spread and runout) of a given scenario in case of multiple fragments during one event. The aim of this work is to develop methods for adequately implementing these three effects of rockfall protection forests in risk calculations. To achieve this, we use rockfall simulations taking collisions with trees into account and detailed field validation. On five test sites, detailed knowledge on past rockfall activity is gathered by combining investigations of impacted trees, analysis of documented historical events, and deposits in the field. Based on this empirical data on past rockfalls, a methodology is developed that allows transferring real past rockfall activity to simulation results obtained with the three-dimensional, process-based model Rockyfor3D. Different ways of quantifying the protective role of forests will be considered by comparing simulation results with and without forest cover. Combining these different research approaches, systematic considerations shall lead to the development of methods for adequate inclusion of the protective effects of forests in risk calculations. The applicability of the developed methods will be tested on the case study slopes in order to ensure practical applicability to a broad range of rockfall situations on forested slopes.

  17. How to Decide? Multi-Objective Early-Warning Monitoring Networks for Water Suppliers

    NASA Astrophysics Data System (ADS)

    Bode, Felix; Loschko, Matthias; Nowak, Wolfgang

    2015-04-01

    Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources, which cannot be eliminated, especially in urban regions. As a matter of risk control, all these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts can help to find promising low-cost monitoring network designs. In this work we develop a concept to plan monitoring networks using multi-objective optimization. The objectives considered are to maximize the probability of detecting all contaminations, to maximize the early-warning time before detected contaminations reach the drinking water well, and to minimize the installation and operating costs of the monitoring network. Using multi-objective optimization, we avoid having to weight these objectives into a single objective function. The objectives are clearly competing, and it is impossible to know their mutual trade-offs beforehand - each catchment differs in many respects, and knowledge can hardly be transferred between geological formations and risk inventories. To make our optimization results more specific to the type of risk inventory in different catchments, we prioritize all known risk sources. Because the required data are lacking, a quantitative risk ranking is impossible; instead, we use a qualitative ranking to prioritize the known risk sources for monitoring. Additionally, we allow for the existence of unknown risk sources that are totally uncertain in location and in their inherent risk. Since they can neither be located nor ranked, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four categories: severe, medium and tolerable for known risk sources, and an extra category for the unknown ones. With that, early-warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks valid for controlling the top risk sources, and evaluate the capabilities (or search for least-cost upgrades) to also cover moderate, tolerable and unknown risk sources. Monitoring networks that are valid for the remaining risk also cover all other risk sources, but only with a relatively poor early-warning time. The data provided to the optimization algorithm are calculated in a preprocessing step by a flow and transport model. It simulates which potential contaminant plumes from the risk sources would be detectable where and when by all possible candidate positions for monitoring wells. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte Carlo simulations. These include uncertainty in the ambient flow direction of the groundwater, uncertainty in the conductivity field, and different scenarios for the pumping rates of the production wells. To avoid numerical dispersion during the transport simulations, we use particle-tracking random walk methods when simulating transport.
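
    As an illustration of the multi-objective selection step described above, the following sketch brute-forces the Pareto-optimal designs over small candidate well sets; the detection times, plume arrival times and well costs are invented stand-ins for the flow-and-transport preprocessing, and none of the numbers come from the study.

      import itertools
      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-in for the preprocessing step: detection time [days] of each Monte Carlo
      # plume (rows) at each candidate monitoring well (columns); np.inf = never detected.
      n_plumes, n_candidates = 200, 12
      detect_time = np.where(rng.random((n_plumes, n_candidates)) < 0.4,
                             rng.uniform(50.0, 400.0, (n_plumes, n_candidates)), np.inf)
      arrival_at_well = rng.uniform(300.0, 600.0, n_plumes)   # plume arrival at the production well
      well_cost = rng.uniform(20.0, 60.0, n_candidates)       # cost per monitoring well

      def objectives(design):
          t = detect_time[:, list(design)].min(axis=1)         # first detection per plume
          detected = np.isfinite(t)
          p_detect = detected.mean()
          warning = (arrival_at_well[detected] - t[detected]).mean() if detected.any() else 0.0
          return p_detect, warning, well_cost[list(design)].sum()

      def dominates(a, b):
          # maximize detection probability and early-warning time, minimize cost
          no_worse = a[0] >= b[0] and a[1] >= b[1] and a[2] <= b[2]
          better = a[0] > b[0] or a[1] > b[1] or a[2] < b[2]
          return no_worse and better

      designs = [d for k in (1, 2, 3) for d in itertools.combinations(range(n_candidates), k)]
      scored = [(d, objectives(d)) for d in designs]
      pareto = [d for d, s in scored if not any(dominates(s2, s) for _, s2 in scored)]
      print(f"{len(pareto)} non-dominated designs out of {len(scored)} candidates")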

  18. Lipid Adjustment for Chemical Exposures: Accounting for Concomitant Variables

    PubMed Central

    Li, Daniel; Longnecker, Matthew P.; Dunson, David B.

    2013-01-01

    Background: Some environmental chemical exposures are lipophilic and need to be adjusted by serum lipid levels before data analyses. There are currently various strategies that attempt to account for this problem, but all have their drawbacks. To address such concerns, we propose a new method that uses Box-Cox transformations and a simple Bayesian hierarchical model to adjust for lipophilic chemical exposures. Methods: We compared our Box-Cox method to existing methods. We ran simulation studies in which increasing levels of lipid-adjusted chemical exposure did and did not increase the odds of having a disease, and we looked at both single-exposure and multiple-exposure cases. We also analyzed an epidemiologic dataset that examined the effects of various chemical exposures on the risk of birth defects. Results: Compared with existing methods, our Box-Cox method produced unbiased estimates, good coverage, similar power, and lower type-I error rates. This was the case in both single- and multiple-exposure simulation studies. Results from analysis of the birth-defect data differed from results using existing methods. Conclusion: Our Box-Cox method is a novel and intuitive way to account for the lipophilic nature of certain chemical exposures. It addresses some of the problems with existing methods, is easily extendable to multiple exposures, and can be used in any analyses that involve concomitant variables. PMID:24051893
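
    The sketch below only illustrates the covariate-adjustment idea with Box-Cox transformed variables on simulated data; it uses an ordinary logistic regression rather than the paper's Bayesian hierarchical model, and all variable names and parameter values are assumptions.

      import numpy as np
      from scipy.stats import boxcox
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 1000
      lipids = rng.lognormal(mean=1.8, sigma=0.3, size=n)              # hypothetical serum lipids
      exposure = rng.lognormal(mean=0.0, sigma=0.5, size=n) * lipids   # lipophilic exposure tracks lipids
      logit = -2.0 + 0.8 * np.log(exposure / lipids)                   # "true" risk driven by lipid-adjusted dose
      disease = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      # Instead of dividing the exposure by lipids, keep lipids as a concomitant
      # variable: Box-Cox transform both and adjust by covariate in the outcome model.
      exposure_bc, lam_e = boxcox(exposure)
      lipids_bc, lam_l = boxcox(lipids)
      X = np.column_stack([exposure_bc, lipids_bc])
      model = LogisticRegression().fit(X, disease)
      print("Box-Cox lambdas: exposure %.2f, lipids %.2f" % (lam_e, lam_l))
      print("coefficients (exposure, lipids):", model.coef_[0])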

  19. Study on Market Stability and Price Limit of Chinese Stock Index Futures Market: An Agent-Based Modeling Perspective.

    PubMed

    Xiong, Xiong; Nan, Ding; Yang, Yang; Yongjie, Zhang

    2015-01-01

    This paper explores a method of managing the risk of the stock index futures market and the cross-market through analyzing the effectiveness of price limits on the Chinese Stock Index 300 (CSI 300) futures market. We adopt a cross-market artificial financial market (including the stock market and the stock index futures market) as a platform on which to simulate the operation of the CSI 300 futures market under different price limit settings. After comparing market stability under different price limits using appropriate liquidity and volatility indicators, we find that both tightening and removing price limits have a negative impact on market stability. In contrast, market stability benefits if the existing price limit is maintained (up 10%, down 10%) or broadened to a proper extent. Our study provides reasonable advice for price limit setting and risk management for CSI 300 futures.

  20. Study on Market Stability and Price Limit of Chinese Stock Index Futures Market: An Agent-Based Modeling Perspective

    PubMed Central

    2015-01-01

    This paper explores a method of managing the risk of the stock index futures market and the cross-market through analyzing the effectiveness of price limits on the Chinese Stock Index 300 (CSI 300) futures market. We adopt a cross-market artificial financial market (including the stock market and the stock index futures market) as a platform on which to simulate the operation of the CSI 300 futures market under different price limit settings. After comparing market stability under different price limits using appropriate liquidity and volatility indicators, we find that both tightening and removing price limits have a negative impact on market stability. In contrast, market stability benefits if the existing price limit is maintained (up 10%, down 10%) or broadened to a proper extent. Our study provides reasonable advice for price limit setting and risk management for CSI 300 futures. PMID:26571135
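
    The study's cross-market agent-based model is not reproduced here; as a much simpler stand-in, the sketch below clips a random-walk daily return at a configurable limit and compares a crude volatility indicator with and without limits (all parameters are assumptions).

      import numpy as np

      rng = np.random.default_rng(7)

      def realized_volatility(n_days=250, limit=0.10, sigma=0.03, jump_prob=0.02):
          price, closes = 100.0, []
          for _ in range(n_days):
              r = rng.normal(0.0, sigma)
              if rng.random() < jump_prob:                  # occasional shock
                  r += rng.choice([-1.0, 1.0]) * rng.uniform(0.05, 0.20)
              if limit is not None:                         # daily price limit, e.g. +/- 10%
                  r = np.clip(r, -limit, limit)
              price *= 1.0 + r
              closes.append(price)
          return np.std(np.diff(np.log(closes)))

      print("volatility with 10% limit :", realized_volatility(limit=0.10))
      print("volatility with 20% limit :", realized_volatility(limit=0.20))
      print("volatility without limits :", realized_volatility(limit=None))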

  1. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  2. Adverse fetal outcome in road accidents: Injury mechanism study and injury criteria development in a pregnant woman finite element model.

    PubMed

    Auriault, F; Thollon, L; Pérès, J; Behr, M

    2016-12-01

    This study documents the development of predictors of adverse fetal outcome dedicated to the analysis of road accidents involving pregnant women. To do so, a pre-existing whole-body finite element model representative of a 50th percentile, 26-week pregnant woman was used. A total of 8 accident scenarios were simulated with the model positioned on a sled. Each of these scenarios was associated with a risk of adverse fetal outcome based on results from real car crash investigations involving pregnant women reported in the literature. The use of airbags and accidents involving unbelted occupants were not considered in this study. Several potential predictors of adverse fetal outcome were then evaluated with regard to their correlation with this risk of fetal injury. Three predictors appeared strongly correlated with the risk of adverse fetal outcome: (1) the intrauterine pressure at the placental fetal-side area (r=0.92), (2) the fetal head acceleration (HIC) (r=0.99), and (3) the area of the utero-placental interface exceeding a strain threshold (r=0.90). Finally, a sensitivity analysis against slight variations of the simulation parameters was performed to assess the robustness of these criteria. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Integrative genetic risk prediction using non-parametric empirical Bayes classification.

    PubMed

    Zhao, Sihai Dave

    2017-06-01

    Genetic risk prediction is an important component of individualized medicine, but prediction accuracies remain low for many complex diseases. A fundamental limitation is the sample sizes of the studies on which the prediction algorithms are trained. One way to increase the effective sample size is to integrate information from previously existing studies. However, it can be difficult to find existing data that examine the target disease of interest, especially if that disease is rare or poorly studied. Furthermore, individual-level genotype data from these auxiliary studies are typically difficult to obtain. This article proposes a new approach to integrative genetic risk prediction of complex diseases with binary phenotypes. It accommodates possible heterogeneity in the genetic etiologies of the target and auxiliary diseases using a tuning parameter-free non-parametric empirical Bayes procedure, and can be trained using only auxiliary summary statistics. Simulation studies show that the proposed method can provide superior predictive accuracy relative to non-integrative as well as integrative classifiers. The method is applied to a recent study of pediatric autoimmune diseases, where it substantially reduces prediction error for certain target/auxiliary disease combinations. The proposed method is implemented in the R package ssa. © 2016, The International Biometric Society.

  4. Study on the flood simulation techniques for estimation of health risk in Dhaka city, Bangladesh

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Suetsugi, T.; Sunada, K.; ICRE

    2011-12-01

    Although some studies have been carried out on the spread of infectious disease with flooding, the relation between flooding and the expansion of infectious disease has not yet been clarified. The improvement of the calculation precision of inundation and its relation with infectious disease, surveyed epidemiologically, are therefore investigated in a case study in Dhaka city, Bangladesh. The inundation was computed using a numerical 2D flood simulation model. Because of the lack of a digital data set related to flood simulation, the sensitivity of the simulated inundation to hydraulic factors such as the drainage channel, the dike, and the building occupied ratio was examined. Each element was incorporated progressively into the flood simulation model, and the results were compared with the inundation classification from an existing study (Mollah et al., 2007). The results show that the "dike" and "drainage channel" factors have a remarkable influence on water levels near each facility. The inundation level and duration are influenced over wide areas when the "building occupied ratio" is also considered. A correlation between maximum inundation depth and health risk (DALY, mortality, morbidity) was found, but the inundation model has not yet been validated for this case; it needs to be validated against observed inundation depths. Drainage facilities such as the sewer network and the pumping system will also be considered in further research to improve the precision of the inundation model.

  5. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes involved, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling when simulating hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
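
    A minimal sketch of the coupling idea, with invented numbers: the void ratio of a representative fault element is tabulated against the reference pressure from a single base run, fitted with a polynomial, and then used to update porosity and permeability inside the flow simulator; the Kozeny-Carman-type permeability scaling is an assumption, not taken from the paper.

      import numpy as np

      # Hypothetical output of the single preliminary base simulation: void ratio of a
      # representative fault element versus the pore pressure at the fault base (MPa).
      p_ref_base = np.array([10.0, 12.0, 14.0, 16.0, 18.0, 20.0])
      void_ratio_base = np.array([0.150, 0.152, 0.156, 0.163, 0.174, 0.190])

      # "Parametrization of the resulting functions": approximate e(p_ref) with a polynomial.
      coeffs = np.polyfit(p_ref_base, void_ratio_base, deg=2)

      def update_fault_properties(p_ref, k0=1e-16, e0=0.150):
          e = np.polyval(coeffs, p_ref)
          porosity = e / (1.0 + e)
          # Kozeny-Carman-type scaling of permeability with void ratio (assumed relation).
          permeability = k0 * (e / e0) ** 3 * (1.0 + e0) / (1.0 + e)
          return porosity, permeability

      for p in (12.0, 16.0, 20.0):  # reference pressures reached during injection
          phi, k = update_fault_properties(p)
          print(f"p_ref = {p:4.1f} MPa -> porosity = {phi:.4f}, permeability = {k:.3e} m2")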

  6. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  7. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  8. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the "classical" risk assessment approach with the model-based approach. These comparisons showed that TK and TK-TD models can bring more realism to the risk assessment through the possibility to study realistic exposure scenarios and to simulate relevant mechanisms of effects (including delayed toxicity and recovery). Noticeably, using TK-TD models is currently the most relevant way to directly connect realistic exposure patterns to effects. We conclude with recommendations on how to properly use TK and TK-TD model in acute risk assessment for vertebrates. © 2015 SETAC.
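
    For readers unfamiliar with the model class, the sketch below couples a one-compartment toxicokinetic model to a GUTS-like stochastic-death toxicodynamic model for a pulsed exposure profile; it is a generic illustration with assumed rate constants, not the calibrated bird, mammal or fish models of the case study.

      import numpy as np

      def tk_td_survival(c_ext, dt=0.1, k_in=0.5, k_out=0.3, threshold=2.0, kill_rate=0.05):
          """c_ext: external concentration per time step; returns internal concentration and survival."""
          c_int = np.zeros(len(c_ext))
          survival = np.ones(len(c_ext))
          hazard = 0.0
          for t in range(1, len(c_ext)):
              # toxicokinetics: uptake minus elimination (explicit Euler step)
              c_int[t] = c_int[t - 1] + dt * (k_in * c_ext[t - 1] - k_out * c_int[t - 1])
              # toxicodynamics: hazard accrues only while the internal threshold is exceeded
              hazard += dt * kill_rate * max(c_int[t] - threshold, 0.0)
              survival[t] = np.exp(-hazard)
          return c_int, survival

      time = np.arange(0.0, 21.0, 0.1)                  # days
      c_ext = np.where((time % 10.0) < 1.0, 8.0, 0.0)   # two 1-day exposure pulses, 10 days apart
      c_int, survival = tk_td_survival(c_ext)
      print("minimum survival probability: %.3f" % survival.min())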

  9. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.
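
    As a hedged illustration of how such an empirical model could feed a risk simulation, the sketch below fits a logistic curve of short-circuit probability versus voltage to invented bench data and then uses it in a simple Monte Carlo of whisker bridges per card; none of the numbers reflect the experiment's actual data.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(3)

      def p_short(v, v50, slope):
          # logistic probability that a whisker bridging two conductors shorts at voltage v
          return 1.0 / (1.0 + np.exp(-(v - v50) / slope))

      # Hypothetical bench data: test voltage and fraction of whisker bridges that shorted
      volts = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 16.0, 20.0, 28.0])
      frac_shorted = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.75, 0.88, 0.97])
      (v50, slope), _ = curve_fit(p_short, volts, frac_shorted, p0=[10.0, 3.0])

      # Plug the fitted curve into a toy risk simulation at a 12 V bus voltage.
      n_bridges = rng.poisson(5.0, size=10_000)               # whisker bridges per card (assumed)
      shorts = rng.binomial(n_bridges, p_short(12.0, v50, slope))
      print(f"fitted V50 = {v50:.1f} V; P(at least one short per card at 12 V) = {(shorts > 0).mean():.3f}")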

  10. Numerical aerodynamic simulation facility feasibility study

    NASA Technical Reports Server (NTRS)

    1979-01-01

    There were three major issues examined in the feasibility study. First, the ability of the proposed system architecture to support the anticipated workload was evaluated. Second, the throughput of the computational engine (the flow model processor) was studied using real application programs. Third, the availability, reliability, and maintainability of the system were modeled. The evaluations were based on the baseline systems. The results show that the implementation of the Numerical Aerodynamic Simulation Facility, in the form considered, would indeed be a feasible project with an acceptable level of risk. The technology required (both hardware and software) either already exists or, in the case of a few parts, is expected to be announced this year. Facets of the work described include the hardware configuration, software, user language, and fault tolerance.

  11. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters had a large impact on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
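
    A toy version of the idea (not the published individual-based model): each plant carries a normally distributed resistance trait, survival under the herbicide is a logistic function of that trait, and the trait mean shifts each generation by the heritability times the selection differential; all parameter values are assumptions.

      import numpy as np

      rng = np.random.default_rng(11)

      def simulate_resistance(generations=15, pop_size=5000, h2=0.5,
                              trait_sd=1.0, dose_kill=0.98):
          mean_z, history = 0.0, []
          for _ in range(generations):
              z = rng.normal(mean_z, trait_sd, pop_size)
              base_survival = 1.0 - dose_kill                   # survival of a fully susceptible plant
              survival = base_survival + (1.0 - base_survival) / (1.0 + np.exp(-(z - 3.0)))
              survivors = z[rng.random(pop_size) < survival]
              if survivors.size == 0:
                  break
              # response to selection: offspring mean moves by h2 * selection differential
              mean_z += h2 * (survivors.mean() - mean_z)
              history.append(mean_z)
          return history

      print(["%.2f" % m for m in simulate_resistance()])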

  12. An example of population-level risk assessments for small mammals using individual-based population models.

    PubMed

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus

    2016-01-01

    This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first-tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher-tier risk assessment). Despite differing internal model design and scenarios, results indicated low population sensitivity in all 3 cases unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations on the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust numbers of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages for higher-tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios. © 2015 SETAC.

  13. Predicting hydrological and erosional risks in fire-affected watersheds: recent advances and research gaps

    NASA Astrophysics Data System (ADS)

    Nunes, João Pedro; Keizer, Jan Jacob

    2017-04-01

    Models can be invaluable tools to assess and manage the impacts of forest fires on hydrological and erosion processes. Immediately after fires, models can be used to identify priority areas for post-fire interventions or to assess the risks of flooding and downstream contamination. In the long term, models can be used to evaluate the long-term implications of a fire regime for soil protection, surface water quality and potential management risks, or to determine how changes to fire regimes, caused e.g. by climate change, can impact soil and water quality. However, several challenges make post-fire modelling particularly difficult:
    • Fires change vegetation cover and properties, for example by changing soil water repellency or by adding an ash layer over the soil; these processes, however, are not described in currently used models, so existing models need to be modified and tested.
    • Vegetation and soils recover with time since fire, changing important model parameters, so the recovery processes themselves also need to be simulated, including the role of post-fire interventions.
    • During the window of vegetation and soil disturbance, particular weather conditions, such as severe droughts or extreme rainfall events, can have a large impact on the amount of runoff and erosion produced in burnt areas, so models that smooth out these peak responses and instead simulate "long-term" average processes are less useful.
    • While existing models can simulate reasonably well slope-scale runoff generation and the associated sediment losses and their catchment-scale routing, few models can accommodate the role of the ash layer or its transport by overland flow, in spite of its importance for soil fertility losses and downstream contamination.
    This presentation will provide an overview of the importance of post-fire hydrological and erosion modelling as well as of the challenges it faces and of recent efforts made to overcome these challenges. It will illustrate these challenges with two examples: probabilistic approaches to simulate the impact of different combinations of vegetation regrowth and post-fire climate on runoff and erosion; and model developments for post-fire soil water repellency with different levels of complexity. It will also present an inventory of the current state of the art and propose future research directions, both on post-fire models themselves and on their integration with other models in large-scale water resource assessment and management.

  14. Prediction impact curve is a new measure integrating intervention effects in the evaluation of risk models.

    PubMed

    Campbell, William; Ganna, Andrea; Ingelsson, Erik; Janssens, A Cecile J W

    2016-01-01

    We propose a new measure of assessing the performance of risk models, the area under the prediction impact curve (auPIC), which quantifies the performance of risk models in terms of their average health impact in the population. Using simulated data, we explain how the prediction impact curve (PIC) estimates the percentage of events prevented when a risk model is used to assign high-risk individuals to an intervention. We apply the PIC to the Atherosclerosis Risk in Communities (ARIC) Study to illustrate its application toward prevention of coronary heart disease. We estimated that if the ARIC cohort received statins at baseline, 5% of events would be prevented when the risk model was evaluated at a cutoff threshold of 20% predicted risk compared to 1% when individuals were assigned to the intervention without the use of a model. By calculating the auPIC, we estimated that an average of 15% of events would be prevented when considering performance across the entire interval. We conclude that the PIC is a clinically meaningful measure for quantifying the expected health impact of risk models that supplements existing measures of model performance. Copyright © 2016 Elsevier Inc. All rights reserved.
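
    A simulated-data sketch of the measure, assuming a hypothetical cohort, a fixed relative risk reduction for the intervention and a uniform threshold grid (the ARIC analysis itself is not reproduced): the curve gives the fraction of events prevented at each treatment threshold, and its average height approximates the auPIC.

      import numpy as np

      rng = np.random.default_rng(5)

      n = 50_000
      risk = np.clip(rng.beta(2, 12, n), 0.001, 0.999)     # model-predicted risk (hypothetical cohort)
      event = rng.random(n) < risk                         # events that would occur without intervention
      rrr = 0.30                                           # assumed relative risk reduction of the intervention

      def events_prevented(threshold):
          """Fraction of all events prevented if everyone above `threshold` receives the intervention."""
          treated = risk >= threshold
          prevented = event & treated & (rng.random(n) < rrr)
          return prevented.sum() / event.sum()

      thresholds = np.linspace(0.0, 1.0, 101)
      pic = np.array([events_prevented(t) for t in thresholds])
      aupic = pic.mean()                                   # area under the curve on a uniform grid
      print("events prevented at a 20%% cutoff: %.1f%%" % (100 * events_prevented(0.20)))
      print("auPIC: %.3f" % aupic)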

  15. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  16. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; limitations and potential future work. Goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software have been subjected to validation testing, including comparisons with empirical data.

  17. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.

  18. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.
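
    A heavily simplified Monte Carlo in the spirit of the two abstracts above, using an invented three-condition list, invented incidence and outcome parameters, and a single "kit coverage" probability in place of the iMED database and the fully treated/untreated outcome bounds; it only illustrates how evacuation probability and lost crew time emerge from simulated medical events.

      import numpy as np

      rng = np.random.default_rng(2015)

      # (incidence per person-year, days lost if treated, days lost if untreated, P(evacuation) if untreated)
      conditions = {
          "dental crown loss": (0.05, 1.0, 5.0, 0.02),
          "renal stone":       (0.01, 2.0, 10.0, 0.30),
          "skin infection":    (0.20, 0.5, 2.0, 0.00),
      }

      def simulate_mission(n_runs=20_000, crew=4, days=180, kit_coverage=0.9):
          evacuations, time_lost = 0, []
          for _ in range(n_runs):
              lost, evacuated = 0.0, False
              for incidence, d_treated, d_untreated, p_evac in conditions.values():
                  for _ in range(rng.poisson(incidence * crew * days / 365.0)):
                      if rng.random() < kit_coverage:    # required resources available -> treated outcome
                          lost += d_treated
                      else:                              # resources unavailable -> untreated outcome
                          lost += d_untreated
                          evacuated |= rng.random() < p_evac
              evacuations += evacuated
              time_lost.append(lost)
          return evacuations / n_runs, float(np.mean(time_lost))

      p_evac, mean_lost = simulate_mission()
      print(f"P(evacuation) = {p_evac:.4f}, mean crew-days lost = {mean_lost:.2f}")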

  19. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (AD-CIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
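
    To illustrate the Poisson flavour of the debris model in the simplest possible terms (this is not the dissertation's calibrated model), the sketch below draws the number of debris impacts on a building's windward glazing from a Poisson distribution whose mean grows with the square of wind speed, and converts it to a breach probability; every coefficient is an assumption.

      import numpy as np

      rng = np.random.default_rng(9)

      def breach_probability(wind_speed, n_sims=100_000, rate_coeff=2e-4,
                             glazing_area=12.0, breach_prob_per_impact=0.2):
          # Mean number of debris impacts scales with dynamic pressure ~ wind_speed**2 (assumed).
          mean_impacts = rate_coeff * glazing_area * wind_speed ** 2
          impacts = rng.poisson(mean_impacts, n_sims)
          breaches = rng.binomial(impacts, breach_prob_per_impact)
          return (breaches > 0).mean()

      for v in (30, 45, 60):   # gust wind speeds in m/s
          print(f"wind {v} m/s -> P(envelope breach) = {breach_probability(v):.3f}")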

  20. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  1. On using TRMM data and rainfall forecasts from meteorological models in data-scarce transboundary catchments - an example of Bangladesh

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Biswa; Tohidul Islam, Md.

    2014-05-01

    This research focuses on the flood risk of the Haor region in the north-eastern part of Bangladesh. The prediction of hydrological variables at different spatial and temporal scales in the Haor region depends on the influence of several upstream rivers in the Meghalaya catchment in India. Limitations in hydro-meteorological data collection and data-sharing issues between the two countries constrain the feasibility of hydrological studies, particularly for near-real-time predictions. One possible solution is to make use of the variety of satellite-based and meteorological-model products for rainfall. The abundance of such rainfall products provides a good basis for hydrological modelling of a part of the Ganges and Brahmaputra basin. In this research the TRMM data and rainfall forecasts from ECMWF have been compared with the scarce rain gauge data from the upstream Meghalaya catchment. Subsequently, the TRMM data and rainfall forecasts from ECMWF have been used as the meteorological input to a rainfall-runoff model of the Meghalaya catchment. The rainfall-runoff model of Meghalaya has been developed using DEM data from SRTM. The generated runoff at the outlet of Meghalaya has been used as the upstream boundary condition in the existing rainfall-runoff model of the Haor region. The simulation results have been compared with the existing results based on simulations without any information on the rainfall-runoff in the upstream Meghalaya catchment. The comparison showed that the forecasting lead time increased substantially. According to the existing results, the forecasting lead time at a number of locations in the catchment was about 6 to 8 hours. With the new results the forecasting lead time increased, with different levels of accuracy, to about 24 hours. This additional lead time will be highly beneficial in managing flood risk of the Haor region of Bangladesh. The research shows that satellite-based rainfall products and rainfall forecasts from meteorological models can be very useful in flood risk management, particularly for data-scarce regions and/or transboundary regions with data-sharing issues. Keywords: flood risk management, TRMM, ECMWF, flood forecasting, Haor, Bangladesh. Abbreviations: TRMM: Tropical Rainfall Measuring Mission; ECMWF: European Centre for Medium-Range Weather Forecasts; DEM: Digital Elevation Model; SRTM: Shuttle Radar Topography Mission

  2. Simulated effect of pneumococcal vaccination in the Netherlands on existing rules constructed in a non-vaccinated cohort predicting sequelae after bacterial meningitis

    PubMed Central

    2010-01-01

    Background: Two prediction rules identifying children at risk of hearing loss and of academic or behavioral limitations after bacterial meningitis were developed previously. Streptococcus pneumoniae as the causative pathogen was an important risk factor in both. Since 2006, Dutch children have received the seven-valent conjugate vaccination against S. pneumoniae. The presumed effect of vaccination was simulated by excluding all children infected by S. pneumoniae serotypes included in the vaccine from both previously collected cohorts (1990-1995). Methods: Children infected by one of the vaccine serotypes were excluded from both original cohorts (hearing loss: 70 of 628 children; academic or behavioral limitations: 26 of 182 children). All identified risk factors were included in multivariate logistic regression models. The discriminative ability of both new models was calculated. Results: The same risk factors as in the original models were significant. The discriminative ability of the original hearing loss model was 0.84 and that of the new model 0.87. For the academic or behavioral limitations model it was 0.83 and 0.84, respectively. Conclusion: It can be assumed that the prediction rules will also be applicable to a vaccinated population. However, vaccination does not provide 100% coverage, and evidence is available that serotype replacement will occur. The impact of vaccination on serotype replacement needs to be investigated, and the prediction rules must be validated externally. PMID:20815866
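
    The simulation idea, namely dropping the vaccine-serotype cases, refitting the logistic model and comparing discriminative ability, can be sketched on synthetic data as below; the risk factors, effect sizes and serotype coverage are invented, and the original rules were of course fitted to the real cohorts.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2006)

      n = 628
      pneumococcus = rng.random(n) < 0.3
      vaccine_serotype = pneumococcus & (rng.random(n) < 0.6)   # cases covered by the 7-valent vaccine
      x_other = rng.normal(size=(n, 2))                         # stand-ins for the other risk factors
      logit = -2.0 + 1.2 * pneumococcus + x_other @ np.array([0.8, 0.5])
      hearing_loss = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      def fit_and_auc(mask):
          X = np.column_stack([pneumococcus[mask], x_other[mask]])
          y = hearing_loss[mask]
          model = LogisticRegression().fit(X, y)
          return roc_auc_score(y, model.predict_proba(X)[:, 1])

      print("AUC, original cohort          : %.2f" % fit_and_auc(np.ones(n, dtype=bool)))
      print("AUC, vaccine serotypes removed: %.2f" % fit_and_auc(~vaccine_serotype))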

  3. Simulating groundwater-induced sewer flooding

    NASA Astrophysics Data System (ADS)

    Mijic, A.; Mansour, M.; Stanic, M.; Jackson, C. R.

    2016-12-01

    During the last decade, Chalk catchments of southern England experienced severe groundwater flooding. High groundwater levels resulted in groundwater ingress into the sewer network, which led to restricted toilet use and the overflow of diluted, but untreated, sewage onto road surfaces, rivers and water courses. In response to these events the water and sewerage company Thames Water Utilities Ltd (TWUL) had to allocate significant funds to mitigate the impacts. It was estimated that approximately £19m was spent responding to the extreme wet weather of 2013-14, along with the use of a fleet of over 100 tankers. However, the magnitude of the event was so large that these efforts could not stop the discharge of sewage to the environment. This work presents an analysis of the risk of groundwater-induced sewer flooding within the Chalk catchment of the River Lambourn, Berkshire. A spatially distributed groundwater model was used to assess historic groundwater flood risk and the potential impacts of changes in future climate. We then linked this model to an urban groundwater model to enable us to simulate groundwater-sewer interaction in detail. The modelling setup was used to identify relationships between infiltration into sewers and groundwater levels at specific points on TWUL's sewer network, and to estimate historic and future groundwater flood risk and how this varies across the catchment. The study showed the significance of understanding the impact of groundwater on urban water systems and of producing information that can inform a water company's response to groundwater flood risk, its decision-making process and its asset management planning. However, the knowledge gained through integrated modelling of groundwater-sewer interactions has highlighted limitations of existing approaches for the simulation of these coupled systems. We conclude this work with a number of recommendations on how to improve such hydrological/sewer analyses.

  4. Numerical simulation of gender differences in a long-term microgravity exposure

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse and simulate gender differences when individuals are exposed to long-term microgravity. The probability of a health impairment that could jeopardize a long-term mission is also evaluated. Computer simulations are becoming a promising line of research, as physiological models become more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a model previously developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, using inexpensive software development frameworks, and has been tested and validated with the available experimental data. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimations from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying from Earth-based to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of aerobic physical exercise and to thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Initial results are compatible with the existing data and provide unique information regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions. More experimental work is needed to adjust some parameters of the model. This work may be seen as another contribution to a better understanding of the underlying processes involved in the adaptation of both women and men to long-term microgravity.

  5. On the role of numerical simulations in studies of reduced gravity-induced physiological effects in humans. Results from NELME.

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    Computer simulations are becoming a promising line of research, as physiological models become more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a model previously developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, using inexpensive development frameworks, and has been tested and validated with the available experimental data. The objective of this work is to analyse and simulate long-term effects and gender differences when individuals are exposed to long-term microgravity. The probability of a health impairment that could jeopardize a long-term mission is also evaluated. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimations from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying continuously from Earth-based to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of aerobic physical exercise and thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Different scenarios, such as a long-term mission to the Moon or Mars, are evaluated, including countermeasures such as aerobic exercise. Initial results are compatible with the existing data and provide useful insights regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions.

  6. Diagnostic Value of Knee Arthrometry in the Prediction of Anterior Cruciate Ligament Strain During Landing

    PubMed Central

    Kiapour, Ata M.; Wordeman, Samuel C.; Paterno, Mark V.; Quatman, Carmen E.; Levine, Jason W.; Goel, Vijay K.; Demetropoulos, Constantine K.; Hewett, Timothy E.

    2014-01-01

    Background: Previous studies have indicated that higher knee joint laxity may be indicative of an increased risk of anterior cruciate ligament (ACL) injuries. Despite the frequent clinical use of knee arthrometry in the evaluation of knee laxity, little data exist to correlate instrumented laxity measures and ACL strain during dynamic high-risk activities. Purpose/Hypotheses: The purpose of this study was to evaluate the relationships between ACL strain and anterior knee laxity measurements using arthrometry during both a drawer test and simulated bipedal landing (as an identified high-risk injurious task). We hypothesized that a high correlation exists between dynamic ACL strain and passive arthrometry displacement. The secondary hypothesis was that anterior knee laxity quantified by knee arthrometry is a valid predictor of injury risk such that specimens with greater anterior knee laxity would demonstrate increased levels of peak ACL strain during landing. Study Design: Controlled laboratory study. Methods: Twenty cadaveric lower limbs (mean age, 46 ± 6 years; 10 female and 10 male) were tested using a CompuKT knee arthrometer to measure knee joint laxity. Each specimen was tested under 4 continuous cycles of anterior-posterior shear force (±134 N) applied to the tibial tubercle. To quantify ACL strain, a differential variable reluctance transducer (DVRT) was arthroscopically placed on the ACL (anteromedial bundle), and specimens were retested. Subsequently, bipedal landing from 30 cm was simulated in a subset of 14 specimens (mean age, 45 ± 6 years; 6 female and 8 male) using a novel custom-designed drop stand. Changes in joint laxity and ACL strain under applied anterior shear force were statistically analyzed using paired-sample t tests and analysis of variance. Multiple linear regression analyses were conducted to determine the relationship between anterior shear force, anterior tibial translation, and ACL strain. Results: During simulated drawer tests, 134 N of applied anterior shear load produced a mean peak anterior tibial translation of 3.1 ± 1.1 mm and a mean peak ACL strain of 4.9% ± 4.3%. Anterior shear load was a significant determinant of anterior tibial translation (P <.0005) and peak ACL strain (P = .04). A significant correlation (r = 0.52, P <.0005) was observed between anterior tibial translation and ACL strain. Cadaveric simulations of landing produced a mean axial impact load of 4070 ± 732 N. Simulated landing significantly increased the mean peak anterior tibial translation to 10.4 ± 3.5 mm and the mean peak ACL strain to 6.8% ± 2.8% (P <.0005) compared with the prelanding condition. Significant correlations were observed between peak ACL strain during simulated landing and anterior tibial translation quantified by knee arthrometry. Conclusion: Our first hypothesis is supported by a significant correlation between arthrometry displacement collected during laxity tests and concurrent ACL strain calculated from DVRT measurements. Experimental findings also support our second hypothesis that instrumented measures of anterior knee laxity predict peak ACL strain during landing, as specimens with greater knee laxity demonstrated higher levels of peak ACL strain during landing. Clinical Relevance: The current findings highlight the importance of instrumented anterior knee laxity assessments as a potential indicator of the risk of ACL injuries, in addition to their clinical utility in the evaluation of ACL integrity. PMID:24275863

  7. Bringing good teaching cases "to life": a simulator-based medical education service.

    PubMed

    Gordon, James A; Oriol, Nancy E; Cooper, Jeffrey B

    2004-01-01

    Realistic medical simulation has expanded worldwide over the last decade. Such technology is playing an increasing role in medical education not merely because simulator sessions are enjoyable, but because they can provide an enhanced environment for experiential learning and reflective thought. High-fidelity patient simulators allow students of all levels to "practice" medicine without risk, providing a natural framework for the integration of basic and clinical science in a safe environment. Often described as "flight simulation for doctors," the rationale, utility, and range of medical simulations have been described elsewhere, yet the challenges of integrating this technology into the medical school curriculum have received little attention. The authors report how Harvard Medical School established an on-campus simulator program for students in 2001, building on the work of the Center for Medical Simulation in Boston. As an overarching structure for the process, faculty and residents developed a simulator-based "medical education service"-like any other medical teaching service, but designed exclusively to help students learn on the simulator alongside a clinician-mentor, on demand. Initial evaluations among both preclinical and clinical students suggest that simulation is highly accepted and increasingly demanded. For some learners, simulation may allow complex information to be understood and retained more efficiently than can occur with traditional methods. Moreover, the process outlined here suggests that simulation can be integrated into existing curricula of almost any medical school or teaching hospital in an efficient and cost-effective manner.

  8. Is Miscanthus a High Risk Biofuel Feedstock Prospect for the Upper Midwest US?

    NASA Astrophysics Data System (ADS)

    Kucharik, C. J.; VanLoocke, A. D.

    2011-12-01

    Miscanthus is a highly productive C4 perennial rhizomatous grass that is native to Southeast Asia, but its potential as a feedstock for cellulosic biofuel in the Midwest US is intriguing given extremely high productivity for low amounts of agrochemical inputs. However, Miscanthus x giganteus, a key variety currently studied, is not planted from seed, but rather from rhizomes planted at a soil depth of 5 to 10 cm. Therefore, it is costly to establish in terms of both time and money, making it a potentially risky investment in geographic regions that experience cold wintertime temperatures that can effectively kill the crop. The 50% kill threshold for M. giganteus rhizomes occurs when soil temperatures fall below -3.5C, which may contribute to a high risk of improper establishment during the first few seasons. Our first objective here was to study a historical, simulated reconstruction of daily wintertime soil temperatures at high spatial resolution (5 min) across the Midwest US from 1948-2007, and use this information to quantify the frequency with which lethal soil temperature thresholds for Miscanthus were reached. A second objective was to investigate how the use of crop residues could impact wintertime soil temperatures. In this study, a dynamic agroecosystem model (Agro-IBIS) that has been modified to simulate Miscanthus growth and phenology was used in conjunction with high-resolution datasets of soil texture and daily gridded weather data. Model simulations suggest that across the states of North and South Dakota, Nebraska, Minnesota, Wisconsin, Michigan, and the northern half of Iowa, the kill threshold of -3.5C at a 10cm soil depth was reached in 70-95% of the simulation years. A boundary representing a 50% likelihood of reaching -3.5C at 10cm depth in any given year runs approximately from east central Colorado, through northern Kansas and Missouri, through central Illinois, central Indiana, and central Ohio. An analysis of monthly mean 10cm soil temperatures illustrates that temperatures colder than the kill threshold generally exist in January and February north and west of a line running from central Nebraska to north central Illinois, through southeastern Wisconsin and northern lower Michigan. These results suggest that a bioclimatic limit to successful establishment might be positioned somewhere through the central portion of the Corn Belt, but this depends on how risk is defined in the future. Model simulations suggest that a significant warming trend of wintertime soil temperatures existed across the region; soil temperatures have increased 3 to 4C in the past 60 years at 10cm as well as at depths as great as 50 to 100cm across northern and western portions of the Midwest. This warming trend, in combination with the strategic use of straw and other crop residues, may reduce the risk of failure in establishing Miscanthus x giganteus. However, any adaptive management will not completely eliminate the high risk of cold soil temperatures in regions that are currently being targeted to support cellulosic biofuel production in the next several decades.

  9. Non prescribed sale of antibiotics in Riyadh, Saudi Arabia: A Cross Sectional Study

    PubMed Central

    2011-01-01

    Background: Antibiotics sales without medical prescriptions are increasingly recognized as sources of antimicrobial misuse that can exacerbate the global burden of antibiotic resistance. We aimed to determine the percentage of pharmacies that sell antibiotics without medical prescriptions, examining the potential associated risks of such practice in Riyadh, Saudi Arabia, by simulation of different clinical scenarios. Methods: A cross sectional study of a quasi-random sample of pharmacies stratified by the five regions of Riyadh. Each pharmacy was visited once by two investigators who simulated having a relative with a specific clinical illness (sore throat, acute bronchitis, otitis media, acute sinusitis, diarrhea, and urinary tract infection (UTI) in childbearing-aged women). Results: A total of 327 pharmacies were visited. Antibiotics were dispensed without a medical prescription in 244 (77.6%) of 327, of which 231 (95%) were dispensed without a patient request. Simulated cases of sore throat and diarrhea resulted in an antibiotic being dispensed in 90% of encounters, followed by UTI (75%), acute bronchitis (73%), otitis media (51%) and acute sinusitis (40%). Metronidazole (89%) and ciprofloxacin (86%) were commonly given for diarrhea and UTI, respectively, whereas amoxicillin/clavulanate was dispensed (51%) for the other simulated cases. None of the pharmacists asked about antibiotic allergy history or provided information about drug interactions. Only 23% asked about pregnancy status when dispensing antibiotics for UTI-simulated cases. Conclusions: We observed that an antibiotic could be obtained in Riyadh without a medical prescription or an evidence-based indication, with associated potential clinical risks. Strict enforcement of and adherence to existing regulations are warranted. PMID:21736711

  10. Non prescribed sale of antibiotics in Riyadh, Saudi Arabia: a cross sectional study.

    PubMed

    Bin Abdulhak, Aref A; Altannir, Mohamad A; Almansor, Mohammed A; Almohaya, Mohammed S; Onazi, Atallah S; Marei, Mohammed A; Aldossary, Oweida F; Obeidat, Sadek A; Obeidat, Mustafa A; Riaz, Muhammad S; Tleyjeh, Imad M

    2011-07-07

    Antibiotics sales without medical prescriptions are increasingly recognized as sources of antimicrobial misuse that can exacerbate the global burden of antibiotic resistance. We aimed to determine the percentage of pharmacies that sell antibiotics without medical prescriptions, examining the potential associated risks of such practice in Riyadh, Saudi Arabia, by simulation of different clinical scenarios. A cross sectional study of a quasi-random sample of pharmacies stratified by the five regions of Riyadh. Each pharmacy was visited once by two investigators who simulated having a relative with a specific clinical illness (sore throat, acute bronchitis, otitis media, acute sinusitis, diarrhea, and urinary tract infection (UTI) in childbearing-aged women). A total of 327 pharmacies were visited. Antibiotics were dispensed without a medical prescription in 244 (77.6%) of 327, of which 231 (95%) were dispensed without a patient request. Simulated cases of sore throat and diarrhea resulted in an antibiotic being dispensed in 90% of encounters, followed by UTI (75%), acute bronchitis (73%), otitis media (51%) and acute sinusitis (40%). Metronidazole (89%) and ciprofloxacin (86%) were commonly given for diarrhea and UTI, respectively, whereas amoxicillin/clavulanate was dispensed (51%) for the other simulated cases. None of the pharmacists asked about antibiotic allergy history or provided information about drug interactions. Only 23% asked about pregnancy status when dispensing antibiotics for UTI-simulated cases. We observed that an antibiotic could be obtained in Riyadh without a medical prescription or an evidence-based indication, with associated potential clinical risks. Strict enforcement of and adherence to existing regulations are warranted.

  11. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several types of RISMC analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
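
    As a rough illustration of the surrogate-modeling idea described above (not the actual RAVEN/RELAP-7 workflow), the sketch below trains a Gaussian-process surrogate on a small number of expensive simulation runs and then evaluates many cheap surrogate samples to estimate an exceedance probability. The limit-state function, parameter names, and ranges are hypothetical placeholders.

```python
# Minimal sketch of a reduced-order / surrogate workflow, assuming a
# hypothetical expensive_simulation(x) stands in for a system code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):
    # Placeholder physics: peak clad temperature (K) from two inputs.
    power_uprate, recovery_time = x
    return 800.0 + 400.0 * power_uprate + 5.0 * recovery_time

rng = np.random.default_rng(0)
X_train = rng.uniform([0.0, 0.0], [1.2, 60.0], size=(30, 2))   # few costly runs
y_train = np.array([expensive_simulation(x) for x in X_train])

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=[0.5, 20.0]))
surrogate.fit(X_train, y_train)

# Thousands of cheap evaluations to estimate a failure probability.
X_mc = rng.uniform([0.0, 0.0], [1.2, 60.0], size=(100_000, 2))
y_mc = surrogate.predict(X_mc)
print("P(peak temperature > 1200 K) approx.", np.mean(y_mc > 1200.0))
```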

  12. Induced seismicity closed-form traffic light system for actuarial decision-making during deep fluid injections.

    PubMed

    Mignan, A; Broccardo, M; Wiemer, S; Giardini, D

    2017-10-19

    The rise in the frequency of anthropogenic earthquakes due to deep fluid injections is posing serious economic, societal, and legal challenges to many geo-energy and waste-disposal projects. Existing tools to assess such problems are still inherently heuristic and mostly based on expert elicitation (so-called clinical judgment). We propose, as a complementary approach, an adaptive traffic light system (ATLS) that is a function of a statistical model of induced seismicity. It offers an actuarial judgment of the risk, based on a mapping between earthquake magnitude and risk. Using data from six underground reservoir stimulation experiments, mostly from Enhanced Geothermal Systems, we illustrate how such a data-driven adaptive forecasting system could guarantee a risk-based safety target. The proposed model, which includes a linear relationship between seismicity rate and flow rate, as well as a normal diffusion process for post-injection, is first confirmed to be representative of the data. Being integrable, the model yields a closed-form ATLS solution that is both transparent and robust. Although simulations verify that the safety target is consistently ensured when the ATLS is applied, the model from which simulations are generated is validated on a limited dataset, hence still requiring further tests in additional fluid injection environments.
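
    As an illustration of the actuarial, data-driven idea (not the authors' closed-form solution), the sketch below couples an event rate that is linear in flow rate with a Gutenberg-Richter magnitude distribution and checks, by Monte Carlo, whether a planned injection keeps the probability of exceeding a safety magnitude below a target. The productivity constant, b-value, thresholds, and flow schedule are all hypothetical.

```python
# Hedged sketch: traffic-light-style check that an injection plan meets a
# risk-based safety target. Rates, b-value and thresholds are made up.
import numpy as np

rng = np.random.default_rng(1)
flow_rate = np.full(30, 30.0)          # L/s, constant injection for 30 days
daily_volume_m3 = flow_rate * 86400 / 1000
events_per_m3 = 5e-4                   # hypothetical induced-seismicity productivity
b_value, m_min = 1.0, 0.0              # Gutenberg-Richter parameters
m_safe, target_prob = 3.0, 0.05        # safety magnitude, tolerated exceedance prob.

def exceeds_safety(rng):
    # Daily event counts are Poisson with mean proportional to injected volume.
    counts = rng.poisson(events_per_m3 * daily_volume_m3)
    n = counts.sum()
    if n == 0:
        return False
    # Magnitudes above m_min follow an exponential (Gutenberg-Richter) law.
    mags = m_min + rng.exponential(1.0 / (b_value * np.log(10)), size=n)
    return mags.max() >= m_safe

p_exceed = np.mean([exceeds_safety(rng) for _ in range(20_000)])
verdict = "green" if p_exceed <= target_prob else "red: reduce flow rate"
print(f"P(M >= {m_safe}) approx. {p_exceed:.3f} -> {verdict}")
```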

  13. Wildfire exposure analysis on the national forests in the Pacific Northwest, USA.

    PubMed

    Ager, Alan A; Buonopane, Michelle; Reger, Allison; Finney, Mark A

    2013-06-01

    We analyzed wildfire exposure for key social and ecological features on the national forests in Oregon and Washington. The forests contain numerous urban interfaces, old growth forests, recreational sites, and habitat for rare and endangered species. Many of these resources are threatened by wildfire, especially in the east Cascade Mountains fire-prone forests. The study illustrates the application of wildfire simulation for risk assessment where the major threat is from large and rare naturally ignited fires, versus many previous studies that have focused on risk driven by frequent and small fires from anthropogenic ignitions. Wildfire simulation modeling was used to characterize potential wildfire behavior in terms of annual burn probability and flame length. Spatial data on selected social and ecological features were obtained from Forest Service GIS databases and elsewhere. The potential wildfire behavior was then summarized for each spatial location of each resource. The analysis suggested strong spatial variation in both burn probability and conditional flame length for many of the features examined, including biodiversity, urban interfaces, and infrastructure. We propose that the spatial patterns in modeled wildfire behavior could be used to improve existing prioritization of fuel management and wildfire preparedness activities within the Pacific Northwest region. © 2012 Society for Risk Analysis.

  14. Bias correction of risk estimates in vaccine safety studies with rare adverse events using a self-controlled case series design.

    PubMed

    Zeng, Chan; Newcomer, Sophia R; Glanz, Jason M; Shoup, Jo Ann; Daley, Matthew F; Hambidge, Simon J; Xu, Stanley

    2013-12-15

    The self-controlled case series (SCCS) method is often used to examine the temporal association between vaccination and adverse events using only data from patients who experienced such events. Conditional Poisson regression models are used to estimate incidence rate ratios, and these models perform well with large or medium-sized case samples. However, in some vaccine safety studies, the adverse events studied are rare and the maximum likelihood estimates may be biased. Several bias correction methods have been examined in case-control studies using conditional logistic regression, but none of these methods have been evaluated in studies using the SCCS design. In this study, we used simulations to evaluate 2 bias correction approaches-the Firth penalized maximum likelihood method and Cordeiro and McCullagh's bias reduction after maximum likelihood estimation-with small sample sizes in studies using the SCCS design. The simulations showed that the bias under the SCCS design with a small number of cases can be large and is also sensitive to a short risk period. The Firth correction method provides finite and less biased estimates than the maximum likelihood method and Cordeiro and McCullagh's method. However, limitations still exist when the risk period in the SCCS design is short relative to the entire observation period.

  15. [SOPHOCLE (Ophthalmologic Simulator of Laser PHOtocoagulation): contribution to virtual reality].

    PubMed

    Rouland, J F; Dubois, P; Chaillou, C; Meseuree, P; Karpf, S; Godin, S; Duquenoy, F

    1995-01-01

    This study was undertaken to teach laser retinal photocoagulation in different disorders using a "virtual eye". Most ophthalmologists routinely use a laser photocoagulator, and both the indications and the laser effects are well known. However, in various diseases (diabetic retinopathy, age-related macular degeneration, myopia...) the complication rate increases, or at least does not decrease. The main reasons are ignorance of risk factors and misuse of the instrument. We developed a new automated device simulating a real laser photocoagulator. Only the slit lamp is real; the three-mirror lens, the fundus, and the retinal photocoagulation impacts are "virtual". The aim of the simulator is to help practitioners recognize various pathologies almost as in real conditions and become familiar with different techniques of photocoagulation. Through computer-assisted learning, a constant evaluation determines the level and progress of practitioners.

  16. Monetizing Leakage Risk of Geologic CO2 Storage using Wellbore Permeability Frequency Distributions

    NASA Astrophysics Data System (ADS)

    Bielicki, Jeffrey; Fitts, Jeffrey; Peters, Catherine; Wilson, Elizabeth

    2013-04-01

    Carbon dioxide (CO2) may be captured from large point sources (e.g., coal-fired power plants, oil refineries, cement manufacturers) and injected into deep sedimentary basins for storage, or sequestration, from the atmosphere. This technology—CO2 Capture and Storage (CCS)—may be a significant component of the portfolio of technologies deployed to mitigate climate change. But injected CO2, or the brine it displaces, may leak from the storage reservoir through a variety of natural and manmade pathways, including existing wells and wellbores. Such leakage will incur costs to a variety of stakeholders, which may affect the desirability of potential CO2 injection locations as well as the feasibility of the CCS approach writ large. Consequently, analyzing and monetizing leakage risk is necessary to develop CCS as a viable technological option to mitigate climate change. Risk is the product of the probability of an outcome and the impact of that outcome. Assessment of leakage risk from geologic CO2 storage reservoirs requires an analysis of the probabilities and magnitudes of leakage, identification of the outcomes that may result from leakage, and an assessment of the expected economic costs of those outcomes. One critical uncertainty regarding the rate and magnitude of leakage is the leakiness of the well leakage pathway. This leakiness is characterized by a leakage permeability for the pathway, and recent work has sought to determine frequency distributions for the leakage permeabilities of wells and wellbores. We conduct a probabilistic analysis of leakage and monetized leakage risk for CO2 injection locations in the Michigan Sedimentary Basin (USA) using empirically derived frequency distributions for wellbore leakage permeabilities. To conduct this probabilistic risk analysis, we apply the RISCS (Risk Interference of Subsurface CO2 Storage) model (Bielicki et al, 2013a, 2012b) to injection into the Mt. Simon Sandstone. RISCS monetizes leakage risk by combining 3D geospatial data with fluid-flow simulations from the ELSA (Estimating Leakage Semi-Analytically) model (e.g., Celia and Nordbotten, 2006) and the Leakage Impact Valuation (LIV) method (Pollak et al, 2013; Bielicki et al, 2013). We extend RISCS to iterate ELSA semi-analytic modeling simulations by drawing values from the frequency distribution of leakage permeabilities. The iterations assign these values to existing wells in the basin, and the probabilistic risk analysis thus incorporates the uncertainty of the extent of leakage. We show that monetized leakage risk can vary significantly over tens of kilometers, and we identify "hot spots" favorable to CO2 injection based on the monetized leakage risk for each potential location in the basin.
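
    A toy version of the probabilistic step described above (not the RISCS/ELSA implementation) is sketched below: wellbore permeabilities are drawn from an assumed lognormal frequency distribution, converted to a leaked fraction with a crude linear proxy in place of the semi-analytic flow model, and multiplied by a hypothetical cost per tonne to monetize risk. All numbers are illustrative.

```python
# Hedged sketch of monetizing leakage risk by sampling a wellbore-permeability
# frequency distribution. The flow proxy and cost figures are placeholders.
import numpy as np

rng = np.random.default_rng(2)
n_wells, n_draws = 200, 50_000
injected_mass_t = 1.0e6                 # tonnes CO2 injected per year (assumed)
cost_per_tonne_leaked = 50.0            # USD/t, hypothetical impact valuation

# Assumed lognormal distribution of effective wellbore permeability (m^2).
log10_perm = rng.normal(loc=-15.0, scale=2.0, size=(n_draws, n_wells))
perm = 10.0 ** log10_perm

# Crude proxy: leaked fraction per well grows linearly with permeability, capped.
leak_fraction = np.clip(perm / 1.0e-12 * 1.0e-3, 0.0, 0.01)
annual_leakage_t = injected_mass_t * leak_fraction.sum(axis=1)

monetized_risk = cost_per_tonne_leaked * annual_leakage_t
print("expected annual leakage cost (USD):", monetized_risk.mean())
print("95th percentile cost (USD):", np.percentile(monetized_risk, 95))
```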

  17. Driving Performance on the Descending Limb of Blood Alcohol Concentration (BAC) in Undergraduate Students: A Pilot Study

    PubMed Central

    Silvey, Dustin; Behm, David; Albert, Wayne J.

    2015-01-01

    Young drivers are overrepresented in collisions resulting in fatalities. It is not uncommon for young drivers to socially binge drink and decide to drive a vehicle a few hours after consumption. To better understand the risks that may be associated with this behaviour, the present study examined the effects of a social drinking bout followed by a simulated drive in undergraduate students on the descending limb of their BAC (blood alcohol concentration) curve. Two groups of eight undergraduate students (n = 16) took part in this study. Participants in the alcohol group were assessed before drinking, then at moderate and low BAC, as well as 24 hours post-acute consumption. This group consumed an average of 5.3 ± 1.4 (mean ± SD) drinks in an hour in a social context and were then submitted to a driving assessment and a predicted crash risk assessment. The control group was assessed at the same time points without alcohol intake or social context, i.e. at 8 a.m., noon, 3 p.m. and 8 a.m. the next morning. These multiple time points were used to measure any potential learning effects from the assessment tools (i.e. driving simulator and useful field of view test (UFOV)). Diminished driving performance at moderate BAC was observed with no increases in predicted crash risk. Moderate correlations between driving variables were observed. No association was found between driving variables and UFOV variables. The control group improved measures of selective attention after the third assessment. No learning effect was observed from multiple sessions with the driving simulator. Our results show that a moderate BAC, although legal, increases risky behaviour. Effects of alcohol expectancy could have been displayed by the experimental group. UFOV measures and predicted crash risk categories were not sensitive enough to predict crash risk for young drivers, even when intoxicated. PMID:25723618

  18. Driving performance on the descending limb of blood alcohol concentration (BAC) in undergraduate students: a pilot study.

    PubMed

    Tremblay, Mathieu; Gallant, François; Lavallière, Martin; Chiasson, Martine; Silvey, Dustin; Behm, David; Albert, Wayne J; Johnson, Michel J

    2015-01-01

    Young drivers are overrepresented in collisions resulting in fatalities. It is not uncommon for young drivers to socially binge drink and decide to drive a vehicle a few hours after consumption. To better understand the risks that may be associated with this behaviour, the present study examined the effects of a social drinking bout followed by a simulated drive in undergraduate students on the descending limb of their BAC (blood alcohol concentration) curve. Two groups of eight undergraduate students (n = 16) took part in this study. Participants in the alcohol group were assessed before drinking, then at moderate and low BAC, as well as 24 hours post-acute consumption. This group consumed an average of 5.3 ± 1.4 (mean ± SD) drinks in an hour in a social context and were then submitted to a driving assessment and a predicted crash risk assessment. The control group was assessed at the same time points without alcohol intake or social context, i.e. at 8 a.m., noon, 3 p.m. and 8 a.m. the next morning. These multiple time points were used to measure any potential learning effects from the assessment tools (i.e. driving simulator and useful field of view test (UFOV)). Diminished driving performance at moderate BAC was observed with no increases in predicted crash risk. Moderate correlations between driving variables were observed. No association was found between driving variables and UFOV variables. The control group improved measures of selective attention after the third assessment. No learning effect was observed from multiple sessions with the driving simulator. Our results show that a moderate BAC, although legal, increases risky behaviour. Effects of alcohol expectancy could have been displayed by the experimental group. UFOV measures and predicted crash risk categories were not sensitive enough to predict crash risk for young drivers, even when intoxicated.

  19. Ecological risk evaluation of combined pollution of herbicide siduron and heavy metals in soils.

    PubMed

    Jiang, Rong; Wang, Meie; Chen, Weiping; Li, Xuzhi

    2018-06-01

    Combined pollution by agrichemicals and heavy metals in urban lawn soils is commonly observed throughout the world, and the two co-existing chemicals can interact with each other in both environmental behavior and toxic effect. However, little has been reported on the ecological risk of their combined pollution, especially in the field, due to the lack of a systematic methodology. In this study, four soils (C, N1, N2, N3) from two public parks in Beijing, China, with similar properties but contrasting heavy metal contamination levels, were chosen to assess the ecological risks of the co-existing herbicide siduron and heavy metals. Environmental behaviors of siduron in the studied soils were investigated with laboratory batch experiments, based on which the environmental exposure level of siduron was simulated with HYDRUS-1D. Results suggested that soil organic matter (SOM), rather than the co-existing heavy metals, was the dominant factor affecting the fate and the accumulation of siduron in soils. Soil N2, with the highest SOM, showed the strongest tendency to retain siduron among the studied soils. A significant joint effect of siduron and heavy metals on cucumber root elongation was observed in laboratory experiments. Thus, the joint toxicity of siduron and heavy metals was calculated from their single-substance toxicity data using the independent action (IA) and concentration addition (CA) models. Then, the predicted no effect concentration (PNECsoil) of siduron was calculated with the equilibrium partitioning method and extrapolation techniques. The PNECsoil of siduron was lowest in the most heavily metal-contaminated soil, N3. The risk characterization ratios (RCR) of siduron in the four soils were all >1. The highest RCR of siduron in soil N3 suggested that the joint toxicity of siduron and heavy metals to organisms determines the ecological risk of siduron in co-contaminated soils. Copyright © 2018 Elsevier B.V. All rights reserved.
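
    For readers unfamiliar with the two mixture models mentioned above, the sketch below computes a predicted joint effect from single-substance dose-response data using concentration addition (CA) and independent action (IA), and then forms a risk characterization ratio (RCR) from hypothetical exposure and PNEC values; the EC50s, slopes, concentrations, and PNEC are illustrative, not the paper's data.

```python
# Hedged sketch of concentration addition (CA), independent action (IA) and an
# RCR check. All toxicity and exposure numbers are made-up examples.
import numpy as np

def hill_effect(c, ec50, slope):
    """Fractional effect (0-1) of a single substance at concentration c."""
    return c**slope / (c**slope + ec50**slope)

# Hypothetical single-substance toxicity data for siduron and a heavy metal.
ec50 = np.array([10.0, 5.0])     # mg/kg soil
slope = np.array([1.5, 2.0])
conc = np.array([2.0, 3.0])      # measured concentrations in the soil

# Independent action: effects combine like probabilities of independent events.
effect_ia = 1.0 - np.prod(1.0 - hill_effect(conc, ec50, slope))

# Concentration addition: toxic units sum; effect read off a shared dose scale
# (approximated here with the first substance's dose-response curve).
toxic_units = np.sum(conc / ec50)
effect_ca = hill_effect(toxic_units * ec50[0], ec50[0], slope[0])

# Risk characterization ratio for siduron with a hypothetical PNEC_soil.
pnec_soil = 0.8                   # mg/kg
rcr = conc[0] / pnec_soil
print(f"IA effect {effect_ia:.2f}, CA effect {effect_ca:.2f}, RCR {rcr:.2f}",
      "-> potential risk" if rcr > 1 else "-> acceptable")
```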

  20. Development of a simulation of the surficial groundwater system for the CONUS

    NASA Astrophysics Data System (ADS)

    Zell, W.; Sanford, W. E.

    2016-12-01

    Water resource and environmental managers across the country face a variety of questions involving groundwater availability and/or groundwater transport pathways. Emerging management questions require prediction of groundwater response to changing climate regimes (e.g., how drought-induced water-table recession may degrade near-stream vegetation and result in increased wildfire risks), while existing questions can require identification of current groundwater contributions to surface water (e.g., groundwater linkages between landscape contaminant inputs and receiving streams may help explain in-stream phenomena such as fish intersex). At present, few national-coverage simulation tools exist to help characterize groundwater contributions to receiving streams and predict potential changes in base-flow regimes under changing climate conditions. We will describe the Phase 1 development of a simulation of the water table and shallow groundwater system for the entire CONUS. We use national-scale datasets such as the National Recharge Map and the Map Database for Surficial Materials in the CONUS to develop groundwater flow (MODFLOW) and transport (MODPATH) models that are calibrated against groundwater level and stream elevation data from NWIS and NHD, respectively. Phase 1 includes the development of a national transmissivity map for the surficial groundwater system and examines the impact of model-grid resolution on the simulated steady-state discharge network (and associated recharge areas) and base-flow travel time distributions for different HUC scales. In the course of developing the transmissivity map we show that transmissivity in fractured bedrock systems is dependent on depth to water. Subsequent phases of this work will simulate water table changes at a monthly time step (using MODIS-dependent recharge estimates) and serve as a critical complement to surface-water-focused USGS efforts to provide national coverage hydrologic modeling tools.

  1. Development of a neural net paradigm that predicts simulator sickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, G.O.

    1993-03-01

    A disease exists that affects pilots and aircrew members who use Navy Operational Flight Training Systems. This malady, commonly referred to as simulator sickness, whose symptomatology closely aligns with that of motion sickness, can compromise the use of these systems because of a reduced utilization factor, negative transfer of training, and reduction in combat readiness. A report is submitted that develops an artificial neural network (ANN) and behavioral model that predicts the onset and level of simulator sickness in the pilots and aircrews who use these systems. It is proposed that the paradigm could be implemented in real time as a biofeedback monitor to reduce the risk to users of these systems. The model captures the neurophysiological impact of use (human-machine interaction) by developing a structure that maps the associative and nonassociative behavioral patterns (learned expectations) and vestibular (otolith and semicircular canals of the inner ear) and tactile interaction, derived from system acceleration profiles, onto an abstract space that predicts simulator sickness for a given training flight.
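
    The general pattern described above, mapping motion-profile and exposure features onto a predicted sickness score, can be sketched with a small feed-forward network. The features, synthetic labels, and network size below are hypothetical and are not the report's actual paradigm.

```python
# Hedged sketch of a neural-net sickness predictor on synthetic features
# (acceleration statistics, exposure time, prior-exposure proxy).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 500
rms_accel = rng.uniform(0.0, 2.0, n)          # m/s^2, vestibular driver
exposure_min = rng.uniform(10, 90, n)         # session length
prior_hours = rng.uniform(0, 200, n)          # learned-expectations proxy

# Synthetic "sickness score": worse with motion and time, better with experience.
score = (2.0 * rms_accel + 0.03 * exposure_min - 0.005 * prior_hours
         + rng.normal(0, 0.3, n))

X = np.column_stack([rms_accel, exposure_min, prior_hours])
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0)
model.fit(X, score)

print("predicted score for a long, rough, first flight:",
      model.predict([[1.8, 80.0, 0.0]])[0])
```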

  2. BWR station blackout: A RISMC analysis using RAVEN and RELAP5-3D

    DOE PAGES

    Mandelli, D.; Smith, C.; Riley, T.; ...

    2016-01-01

    The existing fleet of nuclear power plants is in the process of extending its lifetime and increasing the power generated from these plants via power uprates and improved operations. In order to evaluate the impact of these factors on the safety of the plant, the Risk-Informed Safety Margin Characterization (RISMC) project aims to provide insights to decision makers through a series of simulations of the plant dynamics for different initial conditions and accident scenarios. This paper presents a case study in order to show the capabilities of the RISMC methodology to assess the impact of a power uprate on a Boiling Water Reactor system during a Station Black-Out accident scenario. We employ a system simulator code, RELAP5-3D, coupled with RAVEN, which performs the stochastic analysis. Furthermore, our analysis is performed by: 1) sampling values for a set of parameters from the uncertainty space of interest, 2) simulating the system behavior for that specific set of parameter values, and 3) analyzing the outcomes from the set of simulation runs.
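
    The three-step workflow listed at the end of the abstract (sample, simulate, analyze) can be illustrated with a generic Monte Carlo driver. The parameter names and the toy plant model below are placeholders, not RAVEN or RELAP5-3D calls.

```python
# Hedged sketch of the sample -> simulate -> analyze loop used in a RISMC-style
# study. The "plant model" is a stand-in for a system simulator code.
import numpy as np

rng = np.random.default_rng(4)

def plant_model(power_uprate_frac, diesel_recovery_h):
    """Toy stand-in: returns peak clad temperature (K) for one SBO scenario."""
    return 900.0 + 300.0 * power_uprate_frac + 12.0 * diesel_recovery_h

n_runs = 10_000
# 1) Sample from the uncertainty space of interest.
uprate = rng.uniform(0.0, 0.2, n_runs)                               # 0-20 % uprate
recovery = rng.lognormal(mean=np.log(6.0), sigma=0.5, size=n_runs)   # hours

# 2) Simulate the system behavior for each sampled parameter set.
peak_temp = np.array([plant_model(u, r) for u, r in zip(uprate, recovery)])

# 3) Analyze the outcomes, e.g. probability of exceeding a damage limit.
limit_k = 1200.0
print("probability of exceeding the damage limit:", np.mean(peak_temp > limit_k))
```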

  3. Model of load balancing using reliable algorithm with multi-agent system

    NASA Astrophysics Data System (ADS)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Rapid technology development goes hand in hand with the growth of internet users, which increases network traffic activity and, in turn, the load on the system. The use of a reliable algorithm and mobile agents in distributed load balancing is a viable solution for handling the load issue in a large-scale system. A mobile agent collects resource information and can migrate according to a given task. We propose a reliable load balancing algorithm using least time first byte (LFB) combined with information from the mobile agent. The methodology consisted of defining the system identification, specification requirements, network topology, and system infrastructure design. The simulation sent 1800 requests over 10 s from users to the servers and collected the data for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server and comparing the results with an existing method. Results of the simulation show that the LFB method with a mobile agent can balance load efficiently across all backend servers without bottlenecks, with a low risk of server overload and high reliability.
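
    A rough sketch of the selection rule implied by "least time first byte" follows, under the assumptions that agents report a recent time-to-first-byte per backend and that the balancer simply picks the smallest. It is illustrative only, not the authors' implementation.

```python
# Hedged sketch: route the next request to the backend with the smallest
# recently observed time-to-first-byte (TTFB), as reported by mobile agents.
import random

class Backend:
    def __init__(self, name):
        self.name = name
        self.last_ttfb_ms = float("inf")   # updated by its monitoring agent

def agent_report(backend, measured_ttfb_ms):
    """Mobile-agent callback: store the latest TTFB measurement."""
    backend.last_ttfb_ms = measured_ttfb_ms

def choose_backend(backends):
    """Least-time-first-byte rule: pick the currently fastest server."""
    return min(backends, key=lambda b: b.last_ttfb_ms)

servers = [Backend("s1"), Backend("s2"), Backend("s3")]
for s in servers:                           # simulated agent reports
    agent_report(s, random.uniform(5, 50))

target = choose_backend(servers)
print("route next request to", target.name,
      "with TTFB", round(target.last_ttfb_ms, 1), "ms")
```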

  4. Use of High-resolution WRF Simulations to Forecast Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, William E.; LaCasse, K.; Goodman, S. J.

    2006-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of recent forecast models such as WRF, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Six-h simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data yield the most realistic simulations. An array of subjective and objective statistical metrics are employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.
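
    To make the proxy concrete, the sketch below computes a simple lightning-threat field as the graupel mass flux (air density times vertical velocity times graupel mixing ratio) on the model level nearest -15C. The array shapes, random fields, and calibration factor are hypothetical, not the study's actual WRF output or values.

```python
# Hedged sketch: lightning-threat proxy from simulated graupel flux at -15 C.
import numpy as np

rng = np.random.default_rng(5)
ny, nx, nz = 50, 50, 40

temperature = np.linspace(300, 200, nz)[None, None, :] * np.ones((ny, nx, nz))
w = rng.normal(0.0, 3.0, (ny, nx, nz))                    # vertical velocity, m/s
q_graupel = np.abs(rng.normal(0.0, 1e-3, (ny, nx, nz)))   # mixing ratio, kg/kg
rho = 0.7                                                  # rough air density near -15 C

# Index of the model level closest to -15 C (258.15 K) in each column.
k15 = np.abs(temperature - 258.15).argmin(axis=2)
ii, jj = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")

graupel_flux = rho * w[ii, jj, k15] * q_graupel[ii, jj, k15]   # kg m^-2 s^-1
flash_rate = 50.0 * np.clip(graupel_flux, 0.0, None)           # hypothetical calibration
print("max proxy flash rate:", flash_rate.max())
```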

  5. Cross-Milieu Terrorist Collaboration: Using Game Theory to Assess the Risk of a Novel Threat.

    PubMed

    Ackerman, Gary A; Zhuang, Jun; Weerasuriya, Sitara

    2017-02-01

    This article uses a game-theoretic approach to analyze the risk of cross-milieu terrorist collaboration-the possibility that, despite marked ideological differences, extremist groups from very different milieus might align to a degree where operational collaboration against Western societies becomes possible. Based upon theoretical insights drawn from a variety of literatures, a bargaining model is constructed that reflects the various benefits and costs for terrorists' collaboration across ideological milieus. Analyzed in both sequential and simultaneous decision-making contexts and through numerical simulations, the model confirms several theoretical arguments. The most important of these is that although likely to be quite rare, successful collaboration across terrorist milieus is indeed feasible in certain circumstances. The model also highlights several structural elements that might play a larger role than previously recognized in the collaboration decision, including that the prospect of nonmaterial gains (amplification of terror and reputational boost) plays at least as important a role in the decision to collaborate as potential increased capabilities does. Numerical simulation further suggests that prospects for successful collaboration over most scenarios (including operational) increase when a large, effective Islamist terrorist organization initiates collaboration with a smaller right-wing group, as compared with the other scenarios considered. Although the small number of historical cases precludes robust statistical validation, the simulation results are supported by existing empirical evidence of collaboration between Islamists and right- or left-wing extremists. The game-theoretic approach, therefore, provides guidance regarding the circumstances under which such an unholy alliance of violent actors is likely to succeed. © 2016 Society for Risk Analysis.

  6. A synthetic method for atmospheric diffusion simulation and environmental impact assessment of accidental pollution in the chemical industry in a WEBGIS context.

    PubMed

    Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang

    2014-09-05

    The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for an emergency response. With the development of environmental numeric simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, the mature simulation models for chemical accidents were integrated into the WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response, including raw data (meteorological and accident information) management, numeric simulation of different kinds of accidents, environmental impact assessment, and representation of the simulation results, was achieved. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are an organizational mechanism for the model set based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, pollutant substances, and simulation models; and a presentation method for scalar and vector data in the web browser, all integrated on the WEBGIS platform. The outcomes demonstrated that this method could provide effective support for deciding emergency responses to acute chemical accidents.
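
    As a self-contained example of the kind of atmospheric diffusion model such a system might integrate (the abstract does not specify its model set), the sketch below evaluates a standard Gaussian plume at ground level for a continuous point release; the emission rate, wind speed, release height, and dispersion-coefficient fits are illustrative.

```python
# Hedged sketch: ground-level concentration from a continuous point source
# using a standard Gaussian plume with simple power-law dispersion coefficients.
import numpy as np

def gaussian_plume(x, y, q=1.0, u=3.0, h=20.0):
    """x: downwind distance (m), y: crosswind offset (m), q: emission (kg/s),
    u: wind speed (m/s), h: effective release height (m)."""
    sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)     # rough neutral-stability fit
    sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = 2 * np.exp(-h**2 / (2 * sigma_z**2))  # ground level, full reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

x = np.linspace(50, 5000, 200)
c_centerline = gaussian_plume(x, 0.0)
print("max ground-level concentration (kg/m^3):", c_centerline.max(),
      "at x =", x[np.argmax(c_centerline)], "m downwind")
```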

  7. A Synthetic Method for Atmospheric Diffusion Simulation and Environmental Impact Assessment of Accidental Pollution in the Chemical Industry in a WEBGIS Context

    PubMed Central

    Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang

    2014-01-01

    The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for an emergency response. With the development of environmental numeric simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, the mature simulation models for chemical accidents were integrated into the WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response, including raw data (meteorological and accident information) management, numeric simulation of different kinds of accidents, environmental impact assessment, and representation of the simulation results, was achieved. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are an organizational mechanism for the model set based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, pollutant substances, and simulation models; and a presentation method for scalar and vector data in the web browser, all integrated on the WEBGIS platform. The outcomes demonstrated that this method could provide effective support for deciding emergency responses to acute chemical accidents. PMID:25198686

  8. The effects of on-street parking and road environment visual complexity on travel speed and reaction time.

    PubMed

    Edquist, Jessica; Rudin-Brown, Christina M; Lenné, Michael G

    2012-03-01

    On-street parking is associated with elevated crash risk. It is not known how drivers' mental workload and behaviour in the presence of on-street parking contributes to, or fails to reduce, this increased crash risk. On-street parking tends to co-exist with visually complex streetscapes that may affect workload and crash risk in their own right. The present paper reports results from a driving simulator study examining the effects of on-street parking and road environment visual complexity on driver behaviour and surrogate measures of crash risk. Twenty-nine participants drove a simulated urban commercial and arterial route. Compared to sections with no parking bays or empty parking bays, in the presence of occupied parking bays drivers lowered their speed and shifted their lateral position towards roadway centre to compensate for the higher mental workload they reported experiencing. However, this compensation was not sufficient to reduce drivers' reaction time on a safety-relevant peripheral detection task or to an unexpected pedestrian hazard. Compared to the urban road environments, the less visually complex arterial road environment was associated with speeds that were closer to the posted limit, lower speed variability and lower workload ratings. These results support theoretical positions that proffer workload as a mediating variable of speed choice. However, drivers in this study did not modify their speed sufficiently to maintain safe hazard response times in complex environments with on-street parking. This inadequate speed compensation is likely to affect real world crash risk. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Projected Flood Risks in China based on CMIP5

    NASA Astrophysics Data System (ADS)

    Xu, Ying

    2016-04-01

    Based on the simulations from 22 CMIP5 models, in combination with data on population, GDP, arable land, and terrain elevation, the spatial distributions of flood risk levels are calculated and analyzed under RCP8.5 for the baseline period (1986-2005), the near-term future period (2016-2035), the middle-term future period (2046-2065), and the long-term future period (2080-2099). (1) Areas with higher flood hazard risk levels in the future are concentrated in southeastern China, and the areas with risk level III continue to expand. The major changes in flood hazard risks will occur in the middle- and long-term future. (2) In the future, the areas of high vulnerability to flood hazards will be located in China's eastern region. In the middle and late 21st century, the extent of the high vulnerability area will expand eastward and its intensity will gradually increase. The highest vulnerability values are found in the provinces of Beijing, Tianjin, Hebei, Henan, Anhui, Shandong, Shanghai, Jiangsu, and in parts of the Pearl River Delta. Furthermore, the major cities in northeast China, as well as Wuhan, Changsha, and Nanchang, are highly vulnerable. (3) The regions with high flood risk levels will be located in eastern China, in the middle and lower reaches of the Yangtze River, stretching northward to Beijing and Tianjin. High-risk flood areas also occur in major cities in Northeast China, in some parts of Shaanxi and Shanxi, and in some coastal areas in Southeast China. (4) Compared to the baseline period, the high flood risks will increase on a regional level towards the end of the 21st century, although the areas of flood hazards show little variation. In this paper, the projected future flood risks for different periods were analyzed under the RCP8.5 emission scenario. Compared with simulations under the RCP2.6 and RCP4.5 scenarios, the spatial distribution of flood hazard risks is similar, but their intensity is weaker than under RCP8.5. When using simulations from climate model ensembles to project future flood risks, uncertainty arises from various factors, such as the coarse resolution of global climate models, different approaches to flood assessment, the selection of weighting coefficients, the greenhouse gas emission scheme used, and the estimates of future population, GDP, and arable land. Therefore, further analysis is needed to reduce the uncertainties of future flood risks.

  10. Risk Based Reservoir Operations Using Ensemble Streamflow Predictions for Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Mendoza, J.; Whitin, B.; Hartman, R. K.

    2017-12-01

    Ensemble Forecast Operations (EFO) is a risk based approach of reservoir flood operations that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, each member of an ESP is individually modeled to forecast system conditions and calculate risk of reaching critical operational thresholds. Reservoir release decisions are computed which seek to manage forecasted risk to established risk tolerance levels. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability but not increase downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC, which approximates flow forecasts for 61 ensemble members for a 15-day horizon. Model simulation results of the EFO alternative demonstrate a 36% increase in median end of water year (September 30) storage levels over existing operations. Additionally, model results show no increase in occurrence of flows above flood stage for points downstream of Lake Mendocino. This investigation demonstrates that the EFO alternative may be a viable approach for managing Lake Mendocino for multiple purposes (water supply, flood mitigation, ecosystems) and warrants further investigation through additional modeling and analysis.
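
    The core decision step, computing the forecasted risk of reaching a critical threshold from the ensemble and releasing only as much water as needed to keep that risk within tolerance, can be sketched generically as below. The storage numbers, threshold, and tolerance are illustrative and are not Lake Mendocino's actual operating values.

```python
# Hedged sketch of an ensemble-forecast-operations style release decision:
# risk = fraction of ensemble members that would exceed a critical storage.
import numpy as np

rng = np.random.default_rng(6)
n_members, horizon_days = 61, 15

current_storage_af = 95_000                 # acre-feet (illustrative)
flood_pool_af = 111_000                     # critical storage threshold
risk_tolerance = 0.10                       # max tolerated exceedance fraction

# Hypothetical ensemble of 15-day inflow volumes (acre-feet), e.g. from an ESP.
inflow_af = rng.gamma(shape=2.0, scale=6_000.0, size=n_members)

def exceedance_risk(release_af):
    end_storage = current_storage_af + inflow_af - release_af
    return np.mean(end_storage > flood_pool_af)

# Smallest release (searched in 1,000 AF steps) that keeps risk within tolerance.
release = 0.0
while exceedance_risk(release) > risk_tolerance and release < current_storage_af:
    release += 1_000.0

print(f"risk with no release: {exceedance_risk(0.0):.2f}; "
      f"scheduled release: {release:.0f} AF over {horizon_days} days")
```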

  11. Scalable Integrated Multi-Mission Support System (SIMSS) Simulator Release 2.0 for GMSEC

    NASA Technical Reports Server (NTRS)

    Kim, John; Velamuri, Sarma; Casey, Taylor; Bemann, Travis

    2012-01-01

    Scalable Integrated Multi-Mission Support System (SIMSS) Simulator Release 2.0 software is designed to perform a variety of test activities related to spacecraft simulations and ground segment checks. This innovation uses the existing SIMSS framework, which interfaces with the GMSEC (Goddard Mission Services Evolution Center) Application Programming Interface (API) Version 3.0 message middleware, and allows SIMSS to accept GMSEC standard messages via the GMSEC message bus service. SIMSS is a distributed, component-based, plug-and-play client-server system that is useful for performing real-time monitoring and communications testing. SIMSS runs on one or more workstations, and is designed to be user-configurable, or to use predefined configurations for routine operations. SIMSS consists of more than 100 modules that can be configured to create, receive, process, and/or transmit data. The SIMSS/GMSEC innovation is intended to provide missions with a low-cost solution for implementing their ground systems, as well as to significantly reduce a mission s integration time and risk.

  12. Development of cost-effective surfactant flooding technology. Annual report for the period, September 30, 1993--September 29, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.; Sepehrnoori, K.

    1995-08-01

    This research consists of the parallel development of a new chemical flooding simulator and the application of our existing UTCHEM simulation code to model surfactant flooding. The new code is based upon a completely new numerical method that combines for the first time higher-order finite-difference methods, flux limiters, and implicit algorithms. Results indicate that this approach has significant advantages in some problems and will likely enable us to simulate much larger and more realistic chemical floods once it is fully developed. Additional improvements have also been made to the UTCHEM code, and it has been applied to the study of stochastic reservoirs with and without horizontal wells to evaluate methods to reduce the cost and risk of surfactant flooding. During the second year of this contract, we have already made significant progress on both of these tasks and are ahead of schedule on both of them.

  13. Climate change vulnerability for species-Assessing the assessments.

    PubMed

    Wheatley, Christopher J; Beale, Colin M; Bradbury, Richard B; Pearce-Higgins, James W; Critchlow, Rob; Thomas, Chris D

    2017-09-01

    Climate change vulnerability assessments are commonly used to identify species at risk from global climate change, but the wide range of methodologies available makes it difficult for end users, such as conservation practitioners or policymakers, to decide which method to use as a basis for decision-making. In this study, we evaluate whether different assessments consistently assign species to the same risk categories and whether any of the existing methodologies perform well at identifying climate-threatened species. We compare the outputs of 12 climate change vulnerability assessment methodologies, using both real and simulated species, and validate the methods using historic data for British birds and butterflies (i.e. using historical data to assign risks and more recent data for validation). Our results show that the different vulnerability assessment methods are not consistent with one another; different risk categories are assigned for both the real and simulated sets of species. Validation of the different vulnerability assessments suggests that methods incorporating historic trend data into the assessment perform best at predicting distribution trends in subsequent time periods. This study demonstrates that climate change vulnerability assessments should not be used interchangeably due to the poor overall agreement between methods when considering the same species. The results of our validation provide more support for the use of trend-based rather than purely trait-based approaches, although further validation will be required as data become available. © 2017 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  14. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
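
    The essential idea of joint (as opposed to per-pixel) simulation can be illustrated with a small Gaussian random field: draw spatially correlated realizations of prevalence and summarize each realization's regional mean, so the aggregate uncertainty respects the spatial covariance. The grid size, covariance parameters, and logit link below are illustrative only, not the authors' approximating algorithm.

```python
# Hedged sketch: joint geostatistical simulation on a small grid, then
# aggregation of each realization to a regional mean prevalence.
import numpy as np

rng = np.random.default_rng(7)
n_side = 20                                   # 20 x 20 pixel "region"
coords = np.array([(i, j) for i in range(n_side) for j in range(n_side)], float)

# Exponential covariance between pixels (illustrative range and sill).
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
cov = 0.5 * np.exp(-d / 5.0) + 1e-8 * np.eye(len(coords))

L = np.linalg.cholesky(cov)
n_real = 1_000
z = rng.standard_normal((len(coords), n_real))
field = -1.0 + L @ z                          # latent field, mean logit of -1

prevalence = 1.0 / (1.0 + np.exp(-field))     # logit link to (0, 1)
regional_mean = prevalence.mean(axis=0)       # aggregate each joint realization

lo, hi = np.percentile(regional_mean, [2.5, 97.5])
print(f"regional mean prevalence: {regional_mean.mean():.3f} "
      f"(95% interval {lo:.3f}-{hi:.3f})")
```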

  15. Statistical surrogate models for prediction of high-consequence climate change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM is different from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  16. Identifying Causal Variants at Loci with Multiple Signals of Association

    PubMed Central

    Hormozdiari, Farhad; Kostem, Emrah; Kang, Eun Yong; Pasaniuc, Bogdan; Eskin, Eleazar

    2014-01-01

    Although genome-wide association studies have successfully identified thousands of risk loci for complex traits, only a handful of the biologically causal variants, responsible for association at these loci, have been successfully identified. Current statistical methods for identifying causal variants at risk loci either use the strength of the association signal in an iterative conditioning framework or estimate probabilities for variants to be causal. A main drawback of existing methods is that they rely on the simplifying assumption of a single causal variant at each risk locus, which is typically invalid at many risk loci. In this work, we propose a new statistical framework that allows for the possibility of an arbitrary number of causal variants when estimating the posterior probability of a variant being causal. A direct benefit of our approach is that we predict a set of variants for each locus that under reasonable assumptions will contain all of the true causal variants with a high confidence level (e.g., 95%) even when the locus contains multiple causal variants. We use simulations to show that our approach provides 20–50% improvement in our ability to identify the causal variants compared to the existing methods at loci harboring multiple causal variants. We validate our approach using empirical data from an expression QTL study of CHI3L2 to identify new causal variants that affect gene expression at this locus. CAVIAR is publicly available online at http://genetics.cs.ucla.edu/caviar/. PMID:25104515
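
    The sketch below captures the core idea in Python: score every causal configuration of up to k variants under a multivariate normal model of the observed association z-scores, convert the scores to posterior probabilities, and grow a variant set until it covers 95% of the posterior mass. The LD matrix, prior and non-centrality parameter are illustrative assumptions, and this is not the released CAVIAR implementation.

        from itertools import combinations
        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(2)
        m, k, ncp = 10, 2, 5.0                    # variants, max causal per locus, assumed NCP

        # Assumed LD matrix and simulated z-scores with causal variants 0 and 6.
        ld = 0.6 ** np.abs(np.subtract.outer(np.arange(m), np.arange(m)))
        truth = np.zeros(m)
        truth[[0, 6]] = 1.0
        z = rng.multivariate_normal(ld @ (ncp * truth), ld)

        post = {}
        for size in range(k + 1):
            for conf in combinations(range(m), size):
                c = np.zeros(m)
                c[list(conf)] = 1.0
                like = multivariate_normal.pdf(z, mean=ld @ (ncp * c), cov=ld)
                prior = (0.01 ** size) * (0.99 ** (m - size))   # assumed per-variant prior
                post[conf] = like * prior
        total = sum(post.values())

        # Marginal posterior probability that each variant is causal.
        marg = np.array([sum(p for conf, p in post.items() if i in conf)
                         for i in range(m)]) / total

        # Greedily build a 95% confidence causal set.
        conf_mass = {conf: p / total for conf, p in post.items()}
        causal_set, covered = [], 0.0
        for i in np.argsort(marg)[::-1]:
            causal_set.append(int(i))
            covered = sum(p for conf, p in conf_mass.items()
                          if set(conf) <= set(causal_set))
            if covered >= 0.95:
                break
        print("95% causal set:", sorted(causal_set), "coverage %.3f" % covered)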

  17. Identifying causal variants at loci with multiple signals of association.

    PubMed

    Hormozdiari, Farhad; Kostem, Emrah; Kang, Eun Yong; Pasaniuc, Bogdan; Eskin, Eleazar

    2014-10-01

    Although genome-wide association studies have successfully identified thousands of risk loci for complex traits, only a handful of the biologically causal variants, responsible for association at these loci, have been successfully identified. Current statistical methods for identifying causal variants at risk loci either use the strength of the association signal in an iterative conditioning framework or estimate probabilities for variants to be causal. A main drawback of existing methods is that they rely on the simplifying assumption of a single causal variant at each risk locus, which is typically invalid at many risk loci. In this work, we propose a new statistical framework that allows for the possibility of an arbitrary number of causal variants when estimating the posterior probability of a variant being causal. A direct benefit of our approach is that we predict a set of variants for each locus that under reasonable assumptions will contain all of the true causal variants with a high confidence level (e.g., 95%) even when the locus contains multiple causal variants. We use simulations to show that our approach provides 20-50% improvement in our ability to identify the causal variants compared to the existing methods at loci harboring multiple causal variants. We validate our approach using empirical data from an expression QTL study of CHI3L2 to identify new causal variants that affect gene expression at this locus. CAVIAR is publicly available online at http://genetics.cs.ucla.edu/caviar/. Copyright © 2014 by the Genetics Society of America.

  18. Combining inferences from models of capture efficiency, detectability, and suitable habitat to classify landscapes for conservation of threatened bull trout

    USGS Publications Warehouse

    Peterson, J.; Dunham, J.B.

    2003-01-01

    Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult-to-sample species, and models of species presence may produce biased predictions. We present a Bayesian approach that combines sampling and model-based inferences for estimating species presence. The accuracy and cost-effectiveness of this approach were compared to those of sampling surveys and predictive models for estimating the presence of the threatened bull trout (Salvelinus confluentus) via simulation with existing models and empirical sampling data. Simulations indicated that a sampling-only approach would be the most effective and would result in the lowest presence and absence misclassification error rates for three thresholds of detection probability. When sampling effort was considered, however, the combined approach resulted in the lowest error rates per unit of sampling effort. Hence, lower probability-of-detection thresholds can be specified with the combined approach, resulting in lower misclassification error rates and improved cost-effectiveness.
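
    A minimal sketch of the combination step, under an assumed simple form of the detection model (constant per-pass detectability and independent passes): the habitat model supplies a prior probability of presence, and repeated non-detections update it downward via Bayes' rule.

        def posterior_presence(prior, p_detect, passes, detected):
            """P(species present at a site | sampling outcome)."""
            if detected:
                return 1.0                              # a detection confirms presence
            p_miss_all = (1.0 - p_detect) ** passes     # present but never detected
            return prior * p_miss_all / (prior * p_miss_all + (1.0 - prior))

        # Example: habitat model gives 60% prior presence; three passes with 40%
        # per-pass detectability find no fish.
        print(round(posterior_presence(0.60, 0.40, 3, False), 3))   # ~0.245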

  19. Wildfire exposure and fuel management on western US national forests.

    PubMed

    Ager, Alan A; Day, Michelle A; McHugh, Charles W; Short, Karen; Gilbertson-Day, Julie; Finney, Mark A; Calkin, David E

    2014-12-01

    Substantial investments in fuel management activities on national forests in the western US are part of a national strategy to reduce human and ecological losses from catastrophic wildfire and create fire resilient landscapes. Prioritizing these investments within and among national forests remains a challenge, partly because a comprehensive assessment that establishes the current wildfire risk and exposure does not exist, making it difficult to identify national priorities and target specific areas for fuel management. To gain a broader understanding of wildfire exposure in the national forest system, we analyzed an array of simulated and empirical data on wildfire activity and fuel treatment investments on the 82 western US national forests. We first summarized recent fire data to examine variation among the Forests in ignition frequency and burned area in relation to investments in fuel reduction treatments. We then used simulation modeling to analyze fine-scale spatial variation in burn probability and intensity. We also estimated the probability of a mega-fire event on each of the Forests, and the transmission of fires ignited on national forests to the surrounding urban interface. The analysis showed a good correspondence between recent area burned and predictions from the simulation models. The modeling also illustrated the magnitude of the variation in both burn probability and intensity among and within Forests. Simulated burn probabilities in most instances were lower than historical, reflecting fire exclusion on many national forests. Simulated wildfire transmission from national forests to the urban interface was highly variable among the Forests. We discuss how the results of the study can be used to prioritize investments in hazardous fuel reduction within a comprehensive multi-scale risk management framework. Published by Elsevier Ltd.
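
    For readers unfamiliar with how burn probability surfaces are typically derived from stochastic wildfire simulation output, the toy sketch below counts, per pixel, the fraction of simulated fire seasons in which that pixel burned. The random elliptical fire footprints are purely illustrative stand-ins; the simulation systems used in the study are far more involved.

        import numpy as np

        rng = np.random.default_rng(3)
        n_seasons, nrow, ncol = 1000, 50, 50

        burn_count = np.zeros((nrow, ncol))
        for _ in range(n_seasons):
            season = np.zeros((nrow, ncol), dtype=bool)
            for _ in range(rng.poisson(2)):            # a few fires per simulated season
                r0, c0 = rng.integers(nrow), rng.integers(ncol)
                rr, cc = np.ogrid[:nrow, :ncol]
                season |= ((rr - r0) ** 2 / 40 + (cc - c0) ** 2 / 15) <= 1.0
            burn_count += season

        burn_probability = burn_count / n_seasons
        print("max pixel burn probability: %.3f" % burn_probability.max())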

  20. Exploring the population-level impact of antiretroviral treatment: the influence of baseline intervention context.

    PubMed

    Mishra, Sharmistha; Mountain, Elisa; Pickles, Michael; Vickerman, Peter; Shastri, Suresh; Gilks, Charles; Dhingra, Nandini K; Washington, Reynold; Becker, Marissa L; Blanchard, James F; Alary, Michel; Boily, Marie-Claude

    2014-01-01

    To compare the potential population-level impact of expanding antiretroviral treatment (ART) in HIV epidemics concentrated among female sex workers (FSWs) and clients, with and without existing condom-based FSW interventions. Mathematical model of heterosexual HIV transmission in south India. We simulated HIV epidemics in three districts to assess the 10-year impact of existing ART programs (ART eligibility at CD4 cell count ≤350) beyond that achieved with high condom use, and the incremental benefit of expanding ART by either increasing ART eligibility, improving access to care, or prioritizing ART expansion to FSWs/clients. Impact was estimated in the total population (including FSWs and clients). In the presence of existing condom-based interventions, existing ART programs (medium-to-good coverage) were predicted to avert 11-28% of remaining HIV infections between 2014 and 2024. Increasing eligibility to all risk groups prevented an incremental 1-15% over existing ART programs, compared with 29-53% when maximizing access to all risk groups. If there was no condom-based intervention, and only poor ART coverage, then expanding ART prevented a larger absolute number but a smaller relative fraction of HIV infections for every additional person-year of ART. Across districts and baseline interventions, for every additional person-year of treatment, prioritizing access to FSWs was most efficient (and resource saving), followed by prioritizing access to FSWs and clients. The relative and absolute benefit of ART expansion depends on baseline condom use, ART coverage, and epidemic size. In south India, maximizing FSWs' access to care, followed by maximizing clients' access are the most efficient ways to expand ART for HIV prevention, across baseline intervention context.
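
    The sketch below is a deliberately stripped-down deterministic transmission model in the same spirit: two interacting groups (FSWs and clients), with condom use and ART coverage scaling the force of infection, compared over ten years with and without ART expansion. The structure and every parameter value are illustrative assumptions, far simpler than the district-level models used in the study.

        def run(art_cov, condom_use, years=10, dt=0.01):
            # Proportions susceptible/infected among FSWs (f) and clients (c).
            Sf, If, Sc, Ic = 0.7, 0.3, 0.85, 0.15
            beta = 0.0045 * 250                        # per-act risk x contacts per year (assumed)
            eff_condom, eff_art = 0.8, 0.9             # assumed efficacies
            cum_inf = 0.0
            for _ in range(int(years / dt)):
                reduction = (1 - eff_condom * condom_use) * (1 - eff_art * art_cov)
                foi_f = beta * reduction * Ic / (Sc + Ic)   # force of infection on FSWs
                foi_c = beta * reduction * If / (Sf + If)   # force of infection on clients
                new_f, new_c = foi_f * Sf * dt, foi_c * Sc * dt
                Sf, If = Sf - new_f, If + new_f
                Sc, Ic = Sc - new_c, Ic + new_c
                cum_inf += new_f + new_c
            return cum_inf

        base = run(art_cov=0.3, condom_use=0.8)
        expanded = run(art_cov=0.8, condom_use=0.8)
        print("infections averted by ART expansion: %.1f%%" % (100 * (1 - expanded / base)))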

  1. Dual-Spool Turbine Facility Design Overview

    NASA Technical Reports Server (NTRS)

    Giel, Paul; Pachlhofer, Pete

    2003-01-01

    The next generation of aircraft engines, both commercial and military, will attempt to capitalize on the benefits of close-coupled, vaneless, counter-rotating turbine systems. Experience has shown that significant risks and challenges are present with close-coupled systems in terms of efficiency and durability. The UEET program needs to demonstrate aerodynamic loading and efficiency goals for close-coupled, reduced-stage HP/LP turbine systems as a Level 1 Milestone for FY05. No research facility exists in the U.S. to provide risk reduction for successful development of close-coupled, high and low pressure turbine systems for the next generations of engines. To meet these objectives, the design, construction, and integrated systems testing of a Dual-Spool Turbine Facility (DSTF) has been initiated at the NASA Glenn Research Center. The facility will be a warm (~1000 °F), continuous-flow facility for overall aerodynamic performance and detailed flow field measurement acquisition. The facility will have state-of-the-art instrumentation to capture flow physics details. Accurate and reliable speed control will be achieved by utilizing the existing Variable Frequency Drive System. Utilization of this and other existing GRC centralized utilities will reduce the overall construction costs. The design allows for future installation of a turbine inlet combustor profile simulator. This presentation details the objectives of the facility and the concepts used in specifying its capabilities. Some preliminary design results will be presented along with a discussion of plans and schedules.

  2. System-Level Reuse of Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Hazen, Michael R.; Williams, Joseph C.

    2004-01-01

    One of the best ways to enhance space systems simulation fidelity is to leverage off of (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is in a different coding language or other barriers arise that make one want to just start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. Case studies include (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency- provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss simulation reuse candidates based on an apparent lack of suitability. A more careful examination based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates revealed significant reuse potential. Specific steps used to conduct the detailed assessments are discussed. The steps include the following: 1) Identifying reuse candidates; 2) Requirements compatibility assessment; 3) Maturity assessment; 4) Life-cycle cost determination; and 5) Risk assessment. Observations and conclusions are presented related to the real cost of system-level simulation component reuse. Finally, lessons learned that relate to maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable for use in the development of space systems simulations in the future.

  3. The EMIR experience in the use of software control simulators to speed up the time to telescope

    NASA Astrophysics Data System (ADS)

    Lopez Ramos, Pablo; López-Ruiz, J. C.; Moreno Arce, Heidy; Rosich, Josefina; Perez Menor, José Maria

    2012-09-01

    One of the main problems facing development teams working on instrument control systems is the need to access mechanisms which are not available until well into the integration phase. The need to work with real hardware creates additional problems: among others, certain faults cannot be tested due to the possibility of hardware damage, taking the system to its limits may shorten its operational lifespan, and the full system may not be available during some periods due to maintenance and/or testing of individual components. These problems can be addressed with the use of simulators and by applying software/hardware standards. Since information on the construction and performance of electro-mechanical systems is available at relatively early stages of the project, simulators are developed in advance (before the mechanism exists) or, if conventions and standards have been correctly followed, a previously developed simulator might be reused. This article describes our experience in building software simulators and the main advantages we have identified: the control software can be developed even in the absence of real hardware, critical tests can be prepared using the simulated systems, system behavior can be tested for hardware failure situations that would pose a risk to the real system, and in-house integration of the entire instrument is sped up. The use of simulators allows us to reduce development, testing and integration time.

  4. Assessing the impact of future climate extremes on the US corn and soybean production

    NASA Astrophysics Data System (ADS)

    Jin, Z.

    2015-12-01

    Future climate changes will pose major challenges to the US agricultural system, among which increasing heat stress and precipitation variability are the two major concerns. Reliable prediction of crop production in response to increasingly frequent and severe extreme climate events is a prerequisite for developing adaptive strategies for agricultural risk management. However, progress has been slow in quantifying the uncertainty of computational predictions at high spatial resolutions. Here we assessed the risks of future climate extremes on US corn and soybean production using the Agricultural Production Systems sIMulator (APSIM) model under different climate scenarios. To quantify the uncertainty due to conceptual representations of heat, drought and flooding stress in crop models, we proposed a new strategy of algorithm ensembles in which different methods for simulating crop responses to those extreme climatic events were incorporated into APSIM. This strategy allowed us to set aside irrelevant structural differences among existing crop models and focus only on the processes of interest. Future climate inputs were derived from high-spatial-resolution (12 km × 12 km) Weather Research and Forecasting (WRF) simulations under Representative Concentration Pathways 4.5 (RCP 4.5) and 8.5 (RCP 8.5). Based on crop model simulations, we analyzed the magnitude and frequency of heat, drought and flooding stress for the 21st century. We also evaluated the water use efficiency and water deficit on regional scales if farmers were to boost their yield by applying more fertilizers. Finally, we proposed spatially explicit adaptation strategies for irrigation and fertilization for different management zones.

  5. Nest predation risk influences a cavity-nesting passerine during the post-hatching care period

    PubMed Central

    Yoon, Jongmin; Kim, Byung-Su; Joo, Eun-Jin; Park, Shi-Ryong

    2016-01-01

    Some nest predators visually assess parental activities to locate a prey nest, whereas parents modify fitness-related traits to reduce the probability of nest predation, and/or nestlings fledge early to escape the risky nest environment. Here, we experimentally tested if the parental and fledging behaviours of oriental tits (Parus minor) that bred in the nest-box varied with cavity conditions associated with nest predation risk during the nestling period. The entrance of experimental nest-boxes was enlarged to create a long-term risk soon after clutch completion. A short-term risk, using simulated playbacks with a coexisting control bird and avian nest predator sound, was simultaneously applied to the nest-boxes whether or not the long-term risk existed. We found that the parents reduced their hourly feeding trips, and the nestlings fledged early with the long-term risk, although the nest mortality of the two nest-box types was low and did not differ. While this study presents a portion of prey–predator interactions with the associated uncertainties, our results highlight that the entrance size of cavities for small hole-nesting birds may play an important role in determining their fitness-related traits depending upon the degree of perceived risk of nest predation. PMID:27553176

  6. Nest predation risk influences a cavity-nesting passerine during the post-hatching care period.

    PubMed

    Yoon, Jongmin; Kim, Byung-Su; Joo, Eun-Jin; Park, Shi-Ryong

    2016-08-24

    Some nest predators visually assess parental activities to locate a prey nest, whereas parents modify fitness-related traits to reduce the probability of nest predation, and/or nestlings fledge early to escape the risky nest environment. Here, we experimentally tested if the parental and fledging behaviours of oriental tits (Parus minor) that bred in the nest-box varied with cavity conditions associated with nest predation risk during the nestling period. The entrance of experimental nest-boxes was enlarged to create a long-term risk soon after clutch completion. A short-term risk, using simulated playbacks with a coexisting control bird and avian nest predator sound, was simultaneously applied to the nest-boxes whether or not the long-term risk existed. We found that the parents reduced their hourly feeding trips, and the nestlings fledged early with the long-term risk, although the nest mortality of the two nest-box types was low and did not differ. While this study presents a portion of prey-predator interactions with the associated uncertainties, our results highlight that the entrance size of cavities for small hole-nesting birds may play an important role in determining their fitness-related traits depending upon the degree of perceived risk of nest predation.

  7. Assessing Risks to Sea Otters and the Exxon Valdez Oil Spill: New Scenarios, Attributable Risk, and Recovery

    PubMed Central

    Harwell, Mark A.; Gentile, John H.

    2014-01-01

    The Exxon Valdez oil spill occurred more than two decades ago, and the Prince William Sound ecosystem has essentially recovered. Nevertheless, discussion continues on whether or not localized effects persist on sea otters (Enhydra lutris) at northern Knight Island (NKI) and, if so, what are the associated attributable risks. A recent study estimated new rates of sea otter encounters with subsurface oil residues (SSOR) from the oil spill. We previously demonstrated that a potential pathway existed for exposures to polycyclic aromatic hydrocarbons (PAHs) and conducted a quantitative ecological risk assessment using an individual-based model that simulated this and other plausible exposure pathways. Here we quantitatively update the potential for this exposure pathway to constitute an ongoing risk to sea otters using the new estimates of SSOR encounters. Our conservative model predicted that the assimilated doses of PAHs to the 1-in-1000th most-exposed sea otters would remain 1–2 orders of magnitude below the chronic effects thresholds. We re-examine the baseline estimates, post-spill surveys, recovery status, and attributable risks for this subpopulation. We conclude that the new estimated frequencies of encountering SSOR do not constitute a plausible risk for sea otters at NKI and these sea otters have fully recovered from the oil spill. PMID:24587690

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Richard S.; Carlson, Thomas J.; Welch, Abigail E.

    A multifactor study was conducted by Battelle for the US Army Corps of Engineers to assess the significance of the presence of a radio telemetry transmitter on the effects of rapid decompression from simulated hydro turbine passage on depth-acclimated juvenile run-of-the-river Chinook salmon. Study factors were: (1) juvenile Chinook salmon age, subyearling or yearling; (2) radio transmitter present or absent; (3) three transmitter implantation factors: gastric, surgical, and no transmitter; and (4) four acclimation depth factors: 1, 10, 20, and 40 foot submergence equivalent absolute pressure, for a total of 48 unique treatments. Exposed fish were examined for changes in behavior, presence or absence of barotrauma injuries, and immediate or delayed mortality. Logistic models were used to test hypotheses that addressed study objectives. The presence of a radio transmitter was found to significantly increase the risk of barotrauma injury and mortality on exposure to rapid decompression. Gastric implantation was found to present a higher risk than surgical implantation. Fish were exposed within 48 hours of transmitter implantation, so surgical incisions were not completely healed. The difference in results obtained for gastric and surgical implantation methods may be the result of study design, and the results may have been different if tested fish had completely healed surgical wounds. However, the test did simulate the typical surgical-release time frame for in-river telemetry studies of fish survival, so the results are probably representative for fish passing through a turbine shortly following release into the river. The finding of a significant difference in response to rapid decompression between fish bearing radio transmitters and those without implies that a bias may exist in estimates of turbine passage survival obtained using radio telemetry. However, the rapid decompression (simulated turbine passage) conditions used for the study represented near worst-case exposure for fish passing through turbines. At this time, insufficient data exist about the distribution of river-run fish entering turbines, and particularly the distribution of fish passing through turbine runners, to extrapolate study findings to the population of fish passing through FCRPS turbines. This is the first rapid decompression study to include acclimation depth as an experimental factor for physostomous fish. We found that fish acclimated to deeper depths were significantly more vulnerable to barotrauma injury and death. Insufficient information about the distribution of fish entering turbines and their depth acclimation currently exists to extrapolate these findings to the population of fish passing through turbines. However, the risk of barotrauma for turbine-passed fish could be particularly high for subyearling Chinook salmon that migrate downstream at deeper depths late in the early summer portion of the outmigration. Barotrauma injuries led to immediate mortality, delayed mortality, and potential mortality due to increased susceptibility to predation resulting from loss of equilibrium or swim bladder rupture.

  9. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage.
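
    A minimal sketch of the kind of voltage-dependent empirical model described, with assumed (not the authors') bridged-whisker test data: a logistic curve is fitted for the probability that a bridging whisker actually produces a short circuit at a given applied voltage.

        import numpy as np
        from scipy.optimize import minimize

        # Assumed experiment: at each voltage, n whisker bridges tested, k shorts observed.
        volts   = np.array([ 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
        n_test  = np.array([  40,   40,   40,   40,   40,   40])
        n_short = np.array([   2,    6,   14,   25,   33,   38])

        def neg_log_lik(beta):
            p = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * volts)))
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(n_short * np.log(p) + (n_test - n_short) * np.log(1.0 - p))

        fit = minimize(neg_log_lik, x0=np.array([-3.0, 0.2]), method="Nelder-Mead")
        b0, b1 = fit.x
        for v in (12.0, 24.0, 48.0):
            p = 1.0 / (1.0 + np.exp(-(b0 + b1 * v)))
            print("P(short | %4.1f V) = %.3f" % (v, p))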

  10. Process Control Migration of 50 LPH Helium Liquefier

    NASA Astrophysics Data System (ADS)

    Panda, U.; Mandal, A.; Das, A.; Behera, M.; Pal, Sandip

    2017-02-01

    Two helium liquefiers/refrigerators are operational at VECC, one of which is dedicated to the Superconducting Cyclotron. The first helium liquefier, of 50 LPH capacity from Air Liquide, has already completed fifteen years of operation without any major trouble. This liquefier is controlled by a Eurotherm PC3000 PLC, which has been obsolete for the last seven years or so. Though we can still manage to run the PLC system with existing spares, the risk of interrupted operation remains due to the unavailability of spares. In order to eliminate this risk, an equivalent PLC control system based on the Siemens S7-300 was conceived. For smooth migration, all programming was done keeping the same field input and output interface, nomenclature and graphset. The new program is a mix of S7-300 Graph, STL and LAD languages. One-to-one program verification of the entire process graph was done manually. The complete program was run in simulation mode. A Matlab mathematical model was also used for plant control simulations. EPICS-based SCADA was used for process monitoring. As of now, the entire hardware and software are ready for direct replacement with the minimum required setup time.

  11. Prioritization Risk Integration Simulation Model (PRISM) For Environmental Remediation and Waste Management - 12097

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pentz, David L.; Stoll, Ralph H.; Greeves, John T.

    2012-07-01

    PRISM (the Prioritization Risk Integration Simulation Model) is a computer model developed to support the Department of Energy's Office of Environmental Management (DOE-EM) in its mission to clean up the environmental legacy from the Nation's nuclear weapons materials production complex. PRISM provides a comprehensive, fully integrated planning tool that can tie together DOE-EM's projects. It is designed to help DOE managers develop sound, risk-informed business practices and defend program decisions. It provides a better ability to understand and manage programmatic risks. The underlying concept for PRISM is that DOE-EM 'owns' a portfolio of environmental legacy obligations (ELOs), and that its mission is to transform the ELOs from their current conditions to acceptable conditions, in the most effective way possible. There are many types of ELOs - contaminated soils and groundwater plumes, disused facilities awaiting D&D, and various types of wastes waiting for processing or disposal. For a given suite of planned activities, PRISM simulates the outcomes as they play out over time, allowing for all key identified uncertainties and risk factors. Each contaminated building, land area and waste stream is tracked from cradle to grave, and all of the linkages affecting different waste streams are captured. The progression of the activities is fully dynamic, reflecting DOE-EM's prioritization approaches, precedence requirements, available funding, and the consequences of risks and uncertainties. The top level of PRISM is the end-user interface that allows rapid evaluation of alternative scenarios and viewing of the results in a variety of useful ways. PRISM is a fully probabilistic model, allowing the user to specify uncertainties in input data (such as the magnitude of an existing groundwater plume, or the total cost to complete a planned activity) as well as specific risk events that might occur. PRISM is based on the GoldSim software that is widely used for risk and performance assessment calculations. PRISM can be run in a deterministic mode, which quickly provides an estimate of the most likely results of a given plan. Alternatively, the model can be run probabilistically in a Monte Carlo mode, exploring the risks and uncertainties in the system and producing probability distributions for the different performance measures. The PRISM model demonstrates how EM can evaluate a portfolio of ELOs and transform them from their current conditions to acceptable conditions using different strategic approaches. This scope of work for the PRISM process and the development of a dynamic simulation model are a logical extension of the GoldSim simulation software used by OCRWM to assess the long-term performance of the Yucca Mountain Project and by NNSA to assess project risk at its sites. Systems integration modeling will promote better understanding of all project risks, technical and nontechnical, and more defensible decision-making for complex projects with significant uncertainties. It can provide effective visual communication and rapid adaptation during interactions with stakeholders (Administration, Congress, State, Local, and NGO). It will also allow rapid assessment of alternative management approaches. (authors)

  12. Refining area of occupancy to address the modifiable areal unit problem in ecology and conservation.

    PubMed

    Moat, Justin; Bachman, Steven P; Field, Richard; Boyd, Doreen S

    2018-05-23

    The 'modifiable areal unit problem' is prevalent across many aspects of spatial analysis within ecology and conservation. The problem is particularly manifest when calculating metrics for extinction risk estimation, for example, area of occupancy (AOO). Although embedded into the International Union for the Conservation of Nature (IUCN) Red List criteria, AOO is often not used or is poorly applied. Here we evaluate new and existing methods for calculating AOO from occurrence records and present a method for determining the minimum AOO using a uniform grid. We evaluate the grid cell shape, grid origin and grid rotation with both real-world and simulated data, reviewing the effects on AOO values, and possible impacts for species already assessed on the IUCN Red List. We show that AOO can vary by up to 80% and a ratio of cells to points of 1:1.21 gives the maximum variation in the number of occupied cells. These findings potentially impact 3% of existing species on the IUCN Red List, as well as species not yet assessed. We show that a new method that combines both grid rotation and moving grid origin gives fast, robust and reproducible results and, in the majority of cases, achieves the minimum AOO. As well as reporting minimum AOO, we outline a confidence interval which should be incorporated into existing tools that support species risk assessment. We also make further recommendations for reporting AOO and other areal measurements within ecology, leading to more robust methods for future species risk assessment. This article is protected by copyright. All rights reserved. © 2018 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
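
    A minimal sketch of the minimum-AOO search described above, assuming 2 km x 2 km cells and a brute-force scan over grid origins and rotations; this is not the authors' published tool, and the occurrence coordinates are simulated.

        import numpy as np

        def occupied_cells(points, cell=2000.0, ox=0.0, oy=0.0):
            idx = np.floor((points - [ox, oy]) / cell).astype(int)
            return len({tuple(r) for r in idx})

        def min_aoo(points, cell=2000.0, n_shift=10, n_rot=36):
            best = np.inf
            for theta in np.linspace(0.0, np.pi / 2, n_rot, endpoint=False):
                c, s = np.cos(theta), np.sin(theta)
                rot = points @ np.array([[c, -s], [s, c]])
                for ox in np.linspace(0.0, cell, n_shift, endpoint=False):
                    for oy in np.linspace(0.0, cell, n_shift, endpoint=False):
                        best = min(best, occupied_cells(rot, cell, ox, oy))
            return best * cell ** 2 / 1e6              # km^2

        rng = np.random.default_rng(4)
        occurrences = rng.uniform(0, 20000, size=(30, 2))   # assumed records, metres
        print("minimum AOO: %.0f km^2" % min_aoo(occurrences))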

  13. Using Computational Approaches to Improve Risk-Stratified Patient Management: Rationale and Methods

    PubMed Central

    Stone, Bryan L; Sakaguchi, Farrant; Sheng, Xiaoming; Murtaugh, Maureen A

    2015-01-01

    Background Chronic diseases affect 52% of Americans and consume 86% of health care costs. A small portion of patients consume most health care resources and costs. More intensive patient management strategies, such as case management, are usually more effective at improving health outcomes, but are also more expensive. To use limited resources efficiently, risk stratification is commonly used in managing patients with chronic diseases, such as asthma, chronic obstructive pulmonary disease, diabetes, and heart disease. Patients are stratified based on predicted risk with patients at higher risk given more intensive care. The current risk-stratified patient management approach has 3 limitations resulting in many patients not receiving the most appropriate care, unnecessarily increased costs, and suboptimal health outcomes. First, using predictive models for health outcomes and costs is currently the best method for forecasting individual patient’s risk. Yet, accuracy of predictive models remains poor causing many patients to be misstratified. If an existing model were used to identify candidate patients for case management, enrollment would miss more than half of those who would benefit most, but include others unlikely to benefit, wasting limited resources. Existing models have been developed under the assumption that patient characteristics primarily influence outcomes and costs, leaving physician characteristics out of the models. In reality, both characteristics have an impact. Second, existing models usually give neither an explanation why a particular patient is predicted to be at high risk nor suggestions on interventions tailored to the patient’s specific case. As a result, many high-risk patients miss some suitable interventions. Third, thresholds for risk strata are suboptimal and determined heuristically with no quality guarantee. Objective The purpose of this study is to improve risk-stratified patient management so that more patients will receive the most appropriate care. Methods This study will (1) combine patient, physician profile, and environmental variable features to improve prediction accuracy of individual patient health outcomes and costs; (2) develop the first algorithm to explain prediction results and suggest tailored interventions; (3) develop the first algorithm to compute optimal thresholds for risk strata; and (4) conduct simulations to estimate outcomes of risk-stratified patient management for various configurations. The proposed techniques will be demonstrated on a test case of asthma patients. Results We are currently in the process of extracting clinical and administrative data from an integrated health care system’s enterprise data warehouse. We plan to complete this study in approximately 5 years. Conclusions Methods developed in this study will help transform risk-stratified patient management for better clinical outcomes, higher patient satisfaction and quality of life, reduced health care use, and lower costs. PMID:26503357

  14. When has service provision for transient ischaemic attack improved enough? A discrete event simulation economic modelling study.

    PubMed

    Barton, Pelham; Sheppard, James P; Penaloza-Ramos, Cristina M; Jowett, Sue; Ford, Gary A; Lasserson, Daniel; Mant, Jonathan; Mellor, Ruth M; Quinn, Tom; Rothwell, Peter M; Sandler, David; Sims, Don; McManus, Richard J

    2017-11-25

    The aim of this study was to examine the impact of transient ischaemic attack (TIA) service modification in two hospitals on costs and clinical outcomes. A discrete event simulation model was built using data from routine electronic health records from 2011. Patients with suspected TIA were followed from symptom onset to presentation, referral to specialist clinics, treatment and subsequent stroke. Comparisons included the existing versus the previous service (fewer same-day clinics) and a hypothetical service reconfiguration (a 7-day service with fewer clinics available per day). The primary outcome of the model was the prevalence of major stroke after TIA. Secondary outcomes included service costs (including those of treating subsequent stroke), time to treatment and attainment of national targets for service provision (proportion of high-risk patients (according to ABCD2 score) seen within 24 hours). The estimated costs of previous service provision for 490 patients (aged 74±12 years, 48.9% female and 23.6% high risk) per year at each site were £340 000 and £368 000, respectively. This resulted in 31% of high-risk patients seen within 24 hours of referral (47/150) with a median time from referral to clinic attendance/treatment of 1.15 days (IQR 0.93-2.88). The costs associated with the existing and hypothetical services decreased by £5000 at one site and increased by £21 000 at the other site. Target attainment improved to 79% (118/150). However, the median time to clinic attendance was only reduced to 0.85 days (IQR 0.17-0.99) and thus no appreciable impact on the modelled incidence of major stroke was observed (10.7 per year, 99% CI 10.5 to 10.9 (previous service) vs 10.6 per year, 99% CI 10.4 to 10.8 (existing service)). Reconfiguration of services for TIA is effective at increasing target attainment, but in services which are already working efficiently (treating patients within 1-2 days), it has little estimated impact on clinical outcomes and increased investment may not be worthwhile. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Equity venture capital platform model based on complex network

    NASA Astrophysics Data System (ADS)

    Guo, Dongwei; Zhang, Lanshu; Liu, Miao

    2018-05-01

    This paper uses small-world and random networks to simulate the relationships among investors and constructs a network model of an equity venture capital platform, exploring the impact of the fraud rate and the bankruptcy rate on the robustness of the network model while also observing how the average path length and the average agglomeration coefficient of the investor relationship network affect the income of the network model. The study found that a fraud rate or bankruptcy rate exceeding a certain threshold leads to network collapse; the bankruptcy rate has a great influence on the income of the platform; a risk premium exists, and the average return is better within a certain range of bankruptcy risk; and the structure of the investor relationship network has no effect on the income of the investment model.

  16. Estimation of Recurrence of Colorectal Adenomas with Dependent Censoring Using Weighted Logistic Regression

    PubMed Central

    Hsu, Chiu-Hsieh; Li, Yisheng; Long, Qi; Zhao, Qiuhong; Lance, Peter

    2011-01-01

    In colorectal polyp prevention trials, estimation of the rate of recurrence of adenomas at the end of the trial may be complicated by dependent censoring, that is, time to follow-up colonoscopy and dropout may be dependent on time to recurrence. Assuming that the auxiliary variables capture the dependence between recurrence and censoring times, we propose to fit two working models with the auxiliary variables as covariates to define risk groups and then extend an existing weighted logistic regression method for independent censoring to each risk group to accommodate potential dependent censoring. In a simulation study, we show that the proposed method results in both a gain in efficiency and reduction in bias for estimating the recurrence rate. We illustrate the methodology by analyzing a recurrent adenoma dataset from a colorectal polyp prevention trial. PMID:22065985

  17. FlySec: a risk-based airport security management system based on security as a service concept

    NASA Astrophysics Data System (ADS)

    Kyriazanos, Dimitris M.; Segou, Olga E.; Zalonis, Andreas; Thomopoulos, Stelios C. A.

    2016-05-01

    Complementing the ACI/IATA efforts, the FLYSEC European H2020 Research and Innovation project (http://www.fly-sec.eu/) aims to develop and demonstrate an innovative, integrated and end-to-end airport security process for passengers, enabling a guided and streamlined procedure from the landside to airside and into the boarding gates, and offering an operationally validated innovative concept for end-to-end aviation security. Through a well-structured work plan, FLYSEC's ambition translates into: (i) innovative processes facilitating risk-based screening; (ii) deployment and integration of new technologies and repurposing of existing solutions towards a risk-based security paradigm shift; (iii) improvement of passenger facilitation and customer service, bringing security as a real service in the airport of tomorrow; (iv) achievement of measurable throughput improvement and a whole new level of Quality of Service; and (v) validation of the results through advanced "in-vitro" simulation and "in-vivo" pilots. On the technical side, FLYSEC achieves its ambitious goals by integrating new technologies on video surveillance, intelligent remote image processing and biometrics combined with big data analysis, open-source intelligence and crowdsourcing. Repurposing existing technologies is also among the FLYSEC objectives, such as mobile application technologies for improved passenger experience and positive boarding applications (i.e. services to facilitate boarding and landside/airside way finding) as well as RFID for carry-on luggage tracking and quick unattended luggage handling. In this paper, the authors describe the risk-based airport security management system which powers FLYSEC intelligence and serves as the backend on top of which FLYSEC's front-end technologies reside for security services management, behaviour and risk analysis.

  18. Surgical skills simulation in trauma and orthopaedic training.

    PubMed

    Stirling, Euan R B; Lewis, Thomas L; Ferran, Nicholas A

    2014-12-19

    Changing patterns of health care delivery and the rapid evolution of orthopaedic surgical techniques have made it increasingly difficult for trainees to develop expertise in their craft. Working hour restrictions and a drive towards senior-led care demand that proficiency be gained in a shorter period of time whilst requiring a greater skill set than in the past. The resulting conflict between service provision and training has necessitated the development of alternative methods in order to compensate for the reduction in 'hands-on' experience. Simulation training provides the opportunity to develop surgical skills in a controlled environment whilst minimising risks to patient safety, operating theatre usage and financial expenditure. Many options for simulation exist within orthopaedics, from cadaveric or prosthetic models, to arthroscopic simulators, to advanced virtual reality and three-dimensional software tools. There are limitations to this form of training, but it has significant potential for trainees to achieve competence in procedures prior to real-life practice. The evidence for its direct transferability to operating theatre performance is limited, but there are clear benefits such as increasing trainee confidence and familiarity with equipment. With progressively improving methods of simulation available, it is likely to become more important in the ongoing and future training and assessment of orthopaedic surgeons.

  19. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
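
    A minimal sketch of the simulation idea in Python (the paper provides an R script; this is not it): for a Weibull baseline cumulative hazard defined on total time, the next event time is obtained by inverting the conditional survival function given the total time of the preceding event. The covariate effect and frailty are illustrative assumptions, and risk-free intervals are omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate_subject(beta_x, x, frailty_sd, shape, scale, follow_up):
            """Recurrent event times for one subject; hazard defined on time since study start."""
            eta = beta_x * x + rng.normal(0.0, frailty_sd)   # fixed effect + normal frailty
            risk = np.exp(eta)
            cum = lambda t: (t / scale) ** shape             # Weibull baseline cumulative hazard
            inv = lambda H: scale * H ** (1.0 / shape)
            events, t = [], 0.0
            while True:
                u = rng.uniform()
                t = inv(cum(t) - np.log(u) / risk)           # next event on the total time scale
                if t > follow_up:
                    return events
                events.append(t)

        sims = [simulate_subject(beta_x=0.5, x=rng.integers(2), frailty_sd=0.3,
                                 shape=1.3, scale=2.0, follow_up=3.0)
                for _ in range(5000)]
        print("mean events per subject over 3 years: %.2f" % np.mean([len(e) for e in sims]))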

  20. Estimating associations of mobile phone use and brain tumours taking into account laterality: a comparison and theoretical evaluation of applied methods.

    PubMed

    Frederiksen, Kirsten; Deltour, Isabelle; Schüz, Joachim

    2012-12-10

    Estimating exposure-outcome associations using laterality information on both exposure and outcome is an issue when estimating associations between mobile phone use and brain tumour risk. The exposure is localized; therefore, a potential risk is expected to exist primarily on the side of the head where the phone is usually held (ipsilateral exposure), and to a lesser extent on the opposite side of the head (contralateral exposure). Several measures of the associations with ipsilateral and contralateral exposure, dealing with different sampling designs, have been presented in the literature. This paper presents a general framework for the analysis of such studies using a likelihood-based approach in a competing risks model setting. The approach clarifies the implicit assumptions required for the validity of the presented estimators, particularly that in some approaches the risk with contralateral exposure is assumed to be zero. The performance of the estimators is illustrated in a simulation study showing, for instance, that while in some scenarios there is a loss of statistical power, others - in the case of a positive ipsilateral exposure-outcome association - would result in a negatively biased estimate of the contralateral exposure parameter, irrespective of any additional recall bias. In conclusion, our theoretical evaluations and results from the simulation study emphasize the importance of setting up a formal model, which furthermore allows for estimation in more complicated and perhaps more realistic exposure settings, such as taking into account exposure to both sides of the head. Copyright © 2012 John Wiley & Sons, Ltd.

  1. Productivity improvement using discrete events simulation

    NASA Astrophysics Data System (ADS)

    Hazza, M. H. F. Al; Elbishari, E. M. Y.; Ismail, M. Y. Bin; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul

    2018-01-01

    The increasing complexity of manufacturing systems has increased the cost of investment in many industries, and theoretical feasibility studies alone are not enough to support investment decisions in a particular area. The development of new advanced software therefore protects manufacturers from investing in production lines that may not meet their requirements in terms of machine utilization and productivity. Conducting a simulation with an accurate model reduces or eliminates the risk associated with a new investment. The aim of this research is to demonstrate and highlight the importance of simulation in the decision-making process. Delmia Quest software was used to simulate the production line. A simulation was first run for the existing production line and showed an estimated production rate of 261 units/day. The results were analysed based on utilization percentage and idle time. Two different scenarios were then proposed based on different objectives. The first scenario focuses on low-utilization machines and their idle time; it reduced the number of machines used by three, together with the workers who maintain them, without affecting the production rate. The second scenario increases the production rate by upgrading the curing machine, which led to a 7% increase in daily productivity, from 261 units to 281 units.
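
    To make the kind of analysis described above concrete, the sketch below models a short serial production line as a discrete-event simulation in Python with SimPy, reporting shift throughput and per-machine utilisation. The machine set, cycle times and arrival rate are illustrative assumptions and do not reproduce the Delmia Quest model from the study.

        import simpy

        CYCLE = {"cutting": 1.6, "sewing": 2.0, "curing": 2.4}   # minutes per unit (assumed)
        SHIFT = 8 * 60                                           # one shift, minutes

        def unit(env, machines, busy, done):
            for name in ("cutting", "sewing", "curing"):
                with machines[name].request() as req:
                    yield req
                    start = env.now
                    yield env.timeout(CYCLE[name])
                    busy[name] += env.now - start
            done.append(env.now)

        def source(env, machines, busy, done):
            while True:
                env.process(unit(env, machines, busy, done))
                yield env.timeout(1.5)                           # raw material arrivals

        env = simpy.Environment()
        machines = {name: simpy.Resource(env, capacity=1) for name in CYCLE}
        busy, done = {name: 0.0 for name in CYCLE}, []
        env.process(source(env, machines, busy, done))
        env.run(until=SHIFT)

        print("units completed per shift:", len(done))
        for name in CYCLE:
            print("%-8s utilisation: %.0f%%" % (name, 100 * busy[name] / SHIFT))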

  2. Using virtual reality technology and hand tracking technology to create software for training surgical skills in 3D game

    NASA Astrophysics Data System (ADS)

    Zakirova, A. A.; Ganiev, B. A.; Mullin, R. I.

    2015-11-01

    The lack of accessible and approachable ways of training surgical skills is one of the main problems in medical education. Existing simulation training devices are not designed to teach students and are not widely available due to the high cost of the equipment. Using modern technologies such as virtual reality and hand-tracking technology, we want to create an innovative method of learning operative techniques in a 3D game format, which can make the educational process engaging and effective. Creating a 3D virtual simulator will solve several conceptual problems at once: practical skills can be improved without time limits and without risk to the patient; the operating room environment and anatomical body structures can be rendered with high realism; game mechanics ease information perception and accelerate the memorization of methods; and the program remains accessible.

  3. From individual to population level effects of toxicants in the tubificid Branchiura sowerbyi using threshold effect models in a Bayesian framework.

    PubMed

    Ducrot, Virginie; Billoir, Elise; Péry, Alexandre R R; Garric, Jeanne; Charles, Sandrine

    2010-05-01

    Effects of zinc were studied in the freshwater worm Branchiura sowerbyi using partial and full life-cycle tests. Only newborn and juveniles were sensitive to zinc, displaying effects on survival, growth, and age at first brood at environmentally relevant concentrations. Threshold effect models were proposed to assess toxic effects on individuals. They were fitted to life-cycle test data using Bayesian inference and adequately described life-history trait data in exposed organisms. The daily asymptotic growth rate of theoretical populations was then simulated with a matrix population model, based upon individual-level outputs. Population-level outputs were in accordance with existing literature for controls. Working in a Bayesian framework allowed incorporating parameter uncertainty in the simulation of the population-level response to zinc exposure, thus increasing the relevance of test results in the context of ecological risk assessment.
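
    A minimal sketch of the population-level step, with an assumed two-stage life cycle and illustrative vital rates rather than the fitted Branchiura sowerbyi parameters: individual-level survival, maturation and fecundity are assembled into a projection matrix whose dominant eigenvalue is the asymptotic daily growth rate.

        import numpy as np

        def growth_rate(juv_survival, adult_survival, maturation, fecundity):
            # Two stages (juvenile, adult); daily projection matrix.
            A = np.array([[juv_survival * (1 - maturation), fecundity],
                          [juv_survival * maturation,       adult_survival]])
            return np.max(np.abs(np.linalg.eigvals(A)))

        control = growth_rate(0.98, 0.995, 0.020, 0.15)
        exposed = growth_rate(0.93, 0.995, 0.012, 0.15)   # assumed zinc effects on juveniles only
        print("daily lambda, control: %.4f  exposed: %.4f" % (control, exposed))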

  4. Fuel-injector/air-swirl characterization

    NASA Technical Reports Server (NTRS)

    Mcvey, J. B.; Kennedy, J. B.; Bennett, J. C.

    1985-01-01

    The objectives of this program are to establish an experimental data base documenting the behavior of gas turbine engine fuel injector sprays as the spray interacts with the swirling gas flow existing in the combustor dome, and to conduct an assessment of the validity of current analytical techniques for predicting fuel spray behavior. Emphasis is placed on the acquisition of data using injector/swirler components which closely resemble components currently in use in advanced aircraft gas turbine engines, conducting tests under conditions that closely simulate or closely approximate those developed in actual combustors, and conducting a well-controlled experimental effort which combines low-risk experiments with experiments requiring the use of state-of-the-art diagnostic instrumentation. Analysis of the data is to be conducted using an existing, TEACH-type code which employs a stochastic analysis of the motion of the dispersed phase in the turbulent continuum flow field.

  5. The utilisation of engineered invert traps in the management of near bed solids in sewer networks.

    PubMed

    Ashley, R M; Tait, S J; Stovin, V R; Burrows, R; Framer, A; Buxton, A P; Blackwood, D J; Saul, A J; Blanksby, J R

    2003-01-01

    Large existing sewers are considerable assets which wastewater utilities will need to operate for the foreseeable future to maintain health and the quality of life in cities. Despite their existence for more than a century, there is surprisingly little guidance available on managing these systems to minimise problems associated with in-sewer solids. A joint study has been undertaken in the UK to refine and utilise new knowledge gained from field data, laboratory results and Computational Fluid Dynamics (CFD) simulations to devise cost-beneficial engineering tools for the application of small invert traps that localise the deposition of sediments in sewers at accessible points for collection. New guidance has been produced for trap siting, and this has been linked to a risk-cost-effectiveness assessment procedure to enable system operators to approach in-sewer sediment management proactively rather than reactively as currently happens.

  6. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and ideally validation of the model's predictive properties. The uncertainty inherent in any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than without this additional information.
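
    A minimal sketch of one of the model types mentioned above: a two-state Markov cohort model with a probabilistic sensitivity analysis summarised as a cost-effectiveness acceptability curve. All transition probabilities, costs and utilities are illustrative assumptions, not values from any published ophthalmological model.

        import numpy as np

        rng = np.random.default_rng(6)
        cycles, n_psa = 20, 2000
        wtp_grid = np.arange(0, 100_001, 10_000)           # willingness-to-pay per QALY

        def run_arm(p_progress, cost_cycle, u_stable=0.85, u_progressed=0.5):
            stable = 1.0                                    # proportion of cohort still stable
            cost = qaly = 0.0
            for _ in range(cycles):
                qaly += stable * u_stable + (1 - stable) * u_progressed
                cost += cost_cycle
                stable *= (1 - p_progress)
            return cost, qaly

        inb = np.empty((n_psa, len(wtp_grid)))
        for i in range(n_psa):
            p_std = rng.beta(20, 80)                        # progression risk, standard care
            rr = rng.lognormal(np.log(0.7), 0.15)           # treatment effect (relative risk)
            c_std, q_std = run_arm(p_std, cost_cycle=500)
            c_new, q_new = run_arm(min(1.0, p_std * rr), cost_cycle=1500)
            inb[i] = wtp_grid * (q_new - q_std) - (c_new - c_std)   # incremental net benefit

        ceac = (inb > 0).mean(axis=0)                       # P(new option is cost-effective)
        for w, p in zip(wtp_grid, ceac):
            print("WTP %6d: P(cost-effective) = %.2f" % (w, p))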

  7. Particle swarm optimization based space debris surveillance network scheduling

    NASA Astrophysics Data System (ADS)

    Jiang, Hai; Liu, Jing; Cheng, Hao-Wen; Zhang, Yao

    2017-02-01

    The growing amount of space debris has created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, surveillance tasks should be scheduled optimally across the existing facilities, allocating resources in a manner that most significantly improves the ability to predict and detect events involving affected spacecraft. This paper analyzes two criteria that mainly affect the performance of a scheduling scheme and introduces an artificial intelligence algorithm into the task scheduling of the space debris surveillance network. A new scheduling algorithm based on particle swarm optimization is proposed, which can be implemented in two different ways: individual optimization and joint optimization. Numerical experiments with multiple facilities and objects are conducted based on the proposed algorithm, and simulation results demonstrate its effectiveness.
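
    The particle swarm optimizer underlying the proposed scheduler can be sketched generically as below. This is a minimal continuous PSO in which a toy objective stands in for the real coverage/priority score of a surveillance schedule; the swarm size, coefficients and objective are assumptions, not values from the paper.

        import numpy as np

        def pso_minimize(objective, dim, n_particles=30, iters=200,
                         w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
            """Minimal particle swarm optimizer for a continuous objective."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
            v = np.zeros_like(x)                               # velocities
            pbest = x.copy()
            pbest_val = np.apply_along_axis(objective, 1, x)
            gbest = pbest[np.argmin(pbest_val)].copy()
            gbest_val = pbest_val.min()
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lo, hi)
                vals = np.apply_along_axis(objective, 1, x)
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], vals[improved]
                if vals.min() < gbest_val:
                    gbest, gbest_val = x[vals.argmin()].copy(), vals.min()
            return gbest, gbest_val

        # Toy "scheduling" objective: a sphere function stands in for the real
        # score of a candidate surveillance schedule encoded as a vector.
        best, score = pso_minimize(lambda z: float(np.sum(z**2)), dim=10)
        print(score)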

  8. Economic Evaluation of a Home-Based Age-Related Macular Degeneration Monitoring System.

    PubMed

    Wittenborn, John S; Clemons, Traci; Regillo, Carl; Rayess, Nadim; Liffmann Kruger, Danielle; Rein, David

    2017-05-01

    Medicare recently approved coverage of home telemonitoring for early detection of incident choroidal neovascularization (CNV) among patients with age-related macular degeneration (AMD), but no economic evaluation has yet assessed its cost-effectiveness and budgetary impact. To evaluate a home-based daily visual-field monitoring system using simulation methods and to apply the findings of the Home Monitoring of the Eye study to the US population at high risk for wet-form AMD. In this economic analysis, an evaluation of the potential cost, cost-effectiveness, and government budgetary impact of adoption of a home-based daily visual-field monitoring system among eligible Medicare patients was performed. Effectiveness and visual outcomes data from the Age-Related Eye Disease Study 2 Home Monitoring of the Eye study, treatment data from the Wills Eye Hospital Treat & Extend study, and AMD progression data from the Age-Related Eye Disease Study 1 were used to simulate the long-term effects of telemonitoring patients with CNV in one eye or large drusen and/or pigment abnormalities in both eyes. Univariate and probabilistic sensitivity analysis and an alternative scenario using the Treat & Extend study control group outcomes were used to examine uncertainty in these data and assumptions. Home telemonitoring of patients with AMD for early detection of CNV vs usual care. Incremental cost-effectiveness ratio, net present value of lifetime societal costs, and 10-year nominal government expenditures. Telemonitoring of patients with existing unilateral CNV or multiple bilateral risk factors for CNV (large drusen and retinal pigment abnormalities) incurs $907 (95% CI, -$6302 to $2809) in net lifetime societal costs, costs $1312 (95% CI, $222-$2848) per patient during 10 years from the federal government's perspective, and results in an incremental cost-effectiveness ratio of $35 663 (95% CI, cost savings to $235 613) per quality-adjusted life-year gained. Home telemonitoring of patients with AMD who are at risk for CNV was cost-effective compared with scheduled examinations alone. Monitoring patients with existing CNV in one eye is cost saving, but monitoring is generally not cost-effective among patients with low risk of CNV, including those with no or few risk factors. With Medicare coverage, monitoring incurs budgetary expenditures for the government but is cost-saving for patients at high risk of AMD. Monitoring could be cost saving to society if monitoring reduced the frequency of scheduled examinations or led to a reduction of one or more injections of ranibizumab.
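
    The study's headline metric, the incremental cost-effectiveness ratio (ICER), is the incremental cost divided by the incremental quality-adjusted life-years; the snippet below illustrates the arithmetic with made-up numbers only (the published point estimate is $35 663 per QALY gained).

        def icer(cost_new, qaly_new, cost_old, qaly_old):
            """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
            d_cost = cost_new - cost_old
            d_qaly = qaly_new - qaly_old
            if d_qaly <= 0:
                raise ValueError("new strategy must gain QALYs for an ICER to be meaningful")
            return "dominant (cost saving)" if d_cost <= 0 else d_cost / d_qaly

        # Hypothetical inputs for telemonitoring vs usual care (not the study's data).
        print(icer(cost_new=52_000, qaly_new=8.25, cost_old=50_000, qaly_old=8.19))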

  9. Projected Risk of Flooding Disaster over China in 21st Century Based on CMIP5 Models

    NASA Astrophysics Data System (ADS)

    Li, Rouke; Xu, Ying

    2016-04-01

    Based on simulations from CMIP5 models, climate indices that correlate strongly with historical disaster data were combined with terrain elevation and socio-economic data to project flooding hazard, the vulnerability of exposed populations and assets, and the resulting flooding disaster risk for the near term (2015-2039), medium term (2045-2069) and long term (2075-2099) under RCP8.5. Following the IPCC AR5 WGII framework, the disaster risk evaluation model R=E*H*V was used, where R denotes the disaster risk index and H, E and V denote hazard, exposure and vulnerability, respectively. The results show that extreme flooding disaster risk will gradually increase across the projection periods, and that regions with a high level of flooding risk are mainly located in southeastern and eastern China. Under the RCP8.5 greenhouse gas emissions scenario, high future flooding risk mainly appears in the eastern part of Sichuan, most of North China, and much of East China. Compared with the baseline period, the area affected by floods changes little over the 21st century, but regional risk intensifies toward the end of the century. Owing to the coarse resolution of the climate models and the methodology for determining weight coefficients, large uncertainty still exists in the projection of the flooding disaster risk.
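
    The risk formula can be evaluated on a grid in a few lines; in the sketch below each component is first rescaled to [0, 1] before multiplication. The normalization choice and all input values are illustrative assumptions, not the study's weighting scheme.

        import numpy as np

        def disaster_risk_index(hazard, exposure, vulnerability):
            """Gridded disaster risk index R = H * E * V, with each factor first
            normalized to [0, 1] so the components are comparable."""
            def norm(a):
                a = np.asarray(a, dtype=float)
                rng = a.max() - a.min()
                return (a - a.min()) / rng if rng > 0 else np.zeros_like(a)
            return norm(hazard) * norm(exposure) * norm(vulnerability)

        # Toy 2x2 grid: hazard from a climate index, exposure from population,
        # vulnerability from terrain/socio-economic data (all hypothetical values).
        H = [[0.2, 0.8], [0.5, 0.9]]
        E = [[100, 4000], [1200, 2500]]
        V = [[0.3, 0.6], [0.4, 0.7]]
        print(disaster_risk_index(H, E, V))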

  10. Potential effects of existing and proposed groundwater withdrawals on water levels and natural groundwater discharge in Snake Valley, Juab and Millard Counties, Utah, White Pine County, Nevada, and surrounding areas in Utah and Nevada

    USGS Publications Warehouse

    Masbruch, Melissa D.; Gardner, Philip M.

    2014-01-01

    Applications have been filed for several water-right changes and new water rights, with total withdrawals of about 1,800 acre-feet per year, in Snake Valley near Eskdale and Partoun, Utah. The Bureau of Land Management has identified 11 sites where the Bureau of Land Management holds water rights and 7 other springs of interest that could be affected by these proposed groundwater withdrawals. This report presents a hydrogeologic analysis of areas within Snake Valley to assess the potential effects on Bureau of Land Management water rights and other springs of interest resulting from existing and proposed groundwater withdrawals. A previously developed numerical groundwater-flow model was used to quantify potential groundwater drawdown and the capture, or groundwater withdrawals that results in depletion, of natural discharge resulting from existing and proposed groundwater withdrawals within Snake Valley. Existing groundwater withdrawals were simulated for a 50-year period prior to adding the newly proposed withdrawals to bring the model from pre-development conditions to the start of 2014. After this initial 50-year period, existing withdrawals, additional proposed withdrawals, and consequent effects were simulated for periods of 5, 10, 25, 50, and 100 years. Downward trends in water levels measured in wells indicate that the existing groundwater withdrawals in Snake Valley are affecting water levels. The numerical model simulated similar downward trends in water levels. The largest simulated drawdowns caused by existing groundwater withdrawals ranged between 10 and 26 feet and were near the centers of the agricultural areas by Callao, Eskdale, Baker, Garrison, and along the Utah-Nevada state line in southern Snake Valley. The largest simulated water-level declines were at the Bureau of Land Management water-rights sites near Eskdale, Utah, where simulated drawdowns ranged between 2 and 8 feet at the start of 2014. These results were consistent with, but lower than, observations from several wells monitored by the U.S. Geological Survey that indicated water-level declines of 6 to 18 feet near the Eskdale area since the mid-1970s and 1980s. The model cells where the simulated capture of natural groundwater discharge resulting from the existing withdrawals was greatest were those containing Kane Spring, Caine Spring, and Unnamed Spring 5, where existing groundwater withdrawals capture 13 to 29 percent of the total simulated natural discharge in these cells. Simulated drawdown and simulated capture of natural groundwater discharge resulting from the proposed withdrawals started in as few as 5 years at seven of the sites. After 100 years, four sites showed simulated drawdowns ranging between 1 and 2 feet; eight sites showed simulated drawdowns ranging between 0.1 and 0.9 feet; and five sites showed no simulated drawdown resulting from the proposed withdrawals. The largest amounts of simulated capture of natural groundwater discharge resulting from the proposed withdrawals after 100 years were in the model cells containing Coyote Spring, Kane Spring, and Caine Spring, which had capture amounts ranging between 5.5 and 9.1 percent of the total simulated natural discharge in these cells.

  11. Evaluating Approaches to a Coupled Model for Arctic Coastal Erosion, Infrastructure Risk, and Associated Coastal Hazards

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Bull, D. L.; Jones, C.; Roberts, J.; Thomas, M. A.

    2016-12-01

    Arctic coastlines are receding at accelerated rates, putting existing and future activities in the developing coastal Arctic environment at extreme risk. For example, at Oliktok Long Range Radar Site, erosion that was not expected until 2040 was reached as of 2014 (Alaska Public Media). As the Arctic Ocean becomes increasingly ice-free, rates of coastal erosion will likely continue to increase as (a) increased ice-free waters generate larger waves, (b) sea levels rise, and (c) coastal permafrost soils warm and lose strength/cohesion. Due to the complex and rapidly varying nature of the Arctic region, little is known about the increasing waves, changing circulation, permafrost soil degradation, and the response of the coastline to changes in these combined conditions. However, as scientific focus has been shifting towards the polar regions, Arctic science is rapidly advancing, increasing our understanding of complex Arctic processes. Our present understanding allows us to begin to develop and evaluate the coupled models necessary for the prediction of coastal erosion in support of Arctic risk assessments. What are the best steps towards the development of a coupled model for Arctic coastal erosion? This work focuses on our current understanding of Arctic conditions and identifying the tools and methods required to develop an integrated framework capable of accurately predicting Arctic coastline erosion and assessing coastal risk and hazards. We will present a summary of the state-of-the-science, and identify existing tools and methods required to develop an integrated diagnostic and monitoring framework capable of accurately predicting and assessing Arctic coastline erosion, infrastructure risk, and coastal hazards. The summary will describe the key coastal processes to simulate, appropriate models to use, effective methods to couple existing models, and identify gaps in knowledge that require further attention to make progress in our understanding of Arctic coastal erosion. * Co-authors listed in alphabetical order. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Reducing the Risks of Military Aircrew Training through Simulation Technology.

    ERIC Educational Resources Information Center

    Farrow, Douglas R.

    1982-01-01

    This discussion of the types of risks associated with military aircrew training and the varieties of training devices and techniques currently utilized to minimize those risks includes an examination of flight trainer simulators and complex mission simulators for coping with military aviation hazards. Four references are listed. (Author/MER)

  13. Modification of Obstetric Emergency Simulation Scenarios for Realism in a Home-Birth Setting.

    PubMed

    Komorowski, Janelle; Andrighetti, Tia; Benton, Melissa

    2017-01-01

    Clinical competency and clear communication are essential for intrapartum care providers who encounter high-stakes, low-frequency emergencies. The challenge for these providers is to maintain infrequently used skills. The challenge is even more significant for midwives who manage births at home and who, due to low practice volume and low-risk clientele, may rarely encounter an emergency. In addition, access to team simulation may be limited for home-birth midwives. This project modified existing validated obstetric simulation scenarios for a home-birth setting. Twelve certified professional midwives (CPMs) in active home-birth practice participated in shoulder dystocia and postpartum hemorrhage simulations. The simulations were staged to resemble home-birth settings, supplies, and personnel. Fidelity (realism) of the simulations was assessed with the Simulation Design Scale, and satisfaction and self-confidence were assessed with the Student Satisfaction and Self-Confidence in Learning Scale. Both utilized a 5-point Likert scale, with higher scores suggesting greater levels of fidelity, participant satisfaction, and self-confidence. Simulation Design Scale scores indicated participants agreed fidelity was achieved for the home-birth setting, while scores on the Student Satisfaction and Self-Confidence in Learning indicated high levels of participant satisfaction and self-confidence. If offered without modification, simulation scenarios designed for use in hospitals may lose fidelity for home-birth midwives, particularly in the environmental and psychological components. Simulation is standard of care in most settings, an excellent vehicle for maintaining skills, and some evidence suggests it results in improved perinatal outcomes. Additional study is needed in this area to support home-birth providers in maintaining skills. This pilot study suggests that simulation scenarios intended for hospital use can be successfully adapted to the home-birth setting. © 2016 by the American College of Nurse-Midwives.

  14. From plot to regional scales: Effect of land use and soil type on soil erosion in the southern Amazon

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Schultze, Nico; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2015-04-01

    The corridor along the Brazilian Highway 163 in the Southern Amazon is affected by radical changes in land use patterns. In order to enable a model-based assessment of erosion risks on different land use and soil types, a transportable disc-type rainfall simulator is applied to identify the most important infiltration and erosion parameters of the EROSION 3D model. Since particle detachment depends strongly on experimental plot length, a combined runoff supply is used to virtually extend the plot length to more than 20 m. Simulations were conducted on the most common regional land use, soil management and soil types for dry and wet runs. The experiments are characterized by high final infiltration rates (0.3-2.5 mm*min^-1), low sediment concentrations (0.2-6.5 g*L^-1) and accordingly low soil loss rates (0.002-50 kg*m^-2), strongly related to land use, applied management and soil type. Ploughed pastures and clear cuts show the highest soil losses, whereas croplands are less affected. Due to higher aggregate stabilities, Ferralsols are less endangered than Acrisols. Derived model parameters are plausible, comparable to existing databases, and reproduce the effects of land use and soil management on soil loss. Thus it is possible to apply the EROSION 3D soil loss model in Southern Amazonia for erosion risk assessment and scenario simulation under changing climate and land use conditions.

  15. An early warning system for marine storm hazard mitigation

    NASA Astrophysics Data System (ADS)

    Vousdoukas, M. I.; Almeida, L. P.; Pacheco, A.; Ferreira, O.

    2012-04-01

    This contribution describes efforts towards the development of an operational Early Warning System for storm hazard prediction and mitigation. The system consists of a nested model train of specially calibrated Wave Watch III, SWAN and XBeach models. The numerical simulations provide daily forecasts of the hydrodynamic conditions, morphological change and overtopping risk at the area of interest. The model predictions are processed by a 'translation' module based on site-specific Storm Impact Indicators (SIIs) (Ciavola et al., 2011, Storm impacts along European coastlines. Part 2: lessons learned from the MICORE project, Environmental Science & Policy, Vol 14), and warnings are issued when pre-defined threshold values are exceeded. For the present site the selected SIIs were (i) the maximum wave run-up height during the simulations; and (ii) the dune-foot horizontal retreat at the end of the simulations. Both SIIs and pre-defined thresholds were carefully selected on the grounds of existing experience and field data. Four risk levels were considered, each associated with an intervention approach recommended to the responsible coastal protection authority. Regular updating of the topography/bathymetry is critical for the performance of the storm impact forecasting, especially when there are significant morphological changes. The system can be extended to other critical problems, such as the implications of global warming and adaptive management strategies, while the approach presently followed, from model calibration to the early warning system for storm hazard mitigation, can be applied to other sites worldwide with minor adaptations.
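
    The 'translation' step from a forecast SII to one of the four risk levels is essentially a threshold lookup, as the sketch below illustrates; the threshold values and level names are placeholders, since the real ones are site specific and calibrated from field data.

        # Hypothetical SII thresholds (metres); the real values are site specific.
        RUNUP_THRESHOLDS = [(1.0, "low"), (1.5, "moderate"), (2.0, "high"), (float("inf"), "severe")]

        def warning_level(max_runup_m):
            """Map a forecast Storm Impact Indicator to one of four risk levels."""
            for threshold, level in RUNUP_THRESHOLDS:
                if max_runup_m <= threshold:
                    return level

        print(warning_level(1.7))  # -> "high"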

  16. Forecasting the Risks of Pollution from Ships along the Portuguese Coast

    NASA Astrophysics Data System (ADS)

    Fernandes, Rodrigo; Neves, Ramiro; Lourenço, Filipe; Braunschweig, Frank

    2013-04-01

    Pollution risk assessments in coastal and marine environments are generally based on a static approach, considering historical data, reference situations, and typical scenarios. This approach is quite important at the planning stage. However, the recent deployment of several real-time monitoring tools, together with faster generation of numerical forecasts of metocean properties and of the trajectories of pollutants spilt at sea or in coastal zones, makes an alternative approach possible. These developments open the possibility of building an integrated support system for better decision-making in emergency or planning issues associated with pollution risks. An innovative methodology to dynamically produce quantified risks in real time, integrating the best available information from numerical forecasts and existing monitoring tools, has been developed and applied to the Portuguese coast. The developed system provides coastal pollution risk levels associated with potential (or real) oil spill incidents from ship collision, grounding or foundering, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes, real-time vessel information (positioning, cargo type, speed and vessel type) obtained from AIS, the best available metocean numerical forecasts (hydrodynamics, meteorology - including visibility - and wave conditions) and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System. Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on variable vessel information and metocean conditions. Results from these simulations are used to quantify the consequences of potential spills. All historical information is continuously stored in a database (for risk analysis at a later stage). This dynamic approach improves the accuracy of the quantification of consequences to the shoreline, as well as the decision support model, allowing a more effective prioritization of individual ships and geographical areas. The system was initially implemented in Portugal for oil spills. Implementation in other Atlantic regions (starting with the Galician coast, Spain), as well as other relevant updates, is being carried out within the scope of the ARCOPOL+ project (2011-1/150). The system is being adapted to include risk modelling of chemical spills, as well as fire and explosion accidents and operational illegal discharges. The integration of EMSA's THETIS "ship risk profile" (according to Annex 7 of the Paris Memorandum of Understanding) into the risk model is also being tested. Finally, a new component is being developed to compute the risk for specific time periods, taking advantage of the information previously stored in the database on vessel positioning and/or the results of numerical models. This component provides a support tool for detailed characterization of risk profiles in certain periods or for a sensitivity analysis of different parameters.

  17. How primary care physicians' attitudes toward risk and uncertainty affect their use of electronic information resources.

    PubMed

    McKibbon, K Ann; Fridsma, Douglas B; Crowley, Rebecca S

    2007-04-01

    The research sought to determine if primary care physicians' attitudes toward risk taking or uncertainty affected how they sought information and used electronic information resources when answering simulated clinical questions. Using physician-supplied data collected from existing risk and uncertainty scales, twenty-five physicians were classified as risk seekers (e.g., enjoying adventure), risk neutral, or risk avoiders (e.g., cautious) and stressed or unstressed by uncertainty. The physicians then answered twenty-three multiple-choice, clinically focused questions and selected two to pursue further using their own information resources. Think-aloud protocols were used to collect searching process and outcome data (e.g., searching time, correctness of answers, searching techniques). No differences in searching outcomes were observed between the groups. Physicians who were risk avoiding and those who reported stress when faced with uncertainty each showed differences in searching processes (e.g., actively analyzing retrieval, using searching heuristics or rules). Physicians who were risk avoiding tended to use resources that provided answers and summaries, such as Cochrane or UpToDate, less than risk-seekers did. Physicians who reported stress when faced with uncertainty showed a trend toward less frequent use of MEDLINE, when compared with physicians who were not stressed by uncertainty. Physicians' attitudes towards risk taking and uncertainty were associated with different searching processes but not outcomes. Awareness of differences in physician attitudes may be key in successful design and implementation of clinical information resources.

  18. A simulation model of IT risk on program trading

    NASA Astrophysics Data System (ADS)

    Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan

    2015-12-01

    The biggest difficulty in measuring the IT risk of program trading is the lack of loss data. In view of this, the current approach taken by scholars is to collect reports of IT incidents from courts, the internet and other public media, both domestic and international, and to base quantitative analysis of IT risk losses on such a database. However, an IT risk loss database built in this way can only approximate the real situation and cannot explain it at a fundamental level. In this paper, building on the concepts and steps of Monte Carlo (MC) simulation, we apply the MC method within the "Program trading simulation system" developed by our team to simulate real program trading and obtain IT risk loss data through IT failure experiments; the validity of the experimental data is verified at the end of the article. This approach overcomes the deficiencies of the traditional research method and addresses the lack of IT risk data for quantitative research, and it provides researchers with a template of simulation-based ideas and procedures for empirical study.
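
    As a hedged illustration of the Monte Carlo idea described above, the sketch below draws IT failures at random for a stream of simulated trades and aggregates the resulting losses into an empirical distribution; the failure probability and loss scale are invented parameters, not figures from the paper's simulation system.

        import random

        def simulate_it_loss(n_trades=10_000, p_failure=0.001, mean_loss=50_000.0, seed=1):
            """Monte Carlo sketch: each simulated trade may hit an IT failure with
            probability p_failure; failures draw an exponential loss. Returns total loss."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n_trades):
                if rng.random() < p_failure:
                    total += rng.expovariate(1.0 / mean_loss)
            return total

        # Repeat the experiment many times to build an empirical loss distribution.
        losses = sorted(simulate_it_loss(seed=s) for s in range(1000))
        print("median loss:", losses[500], "95th percentile:", losses[949])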

  19. Evaluating common de-identification heuristics for personal health information.

    PubMed

    El Emam, Khaled; Jabbouri, Sam; Sams, Scott; Drouet, Youenn; Power, Michael

    2006-11-21

    With the growing adoption of electronic medical records, there are increasing demands for the use of this electronic clinical data in observational research. A frequent ethics board requirement for such secondary use of personal health information in observational research is that the data be de-identified. De-identification heuristics are provided in the Health Insurance Portability and Accountability Act Privacy Rule, funding agency and professional association privacy guidelines, and common practice. The aim of the study was to evaluate whether the re-identification risks due to record linkage are sufficiently low when following common de-identification heuristics and whether the risk is stable across sample sizes and data sets. Two methods were followed to construct identification data sets. Re-identification attacks were simulated on these. For each data set we varied the sample size down to 30 individuals, and for each sample size evaluated the risk of re-identification for all combinations of quasi-identifiers. The combinations of quasi-identifiers that were low risk more than 50% of the time were considered stable. The identification data sets we were able to construct were the list of all physicians and the list of all lawyers registered in Ontario, using 1% sampling fractions. The quasi-identifiers of region, gender, and year of birth were found to be low risk more than 50% of the time across both data sets. The combination of gender and region was also found to be low risk more than 50% of the time. We were not able to create an identification data set for the whole population. Existing Canadian federal and provincial privacy laws help explain why it is difficult to create an identification data set for the whole population. That such examples of high re-identification risk exist for mainstream professions makes a strong case for not disclosing the high-risk variables and their combinations identified here. For professional subpopulations with published membership lists, many variables often needed by researchers would have to be excluded or generalized to ensure consistently low re-identification risk. Data custodians and researchers need to consider other statistical disclosure techniques for protecting privacy.
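
    The core computation, evaluating re-identification risk for every combination of quasi-identifiers, can be sketched as follows; here the risk for a combination is taken as one over the size of the smallest equivalence class it induces. The toy records, the 0.2 risk threshold and this particular risk metric are assumptions for illustration, not the study's exact procedure.

        from collections import Counter
        from itertools import combinations

        def reidentification_risk(records, quasi_identifiers, max_risk=0.2):
            """For every combination of quasi-identifiers, estimate re-identification
            risk as 1 / (smallest equivalence-class size) and flag low-risk combos."""
            results = {}
            for r in range(1, len(quasi_identifiers) + 1):
                for combo in combinations(quasi_identifiers, r):
                    classes = Counter(tuple(rec[q] for q in combo) for rec in records)
                    risk = 1.0 / min(classes.values())
                    results[combo] = (risk, risk <= max_risk)
            return results

        # Toy data; the study used full professional registries, not tiny samples.
        records = [
            {"region": "East", "gender": "F", "yob": 1970},
            {"region": "East", "gender": "F", "yob": 1970},
            {"region": "East", "gender": "M", "yob": 1965},
            {"region": "West", "gender": "M", "yob": 1965},
            {"region": "West", "gender": "F", "yob": 1980},
        ]
        for combo, (risk, ok) in reidentification_risk(records, ["region", "gender", "yob"]).items():
            print(combo, f"risk={risk:.2f}", "low" if ok else "high")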

  20. Evaluating Common De-Identification Heuristics for Personal Health Information

    PubMed Central

    Jabbouri, Sam; Sams, Scott; Drouet, Youenn; Power, Michael

    2006-01-01

    Background With the growing adoption of electronic medical records, there are increasing demands for the use of this electronic clinical data in observational research. A frequent ethics board requirement for such secondary use of personal health information in observational research is that the data be de-identified. De-identification heuristics are provided in the Health Insurance Portability and Accountability Act Privacy Rule, funding agency and professional association privacy guidelines, and common practice. Objective The aim of the study was to evaluate whether the re-identification risks due to record linkage are sufficiently low when following common de-identification heuristics and whether the risk is stable across sample sizes and data sets. Methods Two methods were followed to construct identification data sets. Re-identification attacks were simulated on these. For each data set we varied the sample size down to 30 individuals, and for each sample size evaluated the risk of re-identification for all combinations of quasi-identifiers. The combinations of quasi-identifiers that were low risk more than 50% of the time were considered stable. Results The identification data sets we were able to construct were the list of all physicians and the list of all lawyers registered in Ontario, using 1% sampling fractions. The quasi-identifiers of region, gender, and year of birth were found to be low risk more than 50% of the time across both data sets. The combination of gender and region was also found to be low risk more than 50% of the time. We were not able to create an identification data set for the whole population. Conclusions Existing Canadian federal and provincial privacy laws help explain why it is difficult to create an identification data set for the whole population. That such examples of high re-identification risk exist for mainstream professions makes a strong case for not disclosing the high-risk variables and their combinations identified here. For professional subpopulations with published membership lists, many variables often needed by researchers would have to be excluded or generalized to ensure consistently low re-identification risk. Data custodians and researchers need to consider other statistical disclosure techniques for protecting privacy. PMID:17213047

  1. Adding an Intelligent Tutoring System to an Existing Training Simulation

    DTIC Science & Technology

    2006-01-01

    to apply information in a job should be the goal of training. Also, conventional IMI is not able to meaningfully incorporate use of free-play simulators...incorporating desktop free-play simulators into computer-based training since the software can stand in for a human tutor in all the roles. Existing IMI...2. ITS can integrate free-play simulators and IMI BC2010 ITS DESCRIPTION Overview Figure 3 illustrates the interaction between BC2010, ITS

  2. Modular programming for tuberculosis control, the "AuTuMN" platform.

    PubMed

    Trauer, James McCracken; Ragonnet, Romain; Doan, Tan Nhut; McBryde, Emma Sue

    2017-08-07

    Tuberculosis (TB) is now the world's leading infectious killer and major programmatic advances will be needed if we are to meet the ambitious new End TB Targets. Although mathematical models are powerful tools for TB control, such models must be flexible enough to capture the complexity and heterogeneity of the global TB epidemic. This includes simulating a disease that affects age groups and other risk groups differently, has varying levels of infectiousness depending upon the organ involved and varying outcomes from treatment depending on the drug resistance pattern of the infecting strain. We adopted sound basic principles of software engineering to develop a modular software platform for simulation of TB control interventions ("AuTuMN"). These included object-oriented programming, logical linkage between modules and consistency of code syntax and variable naming. The underlying transmission dynamic model incorporates optional stratification by age, risk group, strain and organ involvement, while our approach to simulating time-variant programmatic parameters better captures the historical progression of the epidemic. An economic model is overlaid upon this epidemiological model which facilitates comparison between new and existing technologies. A "Model runner" module allows for predictions of future disease burden trajectories under alternative scenario situations, as well as uncertainty, automatic calibration, cost-effectiveness and optimisation. The model has now been used to guide TB control strategies across a range of settings and countries, with our modular approach enabling repeated application of the tool without the need for extensive modification for each application. The modular construction of the platform minimises errors, enhances readability and collaboration between multiple programmers and enables rapid adaptation to answer questions in a broad range of contexts without the need for extensive re-programming. Such features are particularly important in simulating an epidemic as complex and diverse as TB.

  3. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
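
    The quantitative re-ranking step described above can be illustrated with a toy failure mode and effects analysis in which the coefficient of variation of simulated coating masses drives the occurrence score of a risk priority number (RPN). The mapping from CV to occurrence, and the severity and detectability scores, are invented for illustration and are not the publication's actual scales.

        import numpy as np

        def coefficient_of_variation(coating_mass):
            """Coating mass uniformity expressed as CV = std / mean (lower is better)."""
            m = np.asarray(coating_mass, dtype=float)
            return m.std(ddof=1) / m.mean()

        def risk_priority(cv_values, severity=10, detectability=5):
            """Toy FMEA-style ranking: map simulated CVs for each process-parameter
            setting to an occurrence score (1-10), then RPN = S x O x D."""
            ranking = {}
            for name, cv in cv_values.items():
                occurrence = int(np.clip(round(cv * 100), 1, 10))  # hypothetical mapping
                ranking[name] = severity * occurrence * detectability
            return dict(sorted(ranking.items(), key=lambda kv: -kv[1]))

        # Hypothetical per-tablet coating masses simulated at two parameter settings.
        cvs = {"low_spray_rate": coefficient_of_variation([9.8, 10.1, 10.0, 9.9, 10.2]),
               "high_pan_speed": coefficient_of_variation([8.7, 10.9, 9.4, 11.2, 9.8])}
        print(risk_priority(cvs))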

  4. Integrating geophysical data for mapping the contamination of industrial sites by polycyclic aromatic hydrocarbons: A geostatistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colin, P.; Nicoletis, S.; Froidevaux, R.

    1996-12-31

    A case study is presented of building a map showing the probability that the concentration of polycyclic aromatic hydrocarbons (PAH) exceeds a critical threshold. This assessment is based on existing PAH sample data (direct information) and on an electrical resistivity survey (indirect information). Simulated annealing is used to build a model of the range of possible values for PAH concentrations and of the bivariate relationship between PAH concentrations and electrical resistivity. The geostatistical technique of simple indicator kriging is then used, together with the probabilistic model, to infer, at each node of a grid, the range of possible values which the PAH concentration can take. The risk map is then extracted from this characterization of the local uncertainty. The difference between this risk map and a traditional iso-concentration map is then discussed in terms of decision-making.

  5. Falls Risk and Simulated Driving Performance in Older Adults

    PubMed Central

    Gaspar, John G.; Neider, Mark B.; Kramer, Arthur F.

    2013-01-01

    Declines in executive function and dual-task performance have been related to falls in older adults, and recent research suggests that older adults at risk for falls also show impairments on real-world tasks, such as crossing a street. The present study examined whether falls risk was associated with driving performance in a high-fidelity simulator. Participants were classified as high or low falls risk using the Physiological Profile Assessment and completed a number of challenging simulated driving assessments in which they responded quickly to unexpected events. High falls risk drivers had slower response times (~2.1 seconds) to unexpected events compared to low falls risk drivers (~1.7 seconds). Furthermore, when asked to perform a concurrent cognitive task while driving, high falls risk drivers showed greater costs to secondary task performance than did low falls risk drivers, and low falls risk older adults also outperformed high falls risk older adults on a computer-based measure of dual-task performance. Our results suggest that attentional differences between high and low falls risk older adults extend to simulated driving performance. PMID:23509627

  6. Reproduction in Risky Environments: The Role of Invasive Egg Predators in Ladybird Laying Strategies

    PubMed Central

    Paul, Sarah C.; Pell, Judith K.; Blount, Jonathan D.

    2015-01-01

    Reproductive environments are variable and the resources available for reproduction are finite. If reliable cues about the environment exist, mothers can alter offspring phenotype in a way that increases both offspring and maternal fitness (‘anticipatory maternal effects’—AMEs). Strategic use of AMEs is likely to be important in chemically defended species, where the risk of offspring predation may be modulated by maternal investment in offspring toxin level, albeit at some cost to mothers. Whether mothers adjust offspring toxin levels in response to variation in predation risk is, however, unknown, but is likely to be important when assessing the response of chemically defended species to the recent and pervasive changes in the global predator landscape, driven by the spread of invasive species. Using the chemically defended two-spot ladybird, Adalia bipunctata, we investigated reproductive investment, including egg toxin level, under conditions that varied in the degree of simulated offspring predation risk from larval harlequin ladybirds, Harmonia axyridis. H. axyridis is a highly voracious alien invasive species in the UK and a significant intraguild predator of A. bipunctata. Females laid fewer, larger egg clusters, under conditions of simulated predation risk (P+) than when predator cues were absent (P-), but there was no difference in toxin level between the two treatments. Among P- females, when mean cluster size increased there were concomitant increases in both the mass and toxin concentration of eggs, however when P+ females increased cluster size there was no corresponding increase in egg toxin level. We conclude that, in the face of offspring predation risk, females either withheld toxins or were physiologically constrained, leading to a trade-off between cluster size and egg toxin level. Our results provide the first demonstration that the risk of offspring predation by a novel invasive predator can influence maternal investment in toxins within their offspring. PMID:26488753

  7. Car accidents induced by a bottleneck

    NASA Astrophysics Data System (ADS)

    Marzoug, Rachid; Echab, Hicham; Ez-Zahraouy, Hamid

    2017-12-01

    Based on the Nagel-Schreckenberg (NS) model, we study the probability of car accidents (Pac) occurring at the entrance of the merging section of two roads (i.e. a junction). The simulation results show that the existence of non-cooperative drivers plays a major role, increasing the risk of collisions at intermediate and high densities. Moreover, the impact of the speed limit in the bottleneck (Vb) on the probability Pac is also studied. This impact depends strongly on the density: increasing Vb raises Pac at low densities, while it increases road safety at high densities. The phase diagram of the system is also constructed.
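
    For reference, the Nagel-Schreckenberg update rule on which the study is based (acceleration, braking to the gap ahead, random slowdown, movement) can be written compactly. The sketch below simulates a plain single-lane ring road and omits the paper's junction geometry, non-cooperative drivers and accident-counting rules, so all parameters are illustrative.

        import random

        def nagel_schreckenberg_step(positions, velocities, road_length, v_max=5, p_slow=0.3, rng=random):
            """One parallel update of the Nagel-Schreckenberg cellular automaton on a ring road."""
            n = len(positions)
            order = sorted(range(n), key=lambda i: positions[i])
            new_pos, new_vel = positions[:], velocities[:]
            for k, i in enumerate(order):
                ahead = order[(k + 1) % n]
                gap = (positions[ahead] - positions[i] - 1) % road_length
                v = min(velocities[i] + 1, v_max)        # rule 1: acceleration
                v = min(v, gap)                          # rule 2: braking to avoid collision
                if v > 0 and rng.random() < p_slow:      # rule 3: random slowdown
                    v -= 1
                new_vel[i] = v                           # rule 4: movement (applied in parallel)
                new_pos[i] = (positions[i] + v) % road_length
            return new_pos, new_vel

        # 20 cars on a 100-cell ring road, run for 50 steps; report the mean speed.
        pos = random.sample(range(100), 20)
        vel = [0] * 20
        for _ in range(50):
            pos, vel = nagel_schreckenberg_step(pos, vel, road_length=100)
        print(sum(vel) / len(vel))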

  8. Numerical simulation of aerobic exercise as a countermeasure in human spaceflight

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse the efficacy of long-term regular exercise on relevant cardiovascular parameters when the human body is also exposed to microgravity. Computer simulations are an important tool which may be used to predict and analyse these possible effects, and to compare them with in-flight experiments. We based our study on an electrical-like computer model (NELME: Numerical Evaluation of Long-term Microgravity Effects) which was developed in our laboratory and validated with the available data, focusing on the cardiovascular parameters affected by changes in gravity exposure. NELME is based on an electrical-like control system model of the physiological changes that are known to take place when gravity changes are applied. The computer implementation has a modular architecture. Hence, different output parameters, potential effects, organs and countermeasures can be easily implemented and evaluated. We added to the previous cardiovascular system module a perturbation module to evaluate the effect of regular exercise on the output parameters previously studied. Therefore, we simulated a well-known countermeasure with different protocols of exercising, as a pattern of input electric-like perturbations on the basic module. Different scenarios have been numerically simulated for both men and women, in different patterns of microgravity, reduced gravity and time exposure. EVAs were also simulated as perturbations to the system. Results show slight differences between genders, with more risk reduction for women than for men after following an aerobic exercise pattern during a simulated mission. The reduction in the risk of a cardiovascular malfunction is also evaluated, with a ceiling effect found in all scenarios. A turning point in vascular resistance for long-term exposure to gravity below 0.4g was found to be of particular interest. In conclusion, we show that computer simulations are a valuable tool to analyse different effects of long-term microgravity exposure on the human body. Potential countermeasures such as physical exercise can also be evaluated as an induced perturbation to the system. The results are compatible with existing data and are of value in assessing the efficacy of aerobic exercise as a countermeasure in future missions to Mars.

  9. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
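
    A heavily simplified, non-agent-based way to reproduce the two headline quantities above (end-of-chain risk and the proportion of unique trajectories) is a Monte Carlo walk over per-step violation probabilities, as sketched below; the probabilities and chain length are invented, and the real framework models interacting clinician and information-system agents rather than independent steps.

        import random

        def simulate_transfer(step_violation_probs, rng):
            """Walk through the steps of a clinical process; each step may be violated
            independently. Return the sequence of violations (the 'trajectory')."""
            return tuple(rng.random() < p for p in step_violation_probs)

        def end_of_chain_risk(step_violation_probs, n_runs=100_000, seed=0):
            """Estimate the probability that every step in the chain is violated,
            and count how many distinct trajectories the process produces."""
            rng = random.Random(seed)
            trajectories = [simulate_transfer(step_violation_probs, rng) for _ in range(n_runs)]
            risk = sum(all(t) for t in trajectories) / n_runs
            return risk, len(set(trajectories))

        # Hypothetical per-step violation probabilities for a patient-transfer chain.
        risk, unique = end_of_chain_risk([0.6, 0.5, 0.7, 0.8])
        print(f"end-of-chain risk is about {risk:.2%}, distinct trajectories observed: {unique}")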

  10. Use of Ground Motion Simulations of a Historical Earthquake for the Assessment of Past and Future Urban Risks

    NASA Astrophysics Data System (ADS)

    Kentel, E.; Çelik, A.; karimzadeh Naghshineh, S.; Askan, A.

    2017-12-01

    Erzincan city, located in the eastern part of Turkey at the junction of three active faults, is one of the most hazardous regions in the world. In addition to several historical events, this city has experienced one of the largest earthquakes of the last century: the 27 December 1939 (Ms=8.0) event. With limited knowledge of the tectonic structure at the time, the city center was relocated almost 5 km to the north after the 1939 earthquake, in fact closer to the existing major strike-slip fault. This decision, coupled with poor construction technologies, led to severe damage during a later event that occurred on 13 March 1992 (Mw=6.6). The 1939 earthquake occurred in the pre-instrumental era in the region, with no local seismograms available, whereas the 1992 event was recorded by only 3 nearby stations. There are empirical isoseismal maps from both events indicating indirectly the spatial distribution of the damage. In this study, we focus on this region and present a multidisciplinary approach to discuss the different components of uncertainty involved in the assessment and mitigation of seismic risk in urban areas. As an initial attempt, ground motion simulation of the 1939 event is performed to obtain the anticipated ground motions and shaking intensities. Using these quantified results along with the spatial distribution of the observed damage, the relocation decision is assessed and suggestions are provided for future large earthquakes to minimize potential earthquake risks.

  11. Medical mitigation model: quantifying the benefits of the public health response to a chemical terrorism attack.

    PubMed

    Good, Kevin; Winkel, David; VonNiederhausern, Michael; Hawkins, Brian; Cox, Jessica; Gooding, Rachel; Whitmire, Mark

    2013-06-01

    The Chemical Terrorism Risk Assessment (CTRA) and Chemical Infrastructure Risk Assessment (CIRA) are programs that estimate the risk of chemical terrorism attacks to help inform and improve the US defense posture against such events. One aspect of these programs is the development and advancement of a Medical Mitigation Model-a mathematical model that simulates the medical response to a chemical terrorism attack and estimates the resulting number of saved or benefited victims. At the foundation of the CTRA/CIRA Medical Mitigation Model is the concept of stock-and-flow modeling; "stocks" are states that individuals progress through during the event, while "flows" permit and govern movement from one stock to another. Using this approach, the model is able to simulate and track individual victims as they progress from exposure to an end state. Some of the considerations in the model include chemical used, type of attack, route and severity of exposure, response-related delays, detailed treatment regimens with efficacy defined as a function of time, medical system capacity, the influx of worried well individuals, and medical countermeasure availability. As will be demonstrated, the output of the CTRA/CIRA Medical Mitigation Model makes it possible to assess the effectiveness of the existing public health response system and develop and examine potential improvement strategies. Such a modeling and analysis capability can be used to inform first-responder actions/training, guide policy decisions, justify resource allocation, and direct knowledge-gap studies.
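
    Stock-and-flow modeling of the kind described can be prototyped in a few lines: stocks are victim states and flows move fractions of a stock per time step. The states, rates and time horizon below are invented placeholders, not CTRA/CIRA parameters.

        def stock_and_flow_step(stocks, flows, dt=1.0):
            """Advance a stock-and-flow model one time step.
            `flows` maps (source, destination) -> rate (fraction of source per step)."""
            change = {name: 0.0 for name in stocks}
            for (src, dst), rate in flows.items():
                moved = stocks[src] * rate * dt
                change[src] -= moved
                change[dst] += moved
            return {name: stocks[name] + change[name] for name in stocks}

        # Hypothetical victim states and flow rates (per step) after a chemical release.
        stocks = {"exposed": 1000.0, "in_treatment": 0.0, "recovered": 0.0, "fatality": 0.0}
        flows = {("exposed", "in_treatment"): 0.30,   # found and triaged
                 ("exposed", "fatality"): 0.05,       # untreated deterioration
                 ("in_treatment", "recovered"): 0.40, # treatment efficacy
                 ("in_treatment", "fatality"): 0.02}
        for _ in range(24):
            stocks = stock_and_flow_step(stocks, flows)
        print({k: round(v) for k, v in stocks.items()})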

  12. Assessment and application of national environmental databases and mapping tools at the local level to two community case studies.

    PubMed

    Hammond, Davyda; Conlon, Kathryn; Barzyk, Timothy; Chahine, Teresa; Zartarian, Valerie; Schultz, Brad

    2011-03-01

    Communities are concerned over pollution levels and seek methods to systematically identify and prioritize the environmental stressors in their communities. Geographic information system (GIS) maps of environmental information can be useful tools for communities in their assessment of environmental-pollution-related risks. Databases and mapping tools that supply community-level estimates of ambient concentrations of hazardous pollutants, risk, and potential health impacts can provide relevant information for communities to understand, identify, and prioritize potential exposures and risk from multiple sources. An assessment of existing databases and mapping tools was conducted as part of this study to explore the utility of publicly available databases, and three of these databases were selected for use in a community-level GIS mapping application. Queried data from the U.S. EPA's National-Scale Air Toxics Assessment, Air Quality System, and National Emissions Inventory were mapped at the appropriate spatial and temporal resolutions for identifying risks of exposure to air pollutants in two communities. The maps combine monitored and model-simulated pollutant and health risk estimates, along with local survey results, to assist communities with the identification of potential exposure sources and pollution hot spots. Findings from this case study analysis will provide information to advance the development of new tools to assist communities with environmental risk assessments and hazard prioritization. © 2010 Society for Risk Analysis.

  13. Existential Risk and Cost-Effective Biosecurity

    PubMed Central

    Snyder-Beattie, Andrew

    2017-01-01

    In the decades to come, advanced bioweapons could threaten human existence. Although the probability of human extinction from bioweapons may be low, the expected value of reducing the risk could still be large, since such risks jeopardize the existence of all future generations. We provide an overview of biotechnological extinction risk, make some rough initial estimates for how severe the risks might be, and compare the cost-effectiveness of reducing these extinction-level risks with existing biosecurity work. We find that reducing human extinction risk can be more cost-effective than reducing smaller-scale risks, even when using conservative estimates. This suggests that the risks are not low enough to ignore and that more ought to be done to prevent the worst-case scenarios. PMID:28806130

  14. Risk-Return Relationship in a Complex Adaptive System

    PubMed Central

    Song, Kunyu; An, Kenan; Yang, Guang; Huang, Jiping

    2012-01-01

    For survival and development, autonomous agents in complex adaptive systems involving the human society must compete against or collaborate with others for sharing limited resources or wealth, by using different methods. One method is to invest, in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk high return and vice versa) are dominant over those with a negative risk-return relationship (i.e., high risk low return and vice versa) in the human society; the belief has a notable impact on daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness that exist in the human society and play an important role in helping to establish traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that investments with a negative risk-return relationship instead have dominance over those with a positive risk-return relationship in such a complex adaptive system. We formulate the dynamical process for the system's evolution, which helps to discover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics. PMID:22479416

  15. Two-scale evaluation of remediation technologies for a contaminated site by applying economic input-output life cycle assessment: risk-cost, risk-energy consumption and risk-CO2 emission.

    PubMed

    Inoue, Yasushi; Katayama, Arata

    2011-09-15

    A two-scale evaluation concept of remediation technologies for a contaminated site was expanded by introducing life cycle costing (LCC) and economic input-output life cycle assessment (EIO-LCA). The expanded evaluation index, the rescue number for soil (RN(SOIL)) with LCC and EIO-LCA, comprises two scales, such as risk-cost, risk-energy consumption or risk-CO(2) emission of a remediation. The effectiveness of RN(SOIL) with LCC and EIO-LCA was examined in a typical contamination and remediation scenario in which dieldrin contaminated an agricultural field. Remediation was simulated using four technologies: disposal, high temperature thermal desorption, biopile and landfarming. Energy consumption and CO(2) emission were determined from a life cycle inventory analysis using monetary-based intensity based on an input-output table. The values of RN(SOIL) based on risk-cost, risk-energy consumption and risk-CO(2) emission were calculated, and then rankings of the candidates were compiled according to RN(SOIL) values. A comparison between three rankings showed the different ranking orders. The existence of differences in ranking order indicates that the scales would not have reciprocal compatibility for two-scale evaluation and that each scale should be used independently. The RN(SOIL) with LCA will be helpful in selecting a technology, provided an appropriate scale is determined. Copyright © 2011 Elsevier B.V. All rights reserved.
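
    Economic input-output LCA as applied here converts the money spent on a remediation into sector outputs via the Leontief inverse and then into energy or CO2 using monetary-based intensities; the sketch below shows that arithmetic for a made-up three-sector economy (the matrix, intensities and spending vector are not the study's data).

        import numpy as np

        def eio_lca_burden(A, direct_intensity, spending):
            """Economic input-output LCA: total burden (e.g. MJ or kg CO2) triggered by
            `spending` (monetary final demand per sector), using the Leontief inverse."""
            A = np.asarray(A, float)
            leontief = np.linalg.inv(np.eye(A.shape[0]) - A)       # total requirements matrix
            total_output = leontief @ np.asarray(spending, float)  # output induced in each sector
            return float(np.asarray(direct_intensity, float) @ total_output)

        # Hypothetical 3-sector economy: construction, transport, energy.
        A = [[0.10, 0.05, 0.02],   # inter-industry requirements per unit of output
             [0.15, 0.10, 0.05],
             [0.20, 0.25, 0.10]]
        co2_per_dollar = [0.4, 0.9, 2.1]                # kg CO2 per monetary unit of output
        remediation_spending = [50_000, 10_000, 5_000]  # money spent on each sector
        print(eio_lca_burden(A, co2_per_dollar, remediation_spending), "kg CO2")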

  16. Risk-return relationship in a complex adaptive system.

    PubMed

    Song, Kunyu; An, Kenan; Yang, Guang; Huang, Jiping

    2012-01-01

    For survival and development, autonomous agents in complex adaptive systems involving the human society must compete against or collaborate with others for sharing limited resources or wealth, by using different methods. One method is to invest, in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk high return and vice versa) are dominant over those with a negative risk-return relationship (i.e., high risk low return and vice versa) in the human society; the belief has a notable impact on daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness that exist in the human society and play an important role in helping to establish traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that investments with a negative risk-return relationship instead have dominance over those with a positive risk-return relationship in such a complex adaptive system. We formulate the dynamical process for the system's evolution, which helps to discover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics.

  17. The Development of Dynamic Human Reliability Analysis Simulations for Inclusion in Risk Informed Safety Margin Characterization Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring

    2015-07-01

    The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk-margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.

  18. Model-based analyses to compare health and economic outcomes of cancer control: inclusion of disparities.

    PubMed

    Goldie, Sue J; Daniels, Norman

    2011-09-21

    Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without vaccination. We identify screening strategies for cervical cancer that provide greater aggregate health benefit than existing options, offer excellent cost-effectiveness, and have the biggest positive impact in worst-off groups. The typology proposed here may also be useful in research and policy decisions when trade-offs between fairness and cost-effectiveness are unavoidable.

  19. Model-Based Analyses to Compare Health and Economic Outcomes of Cancer Control: Inclusion of Disparities

    PubMed Central

    Daniels, Norman

    2011-01-01

    Background Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. Methods We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Results Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs Hispanic women, 69.7% vs 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28 200 per year of life saved when compared with the same strategy without vaccination. Conclusions We identify screening strategies for cervical cancer that provide greater aggregate health benefit than existing options, offer excellent cost-effectiveness, and have the biggest positive impact in worst-off groups. The typology proposed here may also be useful in research and policy decisions when trade-offs between fairness and cost-effectiveness are unavoidable. PMID:21900120

  20. Design and Simulation of a MEMS Structure for Electrophoretic and Dielectrophoretic Separation of Particles by Contactless Electrodes

    NASA Technical Reports Server (NTRS)

    Shaw, Harry C.

    2007-01-01

    Rapid identification of pathogenic bacterial species is an important factor in combating public health problems such as E. coli contamination. Food and waterborne pathogens account for sickness in 76 million people annually (CDC). Diarrheagenic E. coli is a major source of gastrointestinal illness. Severe sepsis and septicemia within the hospital environment are also major problems, with 751,000 cases annually and a 30-50% mortality rate (Crit Care Med, July '01, Vol. 29, 1303-10). Patient risks run the continuum from fever to organ failure and death. Misdiagnosis or inappropriate treatment increases mortality. There exists a need for rapid screening of samples for identification of pathogenic species (certain E. coli strains are essential for health). Critical to the identification process is the ability to isolate analytes of interest rapidly. This poster presents novel devices for the separation of particles on the basis of their dielectric properties, mass, and surface charge characteristics. Existing designs involve contact between electrode surfaces and the analyte medium, resulting in contamination of the electrode-bearing elements. Two different device designs using different bulk micromachining MEMS processes (PolyMUMPS and a PyrexBIGold electrode design) are presented. These designs cover a range of particle sizes from small molecules through eucaryotic cells. The application of separation of bacteria is discussed in detail. Simulation data for electrostatic and microfluidic characteristics are provided. Detailed design characteristics and physical features of the as-fabricated PolyMUMPS design are provided. Analysis of the simulation data relative to the expected performance of the devices will be provided and subsequent conclusions discussed.

  1. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
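    As an illustration of the embarrassingly parallel pattern the tutorial describes, the sketch below distributes independent Monte Carlo batches across worker processes. It uses Python's multiprocessing module rather than the MATLAB and R examples given in the article, and the loss model, seeds, and batch sizes are hypothetical placeholders.

```python
# Illustrative sketch of an embarrassingly parallel Monte Carlo risk simulation.
# The loss model (lognormal losses plus a rare exponential shock) is hypothetical;
# substitute the analyst's own simulation function.
import numpy as np
from multiprocessing import Pool

def simulate_batch(args):
    """Run one independent batch of Monte Carlo replications."""
    seed, n_reps = args
    rng = np.random.default_rng(seed)
    losses = rng.lognormal(mean=1.0, sigma=0.5, size=n_reps)
    shocks = rng.binomial(1, 0.05, size=n_reps) * rng.exponential(10.0, size=n_reps)
    return losses + shocks

if __name__ == "__main__":
    n_workers, n_batches, reps_per_batch = 4, 100, 10_000
    # One seed per batch keeps the random streams reproducible and uncorrelated.
    tasks = [(seed, reps_per_batch) for seed in range(n_batches)]
    with Pool(processes=n_workers) as pool:
        results = pool.map(simulate_batch, tasks)
    total = np.concatenate(results)
    print(f"mean loss = {total.mean():.2f}, 99th percentile = {np.percentile(total, 99):.2f}")
```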

  2. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  3. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    PubMed

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  4. A Gaussian random field model for similarity-based smoothing in Bayesian disease mapping.

    PubMed

    Baptista, Helena; Mendes, Jorge M; MacNab, Ying C; Xavier, Miguel; Caldas-de-Almeida, José

    2016-08-01

    Conditionally specified Gaussian Markov random field (GMRF) models with adjacency-based neighbourhood weight matrix, commonly known as neighbourhood-based GMRF models, have been the mainstream approach to spatial smoothing in Bayesian disease mapping. In the present paper, we propose a conditionally specified Gaussian random field (GRF) model with a similarity-based non-spatial weight matrix to facilitate non-spatial smoothing in Bayesian disease mapping. The model, named similarity-based GRF, is motivated for modelling disease mapping data in situations where the underlying small area relative risks and the associated determinant factors do not vary systematically in space, and the similarity is defined by "similarity" with respect to the associated disease determinant factors. The neighbourhood-based GMRF and the similarity-based GRF are compared and assessed via a simulation study and two case studies, using new data on alcohol abuse in Portugal collected by the World Mental Health Survey Initiative and the well-known lip cancer data in Scotland. In the presence of disease data with no evidence of positive spatial correlation, the simulation study showed a consistent gain in efficiency from the similarity-based GRF, compared with the adjacency-based GMRF with the determinant risk factors as covariates. This new approach broadens the scope of the existing conditional autocorrelation models. © The Author(s) 2016.
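    To make the contrast concrete, the sketch below builds a similarity-based (non-spatial) weight matrix from area-level determinant factors, the kind of matrix that would replace a 0/1 adjacency matrix in such a model. The Gaussian kernel, bandwidth, and covariate values are illustrative assumptions, not the authors' specification.

```python
# Minimal sketch: a similarity-based weight matrix built from area-level
# risk-factor covariates, in contrast to a 0/1 spatial adjacency matrix.
# The Gaussian kernel, bandwidth and covariate values are illustrative only.
import numpy as np

def similarity_weights(X, bandwidth=1.0):
    """X: (n_areas, n_covariates) array of standardized determinant factors."""
    diff = X[:, None, :] - X[None, :, :]
    d2 = (diff ** 2).sum(axis=2)               # squared covariate distance
    W = np.exp(-d2 / (2.0 * bandwidth ** 2))   # similarity in covariate space
    np.fill_diagonal(W, 0.0)                   # an area is not its own neighbour
    return W

# Example: five areas described by two standardized risk factors (hypothetical).
X = np.array([[0.1, -0.3], [0.2, -0.2], [1.5, 2.0], [-1.0, 0.4], [1.4, 1.9]])
print(np.round(similarity_weights(X), 2))
```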

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Prescott, Steven; Coleman, Justin

    This report describes the current progress and status related to the Industry Application #2 focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to identify, model and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will be coupling the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.

  6. A joint frailty-copula model between tumour progression and death for meta-analysis.

    PubMed

    Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie

    2017-12-01

    Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper here utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence there exists residual dependence. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are nonparametrically modeled on a basis of splines. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for the competing risks or recurrent event data, and are generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis for assessing a recently suggested biomarker CXCL12 for survival in ovarian cancer patients. We implement our proposed methods in the R joint.Cox package.

  7. Age- and sex-specific thorax finite element model development and simulation.

    PubMed

    Schoell, Samantha L; Weaver, Ashley A; Vavalle, Nicholas A; Stitzel, Joel D

    2015-01-01

    The shape, size, bone density, and cortical thickness of the thoracic skeleton vary significantly with age and sex, which can affect the injury tolerance, especially in at-risk populations such as the elderly. Computational modeling has emerged as a powerful and versatile tool to assess injury risk. However, current computational models only represent certain ages and sexes in the population. The purpose of this study was to morph an existing finite element (FE) model of the thorax to depict thorax morphology for males and females of ages 30 and 70 years old (YO) and to investigate the effect on injury risk. Age- and sex-specific FE models were developed using thin-plate spline interpolation. In order to execute the thin-plate spline interpolation, homologous landmarks on the reference, target, and FE model are required. An image segmentation and registration algorithm was used to collect homologous rib and sternum landmark data from males and females aged 0-100 years. The Generalized Procrustes Analysis was applied to the homologous landmark data to quantify age- and sex-specific isolated shape changes in the thorax. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant model was morphed to create age- and sex-specific thoracic shape change models (scaled to a 50th percentile male size). To evaluate the thoracic response, 2 loading cases (frontal hub impact and lateral impact) were simulated to assess the importance of geometric and material property changes with age and sex. Due to the geometric and material property changes with age and sex, there were observed differences in the response of the thorax in both the frontal and lateral impacts. Material property changes alone had little to no effect on the maximum thoracic force or the maximum percent compression. With age, the thorax becomes stiffer due to superior rotation of the ribs, which can result in increased bone strain that can increase the risk of fracture. For the 70-YO models, the simulations predicted a higher number of rib fractures in comparison to the 30-YO models. The male models experienced more superior rotation of the ribs in comparison to the female models, which resulted in a higher number of rib fractures for the males. In this study, age- and sex-specific thoracic models were developed and the biomechanical response was studied using frontal and lateral impact simulations. The development of these age- and sex-specific FE models of the thorax will lead to an improved understanding of the complex relationship between thoracic geometry, age, sex, and injury risk.
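    The landmark-driven morphing step described above can be illustrated with a thin-plate spline interpolant fitted to homologous landmark pairs and then applied to the mesh nodes. The sketch below uses SciPy's RBFInterpolator with a thin-plate-spline kernel; the landmark and node coordinates are random placeholders, not the GHBMC model or image-derived data.

```python
# Sketch of landmark-driven thin-plate spline morphing of mesh nodes.
# Reference/target landmarks and mesh nodes below are random placeholders,
# not GHBMC or image-derived data.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
ref_landmarks = rng.uniform(-1, 1, size=(30, 3))                       # homologous landmarks on reference model
target_landmarks = ref_landmarks + rng.normal(0, 0.05, size=(30, 3))   # age- and sex-specific targets

# Thin-plate spline displacement field fitted to the landmark correspondences.
warp = RBFInterpolator(ref_landmarks, target_landmarks - ref_landmarks,
                       kernel="thin_plate_spline")

mesh_nodes = rng.uniform(-1, 1, size=(1000, 3))    # FE mesh node coordinates
morphed_nodes = mesh_nodes + warp(mesh_nodes)      # apply interpolated displacements
print(morphed_nodes.shape)
```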

  8. The Impact of Variability of Selected Geological and Mining Parameters on the Value and Risks of Projects in the Hard Coal Mining Industry

    NASA Astrophysics Data System (ADS)

    Kopacz, Michał

    2017-09-01

    The paper attempts to assess the impact of variability of selected geological (deposit) parameters on the value and risks of projects in the hard coal mining industry. The study was based on simulated discounted cash flow analysis, while the results were verified for three existing bituminous coal seams. The Monte Carlo simulation was based on the nonparametric bootstrap method, while correlations between individual deposit parameters were replicated with use of an empirical copula. The calculations take into account the uncertainty about the parameters of the empirical distributions of the deposit variables. The Net Present Value (NPV) and the Internal Rate of Return (IRR) were selected as the main measures of value and risk, respectively. The impact of volatility and correlation of deposit parameters was analyzed in two aspects, by identifying the overall effect of the correlated variability of the parameters and the individual impact of the correlation on the NPV and IRR. For this purpose, a differential approach, allowing determination of the possible errors in calculation of these measures in numerical terms, has been used. Based on the study it can be concluded that the mean value of the overall effect of the variability does not exceed 11.8% of NPV and 2.4 percentage points of IRR. Neglecting the correlations results in overestimating the NPV and the IRR by up to 4.4% and 0.4 percentage points, respectively. It should be noted, however, that the differences in NPV and IRR values can vary significantly, while their interpretation depends on the likelihood of implementation. Generalizing the obtained results, based on the average values, the maximum value of the risk premium in the given calculation conditions of the "X" deposit, and for correspondingly large datasets (greater than 2500), should not be higher than 2.4 percentage points. The impact of the analyzed geological parameters on the NPV and IRR depends primarily on their co-existence, which can be measured by the strength of correlation. In the analyzed case, the correlations result in limiting the range of variation of the geological parameters and economic results (the empirical copula reduces the NPV and IRR in the probabilistic approach). However, this is due to the adjustment of the calculation under conditions similar to those prevailing in the deposit.
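    A minimal sketch of the nonparametric bootstrap step is given below: historical deposit observations are resampled row-wise (which preserves their empirical co-occurrence) and pushed through a cash-flow model to obtain an NPV distribution. The cash-flow relation, prices, costs, and parameter values are hypothetical, and the empirical-copula dependence modelling used in the study is not reproduced.

```python
# Minimal sketch: nonparametric bootstrap of deposit parameters feeding an NPV
# distribution.  The cash-flow model, prices and costs are hypothetical, and the
# empirical-copula dependence modelling used in the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical historical observations of two deposit parameters
# (seam thickness [m], calorific value [MJ/kg]) measured on the same blocks.
observed = np.column_stack([rng.normal(2.0, 0.3, 200), rng.normal(22.0, 1.5, 200)])

def npv(thickness, calorific, rate=0.08, years=10):
    """Toy cash-flow model: annual output scales with thickness, price with quality."""
    output_t = 1.0e6 * thickness                      # tonnes per year
    price = 8.0 * calorific / 22.0                    # USD per tonne
    cash_flow = output_t * price - 12.0e6             # revenue minus fixed operating cost
    discount = (1 + rate) ** -np.arange(1, years + 1)
    return cash_flow * discount.sum() - 50.0e6        # minus initial capex

# Bootstrap: resample whole rows so the empirical co-occurrence of parameters is kept.
n_sims = 5000
idx = rng.integers(0, len(observed), size=n_sims)
npvs = npv(observed[idx, 0], observed[idx, 1])
print(f"mean NPV = {npvs.mean()/1e6:.1f} M, P(NPV<0) = {(npvs < 0).mean():.2%}")
```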

  9. Is Increased Susceptibility to Balkan Endemic Nephropathy in Carriers of Common GSTA1 (*A/*B) Polymorphism Linked with the Catalytic Role of GSTA1 in Ochratoxin A Biotransformation? Serbian Case Control Study and In Silico Analysis

    PubMed Central

    Reljic, Zorica; Zlatovic, Mario; Savic-Radojevic, Ana; Pekmezovic, Tatjana; Djukanovic, Ljubica; Matic, Marija; Pljesa-Ercegovac, Marija; Mimic-Oka, Jasmina; Opsenica, Dejan; Simic, Tatjana

    2014-01-01

    Although recent data suggest aristolochic acid as a putative cause of Balkan endemic nephropathy (BEN), evidence also exists in favor of ochratoxin A (OTA) exposure as risk factor for the disease. The potential role of xenobiotic metabolizing enzymes, such as the glutathione transferases (GSTs), in OTA biotransformation is based on OTA glutathione adducts (OTHQ-SG and OTB-SG) in blood and urine of BEN patients. We aimed to analyze the association between common GSTA1, GSTM1, GSTT1, and GSTP1 polymorphisms and BEN susceptibility, and thereafter performed an in silico simulation of particular GST enzymes potentially involved in OTA transformations. GSTA1, GSTM1, GSTT1 and GSTP1 genotypes were determined in 207 BEN patients and 138 non-BEN healthy individuals from endemic regions by polymerase chain reaction (PCR). Molecular modeling in silico was performed for GSTA1 protein. Among the GST polymorphisms tested, only GSTA1 was significantly associated with a higher risk of BEN. Namely, carriers of the GSTA1*B gene variant, associated with lower transcriptional activation, were at a 1.6-fold higher BEN risk than those carrying the homozygous GSTA1*A/*A genotype (OR = 1.6; p = 0.037). In in silico modeling, we found four structures, two OTB-SG and two OTHQ-SG, bound in a GSTA1 monomer. We found that GSTA1 polymorphism was associated with increased risk of BEN, and suggested, according to the in silico simulation, that GSTA1-1 might be involved in catalyzing the formation of OTHQ-SG and OTB-SG conjugates. PMID:25111321

  10. Skin penetration surrogate for the evaluation of less lethal kinetic energy munitions.

    PubMed

    Bir, Cynthia A; Resslar, Marianne; Stewart, Shelby

    2012-07-10

    Although the benefits of the use of less lethal kinetic energy munitions are numerous, there is a need to evaluate the munitions prior to deployment to ensure their intended effect. The objective of the current research was to validate a surrogate that could be used to predict the risk of penetration of these devices. Existing data from biomechanical testing with post-mortem human specimens (PMHS) served as the foundation for this research. Development of the surrogate involved simulating the various layers of the skin and underlying soft tissues using a combination of materials. A standardized 12-gauge impactor was used to assess each combination. The energy density that resulted in a 50% risk of penetration for the anterior thorax region (23.99 J/cm(2)) from the previous research was matched using a specific combination of layers. Twelve various combinations of materials were tested with the 50% risk of penetration determined. The final validated surrogate consisted of a Laceration Assessment Layer (LAL) of natural chamois and .6 cm of closed-cell foam over a Penetration Assessment Layer (PAL) of 20% ordnance gelatin. This surrogate predicted a 50% risk of penetration at 23.88 J/cm(2). Injury risk curves for the PMHS and surrogate development work are presented. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
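    The Monte Carlo uncertainty-analysis pattern described here can be sketched generically: sample the uncertain inputs, propagate them through a fast surrogate of the aerothermal model, and rank the drivers by rank correlation with the output. In the sketch below, the Sutton-Graves-type stagnation-point heating relation, the parameter distributions, and the coefficient values are illustrative assumptions, not the physics models used in the paper.

```python
# Generic Monte Carlo uncertainty-propagation sketch: sample uncertain model
# inputs, propagate through a (placeholder) stagnation-point heating surrogate,
# and rank drivers by rank correlation with the output.
# The heating relation and parameter ranges are illustrative only.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 20_000

k = rng.normal(1.74e-4, 0.1e-4, n)          # heating coefficient (uncertain)
rho = rng.lognormal(np.log(1e-4), 0.2, n)   # freestream density [kg/m^3]
v = rng.normal(7500.0, 150.0, n)            # velocity [m/s]
rn = rng.normal(1.0, 0.05, n)               # effective nose radius [m]

q = k * np.sqrt(rho / rn) * v**3            # stagnation-point heat flux [W/m^2]

print(f"q mean = {q.mean()/1e4:.1f} W/cm^2, 95th pct = {np.percentile(q, 95)/1e4:.1f} W/cm^2")
for name, x in [("k", k), ("rho", rho), ("v", v), ("rn", rn)]:
    print(f"rank corr with {name}: {spearmanr(x, q)[0]:.2f}")
```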

  12. The Cumulative Lifting Index (CULI) for the Revised NIOSH Lifting Equation: Quantifying Risk for Workers With Job Rotation.

    PubMed

    Garg, Arun; Kapellusch, Jay M

    2016-08-01

    The objectives were to: (a) develop a continuous frequency multiplier (FM) for the Revised NIOSH Lifting Equation (RNLE) as a function of lifting frequency and duration of a lifting task, and (b) describe the Cumulative Lifting Index (CULI), a methodology for estimating physical exposure to workers with job rotation. The existing FM for the RNLE (FME) does not differentiate between task durations >2 hr and <8 hr, which makes quantifying physical exposure to workers with job rotation difficult and presents challenges to job designers. Using the existing FMs for 1, 2, and 8 hr of task duration, we developed a continuous FM (FMP) that extends to 12 hr per day. We simulated 157,500 jobs consisting of two tasks each, using different combinations of Frequency Independent Lifting Index, lifting frequency, and duration of lifting. Biomechanical stresses were estimated using the CULI, time-weighted average (TWA), and peak exposure. The median difference between FME and FMP was ±1% (range: 0%-15%). Compared to CULI, TWA underestimated risk of low-back pain (LBP) for 18% to 30% of jobs, and peak exposure for an assumed 8-hr work shift overestimated risk of LBP for 20% to 25% of jobs. Peak task exposure showed 90% agreement with CULI but ignored one of two tasks. The CULI partially addressed the underestimation of physical exposure using the TWA approach and the overestimation of exposure using the peak-exposure approach. The proposed FM and CULI may provide more accurate physical exposure estimates, and therefore estimated risk of LBP, for workers with job rotation. © 2016, Human Factors and Ergonomics Society.
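    The contrast between the two conventional exposure summaries mentioned above can be sketched as follows; the Lifting Index values and shift hours are made up, and the CULI itself (which depends on the proposed continuous frequency multiplier) is not reproduced here.

```python
# Sketch contrasting two conventional exposure summaries for a rotated job:
# a time-weighted average (TWA) and a peak-exposure assumption that the worst
# task is performed for the whole shift.  Lifting Index values are made up;
# the CULI itself depends on the RNLE frequency multiplier and is not reproduced.
def twa_exposure(tasks):
    """tasks: list of (lifting_index, hours) tuples for one work shift."""
    total_hours = sum(h for _, h in tasks)
    return sum(li * h for li, h in tasks) / total_hours

def peak_exposure(tasks):
    """Assume the most stressful task is performed for the full shift."""
    return max(li for li, _ in tasks)

job = [(1.2, 4.0), (3.0, 4.0)]   # two rotated tasks, 4 h each (hypothetical)
print(f"TWA exposure  = {twa_exposure(job):.2f}")   # dilutes the heavy task
print(f"Peak exposure = {peak_exposure(job):.2f}")  # ignores the lighter task
```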

  13. FNCS: A Framework for Power System and Communication Networks Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciraci, Selim; Daily, Jeffrey A.; Fuller, Jason C.

    2014-04-13

    This paper describes the Fenix framework that uses a federated approach for integrating power grid and communication network simulators. Compared to existing approaches, Fenix allows co-simulation of both transmission- and distribution-level power grid simulators with the communication network simulator. To reduce the performance overhead of time synchronization, Fenix utilizes optimistic synchronization strategies that make speculative decisions about when the simulators are going to exchange messages. GridLAB-D (a distribution simulator), PowerFlow (a transmission simulator), and ns-3 (a telecommunication simulator) are integrated with the framework and are used to illustrate the enhanced performance provided by speculative multi-threading on a smart grid application. Our speculative multi-threading approach achieved on average a 20% improvement over the existing synchronization methods.

  14. A test of maternal programming of offspring stress response to predation risk in threespine sticklebacks.

    PubMed

    Mommer, Brett C; Bell, Alison M

    2013-10-02

    Non-genetic maternal effects are widespread across taxa and challenge our traditional understanding of inheritance. Maternal experience with predators, for example, can have lifelong consequences for offspring traits, including fitness. Previous work in threespine sticklebacks showed that females exposed to simulated predation risk produced eggs with higher cortisol content and offspring with altered anti-predator behavior. However, it is unknown whether this maternal effect is mediated via the offspring glucocorticoid stress response and if it is retained over the entire lifetime of offspring. Therefore, we tested the hypothesis that maternal exposure to simulated predation risk has long-lasting effects on the cortisol response to simulated predation risk in stickleback offspring. We measured circulating concentrations of cortisol before (baseline), 15 min after, and 60 min after exposure to a simulated predation risk. We compared adult offspring of predator-exposed mothers and control mothers in two different social environments (alone or in a group). Relative to baseline, offspring plasma cortisol was highest 15 min after exposure to simulated predation risk and decreased after 60 min. Offspring of predator-exposed mothers differed in the cortisol response to simulated predation risk compared to offspring of control mothers. In general, females had higher cortisol than males, and fish in a group had lower cortisol than fish that were by themselves. The buffering effect of the social environment did not differ between maternal treatments or between males and females. Altogether the results show that while a mother's experience with simulated predation risk might affect the physiological response of her adult offspring to a predator, sex and social isolation have much larger effects on the stress response to predation risk in sticklebacks. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Three-dimensional long-period ground-motion simulations in the upper Mississippi embayment

    USGS Publications Warehouse

    Macpherson, K.A.; Woolery, E.W.; Wang, Z.; Liu, P.

    2010-01-01

    We employed a 3D velocity model and 3D wave propagation code to simulate long-period ground motions in the upper Mississippi embayment. This region is at risk from large earthquakes in the New Madrid seismic zone (NMSZ) and observational data are sparse, making simulation a valuable tool for predicting the effects of large events. We undertook these simulations to estimate the magnitude of shaking likely to occur and to investigate the influence of the 3D embayment structure and finite-fault mechanics on ground motions. There exist three primary fault zones in the NMSZ, each of which was likely associated with one of the main shocks of the 1811-12 earthquake triplet. For this study, three simulations have been conducted on each major segment, exploring the impact of different epicentral locations and rupture directions on ground motions. The full wave field up to a frequency of 0.5 Hz is computed on a 200 × 200 × 50-km³ volume using a staggered-grid finite-difference code. Peak horizontal velocity and bracketed durations were calculated at the free surface. The NMSZ simulations indicate that for the considered bandwidth, finite-fault mechanics such as fault proximity, directivity effect, and slip distribution exert the most control on ground motions. The 3D geologic structure of the upper Mississippi embayment also influences ground motion with indications that amplification is induced by the sharp velocity contrast at the basin edge.

  16. Advancements of in-flight mass moment of inertia and structural deflection algorithms for satellite attitude simulators

    NASA Astrophysics Data System (ADS)

    Wright, Jonathan W.

    Experimental satellite attitude simulators have long been used to test and analyze control algorithms in order to drive down risk before implementation on an operational satellite. Ideally, the dynamic response of a terrestrial-based experimental satellite attitude simulator would be similar to that of an on-orbit satellite. Unfortunately, gravitational disturbance torques and poorly characterized moments of inertia introduce uncertainty into the system dynamics, leading to questionable attitude control algorithm experimental results. This research consists of three distinct, but related contributions to the field of developing robust satellite attitude simulators. In the first part of this research, existing approaches to estimate mass moments and products of inertia are evaluated, followed by a proposition and evaluation of a new approach that increases both the accuracy and precision of these estimates using typical on-board satellite sensors. Next, in order to better simulate the micro-torque environment of space, a new approach to mass balancing a satellite attitude simulator is presented, experimentally evaluated, and verified. Finally, in the third area of research, we capitalize on the platform improvements to analyze a control moment gyroscope (CMG) singularity avoidance steering law. Several successful experiments were conducted with the CMG array at near-singular configurations. An evaluation process was implemented to verify that the platform remained near the desired test momentum, showing that the first two components of this research were effective in allowing us to conduct singularity avoidance experiments in a representative space-like test environment.

  17. Simulation of ultrasound propagation in bone

    NASA Astrophysics Data System (ADS)

    Kaufman, Jonathan J.; Luo, Gangming; Siffert, Robert S.

    2004-10-01

    Ultrasound has been proposed as a means to noninvasively assess bone and, particularly, bone strength and fracture risk, as for example in osteoporosis. Because strength is a function of both mineral density and architecture, ultrasound has the potential to provide more accurate measurement of bone integrity than, for example, with x-ray absorptiometric methods. Although some of this potential has already been realized-a number of clinical devices are presently available-there is still much that is unknown regarding the interaction of ultrasound with bone. Because of the inherent complexity of the propagation medium, few analytic solutions exist with practical application. For this reason, ultrasound simulation techniques have been developed and applied to a number of different problems of interest in ultrasonic bone assessment. Both 2D and 3D simulation results will be presented, including the effects of architecture and density on the received waveform, propagation effects of both cortical and trabecular bone, and the relative contributions of scattering and absorption to attenuation in trabecular bone. The results of these simulation studies should lead to improved understanding and ultimately to more effective clinical devices for ultrasound bone assessment. [This work was supported by The Carroll and Milton Petrie Foundation and by SBIR Grant No. 1R43RR16750 from the National Center for Research Resources of the NIH.]

  18. Monitoring Soil Infiltration In Semi-Arid Regions With Meteosat And A Coupled Model Approach Using PROMET And SLC

    NASA Astrophysics Data System (ADS)

    Klug, P.; Bach, H.; Migdall, S.

    2013-12-01

    In arid regions the infiltration of sparse rainfalls and resulting ground water recharge is a critical quantity for the water cycle. With the PROMET model the infiltration process can be simulated in detail, since 4 soil layers together with the hourly calculation time step allow simulating the vertical water transport. Wet soils are darker than dry soils. Using the SLC reflectance model this effect can be simulated and compared to temporal high resolution time series of measured reflectances from Meteosat in order to monitor the drying process. This study demonstrates how MSG can be used to better parameterize the simulation of the infiltration process and reduce uncertainties in ground water recharge estimation. The study is carried out in the frame of the EU FP7 project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins). According to climate projections, Mediterranean countries are at risk of changes in the hydrological budget, the agricultural productivity and drinking water supply in the future. The CLIMB FP-7 project coordinated by the University of Munich (LMU) aims at employing integrated hydrological modelling in a new framework to reduce existing uncertainties in climate change impact analysis of the Mediterranean region [1, 2].

  19. A bootstrap based space-time surveillance model with an application to crime occurrences

    NASA Astrophysics Data System (ADS)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significant tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry database, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
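    The core bootstrap idea can be sketched simply: the count observed in the current window at a location is compared against counts generated by resampling that location's past occurrences, and a small bootstrap p-value flags an emerging hotspot. The window length, counts, and alarm threshold below are placeholders; the authors' exact space-time statistic is not reproduced.

```python
# Minimal sketch of the bootstrap idea behind the surveillance model: compare a
# location's recent count with counts obtained by bootstrap-resampling its past
# occurrences.  Window lengths, data and the alarm threshold are placeholders.
import numpy as np

rng = np.random.default_rng(7)

past_weekly_counts = rng.poisson(3.0, size=104)   # two years of past occurrences (hypothetical)
recent_count = 9                                  # count observed in the current week

n_boot = 10_000
boot_counts = rng.choice(past_weekly_counts, size=n_boot, replace=True)

# One-sided bootstrap p-value: how often does resampled history reach the recent count?
p_value = (boot_counts >= recent_count).mean()
print(f"bootstrap p-value = {p_value:.4f}  ->  {'emerging hotspot' if p_value < 0.05 else 'no alarm'}")
```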

  20. Application of Arrester Simulation Device in Training

    NASA Astrophysics Data System (ADS)

    Baoquan, Zhang; Ziqi, Chai; Genghua, Liu; Wei, Gao; Kaiyue, Wu

    2017-12-01

    In connection with the arrester simulation device that was successfully put into use, this paper introduces the application of arrester tests in insulation resistance measurement, counter testing, leakage current testing at the DC 1 mA reference voltage (U1mA), and leakage current testing at 0.75 U1mA. By comparing with the existing training, this paper summarizes the arrester simulation device's outstanding advantages, including real-time monitoring, multi-type fault data analysis, and acousto-optic simulation. It effectively solves the contradiction between authenticity and safety in the existing test training, and provides a reference for further training.

  1. Bias Due to Correlation Between Times-at-Risk for Infection in Epidemiologic Studies Measuring Biological Interactions Between Sexually Transmitted Infections: A Case Study Using Human Papillomavirus Type Interactions

    PubMed Central

    Malagón, Talía; Lemieux-Mellouki, Philippe; Laprise, Jean-François; Brisson, Marc

    2016-01-01

    The clustering of human papillomavirus (HPV) infections in some individuals is often interpreted as the result of common risk factors rather than biological interactions between different types of HPV. The intraindividual correlation between times-at-risk for all HPV infections is not generally considered in the analysis of epidemiologic studies. We used a deterministic transmission model to simulate cross-sectional and prospective epidemiologic studies measuring associations between 2 HPV types. When we assumed no interactions, the model predicted that studies would estimate odds ratios and incidence rate ratios greater than 1 between HPV types even after complete adjustment for sexual behavior. We demonstrated that this residual association is due to correlation between the times-at-risk for different HPV types, where individuals become concurrently at risk for all of their partners’ HPV types when they enter a partnership and are not at risk when they are single. This correlation can be controlled in prospective studies by restricting analyses to susceptible individuals with an infected sexual partner. The bias in the measured associations was largest in low-sexual-activity populations, cross-sectional studies, and studies which evaluated infection with a first HPV type as the exposure. These results suggest that current epidemiologic evidence does not preclude the existence of competitive biological interactions between HPV types. PMID:27927619

  2. Hydraulic and Condition Assessment of Existing Sewerage Network: A Case Study of an Educational Institute

    NASA Astrophysics Data System (ADS)

    Sourabh, Nishant; Timbadiya, P. V.

    2018-04-01

    The hydraulic simulation of an existing sewerage network provides information about critical points, helps assess the deteriorating condition of the network, and supports rehabilitation and future expansion. In the present study, hydraulic and condition assessment of the existing network of an educational institute (i.e. Sardar Vallabhbhai National Institute of Technology-Surat, Gujarat, India), having an area of 100 ha and ground levels in the range of 5.0-9.0 m above mean sea level, has been carried out through sewage flow simulation for existing and future scenarios using SewerGEMS V8i. The paper describes the features of the 4.79 km long sewerage network of the institute, followed by network model simulation for the aforesaid scenarios and recommendations on improvement of the existing network for future use. The total sewer loads for the present and future scenarios are 1.67 million litres per day (MLD) and 3.62 MLD, considering a peak factor of 3 on the basis of population. The hydraulic simulation of the existing scenario indicated a depth-to-diameter (d/D) ratio in the range of 0.02-0.48 and a velocity range of 0.08-0.53 m/s for the existing network under the present scenario. For the future scenario, the existing network needs to be modified, and it was found that a total of 11 conduits (length: 464.8 m) should be replaced with the next higher available diameter, i.e., 350 mm, so that the existing network can be utilized for the future scenario. The present study provides the methodology for condition assessment of an existing network and its utilization as per guidelines provided by the Central Public Health and Environmental Engineering Organization, 2013. The methodology presented in this paper can be used by municipal/public health engineers for the assessment of existing sewerage networks for serviceability and future improvement.
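    Checks of this kind rest on the standard partial-flow computation for a circular sewer: solve Manning's equation for the flow depth that carries the design discharge, then report the d/D ratio and velocity. The sketch below implements that textbook calculation; the pipe diameter, slope, roughness, and design flow are illustrative values, not the institute's network data.

```python
# Sketch of the standard partial-flow check for a circular sewer using Manning's
# equation: solve for flow depth, then report d/D and velocity.  Pipe diameter,
# slope, roughness and design flow below are illustrative, not the network data.
import math

def circular_flow(theta, D, n, S):
    """Discharge and velocity for central flow angle theta (rad) in a circular pipe."""
    A = (D**2 / 8.0) * (theta - math.sin(theta))   # flow area
    P = D * theta / 2.0                            # wetted perimeter
    R = A / P                                      # hydraulic radius
    V = (1.0 / n) * R**(2.0/3.0) * math.sqrt(S)    # Manning velocity
    return V * A, V

def solve_depth(Q, D, n, S):
    """Bisection on the central angle until the pipe carries the design flow Q."""
    lo, hi = 1e-6, 2 * math.pi - 1e-6
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        q_mid, _ = circular_flow(mid, D, n, S)
        lo, hi = (mid, hi) if q_mid < Q else (lo, mid)
    theta = 0.5 * (lo + hi)
    d = (D / 2.0) * (1.0 - math.cos(theta / 2.0))  # flow depth from angle
    _, V = circular_flow(theta, D, n, S)
    return d / D, V

dD, V = solve_depth(Q=0.010, D=0.30, n=0.013, S=0.003)   # 10 L/s in a 300 mm pipe
print(f"d/D = {dD:.2f}, velocity = {V:.2f} m/s")
```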

  3. An inexpensive, easily constructed, reusable task trainer for simulating ultrasound-guided pericardiocentesis.

    PubMed

    Zerth, Herb; Harwood, Robert; Tommaso, Laura; Girzadas, Daniel V

    2012-12-01

    Pericardiocentesis is a low-frequency, high-risk procedure integral to the practice of emergency medicine. Ultrasound-guided pericardiocentesis is the preferred technique for providing this critical intervention. Traditionally, emergency physicians learned pericardiocentesis in real time, at the bedside, on critically ill patients. Medical education is moving toward simulation for training and assessment of procedures such as pericardiocentesis because it allows learners to practice time-sensitive skills without risk to patient or learner. The retail market for models for pericardiocentesis practice is limited and expensive. We have developed an ultrasound-guided pericardiocentesis task trainer that allows the physician to insert a needle under ultrasound guidance, pierce the "pericardial sac" and aspirate "blood." Our model can be simply constructed in a home kitchen, and the overall preparation time is 1 h. Our model costs $20.00 (US, 2008). Materials needed for the construction include 16 ounces of plain gelatin, one large balloon, one golf ball, food coloring, non-stick cooking spray, one wooden cooking skewer, surgical iodine solution, and a 4-quart sized plastic food storage container. Refrigeration and a heat source for cooking are also required. Once prepared, the model is usable for 2 weeks at room temperature and may be preserved an additional week if refrigerated. When the model shows signs of wear, it can be easily remade, by simply recycling the existing materials. The self-made model was well liked by training staff due to accessibility of a simulation model, and by learners of the technique as they felt more at ease performing pericardiocentesis on a live patient. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. The bias in current measures of gestational weight gain

    PubMed Central

    Hutcheon, Jennifer A; Bodnar, Lisa M; Joseph, KS; Abrams, Barbara; Simhan, Hyagriv N; Platt, Robert W

    2014-01-01

    Summary Conventional measures of gestational weight gain (GWG), such as average rate of weight gain, are likely correlated with gestational duration. Such correlation could introduce bias to epidemiologic studies of GWG and adverse perinatal outcomes because many perinatal outcomes are also correlated with gestational duration. This study aimed to quantify the extent to which currently-used GWG measures may bias the apparent relation between maternal weight gain and risk of preterm birth. For each woman in a provincial perinatal database registry (British Columbia, Canada, 2000–2009), a total GWG was simulated such that it was uncorrelated with risk of preterm birth. The simulation was based on serial antenatal GWG measurements from a sample of term pregnancies. Simulated GWGs were classified using 3 approaches: total weight gain (kg), average rate of weight gain (kg/week) or adequacy of gestational weight gain in relation to Institute of Medicine recommendations, and their association with preterm birth ≤ 32 weeks was explored using logistic regression. All measures of GWG induced an apparent association between GWG and preterm birth ≤32 weeks even when, by design, none existed. Odds ratios in the lowest fifths of each GWG measure compared with the middle fifths ranged from 4.4 [95% CI 3.6, 5.4] (total weight gain) to 1.6 [95% CI 1.3, 2.0] (Institute of Medicine adequacy ratio). Conventional measures of GWG introduce serious bias to the study of maternal weight gain and preterm birth. A new measure of GWG that is uncorrelated with gestational duration is needed. PMID:22324496
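    The mechanism can be reproduced in a few lines: gestational duration and an underlying weekly gain are simulated independently of each other, so no true effect of weight gain on preterm birth exists, yet both the total-gain and rate-of-gain measures appear associated with preterm birth because they embed gestational duration. The distributions, the piecewise gain trajectory, and the <37-week cutoff below are simplifying placeholders, not the British Columbia registry data or the serial-measurement model the authors used.

```python
# Sketch of the bias mechanism: weekly gain is simulated independently of the
# outcome, so no true association exists, yet conventional GWG measures that
# embed gestational duration still show odds ratios above 1.
# Distributions and the <37-week cutoff are placeholders, not the registry data.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

gest_weeks = np.clip(rng.normal(39.0, 2.5, n), 24, 43)    # gestational duration
preterm = gest_weeks < 37                                  # conventional preterm cutoff

weekly_gain = rng.normal(0.45, 0.10, n)                    # kg/week after week 13, independent of outcome
total_gwg = weekly_gain * np.maximum(gest_weeks - 13, 0)   # little gain assumed in 1st trimester
rate_gwg = total_gwg / gest_weeks                          # conventional "average rate" measure

def odds_ratio(exposure, outcome):
    """Odds ratio for the lowest fifth of exposure versus the middle fifth."""
    q = np.quantile(exposure, [0.2, 0.4, 0.6])
    low = exposure <= q[0]
    mid = (exposure > q[1]) & (exposure <= q[2])
    odds = lambda mask: outcome[mask].mean() / (1.0 - outcome[mask].mean())
    return odds(low) / odds(mid)

print(f"OR, lowest vs middle fifth of total gain (kg):      {odds_ratio(total_gwg, preterm):.1f}")
print(f"OR, lowest vs middle fifth of rate of gain (kg/wk): {odds_ratio(rate_gwg, preterm):.1f}")
```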

  5. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  6. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  7. Risk assessment predictions of open dumping area after closure using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Pauzi, Nur Irfah Mohd; Radhi, Mohd Shahril Mat; Omar, Husaini

    2017-10-01

    Currently, there are many abandoned open dumping areas that were left without proper mitigation measures. These open dumping areas could pose a serious hazard to humans and pollute the environment. The objective of this paper is to determine the risk at an open dumping area after it has been closed, using the Monte Carlo simulation method. The risk assessment exercise is conducted at the Kuala Lumpur dumping area. The rapid urbanisation of Kuala Lumpur, coupled with the increase in population, has led to increased waste generation, which in turn leads to more dumping/landfill areas in Kuala Lumpur. The first stage of this study involves the assessment of the dumping area and sample collection, followed by measurement of the settlement of the dumping area using an oedometer. The risk associated with the settlement is then predicted using the Monte Carlo simulation method, which calculates the risk and the long-term settlement. The model simulation results show that the risk level of the Kuala Lumpur open dumping area ranges between Level III and Level IV, i.e., between medium and high risk. The predicted settlement (ΔH) is between 3 and 7 meters. Since the risk is between medium and high, mitigation measures are required, such as replacing the top waste soil with new sandy gravel soil. This will increase the strength of the soil and reduce the settlement.
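    A generic version of such a Monte Carlo settlement calculation is sketched below: compressibility parameters are sampled (as if derived from oedometer tests), a one-dimensional consolidation settlement is computed for each draw, and the result is binned into risk levels. The parameter distributions, layer properties, and risk bands are illustrative assumptions, not the study's calibrated values.

```python
# Generic sketch of a Monte Carlo settlement-risk calculation: sample
# compressibility parameters (as if from oedometer tests), compute 1-D
# consolidation settlement, and bin the result into risk levels.
# Distributions, layer properties and risk bands are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

H = 20.0                                        # thickness of compressible waste layer [m]
Cc = rng.lognormal(np.log(1.2), 0.25, n)        # compression index (uncertain)
e0 = rng.normal(2.0, 0.2, n)                    # initial void ratio (uncertain)
sigma0 = 50.0                                   # initial effective stress [kPa]
dsigma = rng.normal(100.0, 20.0, n)             # added stress from cover/fill [kPa]

# Classical one-dimensional consolidation settlement.
dH = H * (Cc / (1.0 + e0)) * np.log10((sigma0 + dsigma) / sigma0)

levels = np.digitize(dH, [2.0, 4.0, 6.0]) + 1   # assumed bands -> risk levels I..IV
print(f"mean settlement = {dH.mean():.1f} m")
for lvl, name in enumerate(["I", "II", "III", "IV"], start=1):
    print(f"Level {name}: {np.mean(levels == lvl):.1%}")
```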

  8. Glacial Lake Outburst Flood Risk in Himachal Pradesh, India: An Integrative and Anticipatory Approach to Inform Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Allen, Simon; Linsbauer, Andreas; Huggel, Christian; Singh Randhawa, Surjeet

    2016-04-01

    Most research concerning the hazard from glacial lake outburst floods (GLOFs) has focused on the threat from lakes that have formed over the past century, and which continue to expand rapidly in response to recent warming of the climate system. However, attention is shifting towards the anticipation of future hazard and risk associated with new lakes that will develop as glaciers continue to retreat and dramatically different landscapes are uncovered. Nowhere will this threat be more pronounced than in the Himalaya, where the majority of the world's glaciers are found, and where the dynamics of nature interact closely with livelihoods and anthropogenic resources. Using the Indian Himalayan state of Himachal Pradesh (HP) as a case study, we combine a suite of GIS-based approaches to: 1) implement a large-scale automated GLOF risk assessment within an integrative climate risk framework that recognizes both physical and socio-economic determining factors; and 2) expand the assessment beyond the current situation, to provide early anticipation of emerging GLOF hazard as new lakes form in response to further retreat of the Himalayan glaciers. Results clearly demonstrate a significant future increase in relative GLOF hazard levels across most tehsils (administrative units) of HP, as the overall potential for GLOFs being triggered from mass movement of ice and rock avalanches increases, and as new GLOF paths affect additional land areas. Across most tehsils, the simulated increase in GLOF frequency is an order of magnitude larger than the simulated increase in GLOF-affected area, as paths from newly formed glacial lakes generally tend to converge downstream within existing flood channels. In the tehsil of Kullu, for example, we demonstrate a 7-fold increase in the probability of GLOF occurrence, and a 3-fold increase in the area affected by potential GLOF paths. In those situations where potential GLOFs from new lakes will flow primarily along existing flood paths, any adaptation measures implemented now will offer dual benefits - reducing not only the current GLOF risk, but also responding to the emerging risk anticipated for the coming decades. Such adaptation strategies (e.g. early warning systems, community preparedness, disaster response planning and land zoning) can be considered "low-regret" measures, i.e., responses that offer immediate benefits to communities now while also offering benefits over a range of possible future scenarios. Conversely, in locations where the formation of new lakes over the coming decades will create an entirely new threat, local authorities would be encouraged to consider long time scales in their climate adaptation planning. This is particularly relevant for new infrastructure developments (residential property, roads, hydropower dams, etc.) where new threats could emerge during the intended lifetime of any constructions.

  9. Hydrological risks in anthropized watersheds: modeling of hazard, vulnerability and impacts on population from south-west of Madagascar

    NASA Astrophysics Data System (ADS)

    Mamy Rakotoarisoa, Mahefa; Fleurant, Cyril; Taibi, Nuscia; Razakamanana, Théodore

    2016-04-01

    Hydrological risks, especially floods, are recurrent in the Fiherenana watershed in southwest Madagascar. The city of Toliara, which is located at the outlet of the river basin, is subjected each year to hurricane hazards and floods. The stakes are high in this part of the island. This study begins with the analysis of the hazard by collecting all existing hydro-climatic data on the catchment. It then seeks to determine trends, despite the significant lack of data, using simple statistical models (decomposition of time series). Two approaches are then used to assess the vulnerability of the city of Toliara and the surrounding villages: first, a static approach based on field surveys and GIS; second, a multi-agent-based simulation model. The first step is the mapping of a vulnerability index that combines several static criteria. This is a microscale indicator (the unit of analysis is the individual house). For each house, several vulnerability criteria are considered, such as the potential water depth, the flow rate, and the architectural typology of the building. For the second part, agent-based simulations are used to evaluate the vulnerability of homes to flooding. Agents are individual entities to which behaviours can be assigned in order to simulate a given phenomenon. The aim is not to score the house as a physical building, for example by its architectural typology or its strength; rather, the model estimates the chances that the occupants of the house escape a catastrophic flood. For this purpose, various settings and scenarios are compared; some scenarios take into account the effect of decisions made by the responsible entities (for example, informing and raising the awareness of villagers). The simulation consists of two essential parts taking place simultaneously: simulation of the rise of water and the flow, using classical hydrological functions (production and transfer functions) within a multi-agent system, and simulation of the behaviour of people facing the arriving hazard.

  10. The effect of improving task representativeness on capturing nurses’ risk assessment judgements: a comparison of written case simulations and physical simulations

    PubMed Central

    2013-01-01

    Background The validity of studies describing clinicians’ judgements based on their responses to paper cases is questionable, because - commonly used - paper case simulations only partly reflect real clinical environments. In this study we test whether paper case simulations evoke similar risk assessment judgements to the more realistic simulated patients used in high fidelity physical simulations. Methods 97 nurses (34 experienced nurses and 63 student nurses) made dichotomous assessments of risk of acute deterioration on the same 25 simulated scenarios in both paper case and physical simulation settings. Scenarios were generated from real patient cases. Measures of judgement ‘ecology’ were derived from the same case records. The relationship between nurses’ judgements, actual patient outcomes (i.e. ecological criteria), and patient characteristics were described using the methodology of judgement analysis. Logistic regression models were constructed to calculate Lens Model Equation parameters. Parameters were then compared between the modeled paper-case and physical-simulation judgements. Results Participants had significantly less achievement (ra) judging physical simulations than when judging paper cases. They used less modelable knowledge (G) with physical simulations than with paper cases, while retaining similar cognitive control and consistency on repeated patients. Respiration rate, the most important cue for predicting patient risk in the ecological model, was weighted most heavily by participants. Conclusions To the extent that accuracy in judgement analysis studies is a function of task representativeness, improving task representativeness via high fidelity physical simulations resulted in lower judgement performance in risk assessments amongst nurses when compared to paper case simulations. Lens Model statistics could prove useful when comparing different options for the design of simulations used in clinical judgement analysis. The approach outlined may be of value to those designing and evaluating clinical simulations as part of education and training strategies aimed at improving clinical judgement and reasoning. PMID:23718556

  11. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

    The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. End-to-end automation organized around sets of processes, rather than organizational units, is becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project advances an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management with a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root-cause analysis from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  12. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

    This technology project advances an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management with a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root-cause analysis from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. End-to-end automation organized around sets of processes, rather than organizational units, is becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  13. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.

  14. Development of a Geant4 application to characterise a prototype neutron detector based on three orthogonal 3He tubes inside an HDPE sphere.

    PubMed

    Gracanin, V; Guatelli, S; Prokopovich, D; Rosenfeld, A B; Berry, A

    2017-01-01

    The Bonner Sphere Spectrometer (BSS) system is a well-established technique for neutron dosimetry that involves detection of thermal neutrons within a range of hydrogenous moderators. BSS detectors are often used to perform neutron field surveys in order to determine the ambient dose equivalent H*(10) and estimate health risk to personnel. There is a potential limitation of existing neutron survey techniques, since some detectors do not consider the direction of the neutron field, which can result in overly conservative estimates of dose in neutron fields. This paper describes the development of a Geant4 simulation application to characterise a prototype neutron detector based on three orthogonal 3He tubes inside a single HDPE sphere built at the Australian Nuclear Science and Technology Organisation (ANSTO). The Geant4 simulation has been validated with respect to experimental measurements performed with an Am-Be source. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  15. The ability of individuals to assess population density influences the evolution of emigration propensity and dispersal distance.

    PubMed

    Poethke, Hans Joachim; Gros, Andreas; Hovestadt, Thomas

    2011-08-07

    We analyze the simultaneous evolution of emigration and settlement decisions for actively dispersing species differing in their ability to assess population density. Using an individual-based model we simulate dispersal as a multi-step (patch to patch) movement in a world consisting of habitat patches surrounded by a hostile matrix. Each such step is associated with the same mortality risk. Our simulations show that individuals following an informed strategy, where emigration (and settlement) probability depends on local population density, evolve a lower (natal) emigration propensity but disperse over significantly larger distances - i.e. postpone settlement longer - than individuals performing density-independent emigration. This holds especially when variation in environmental conditions is spatially correlated. Both effects can be traced to the informed individuals' ability to better exploit existing heterogeneity in reproductive chances. Yet even moderate distance-dependent dispersal costs prevent the evolution of multi-step (long-distance) dispersal, irrespective of the dispersal strategy. Copyright © 2011 Elsevier Ltd. All rights reserved.
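
    The contrast between informed (density-dependent) and uninformed (density-independent) emigration can be sketched as below. The movement rule, mortality and density values are illustrative assumptions, not the paper's individual-based model.

```python
import random

def dispersal_outcome(strategy, local_density, capacity=100,
                      step_mortality=0.1, fixed_rate=0.2, max_steps=20, seed=None):
    """Toy comparison of informed vs uninformed emigration (illustrative only).

    'informed': leave (and keep moving) only while the current patch is
    crowded relative to capacity; 'uninformed': emigrate with a fixed rate
    and settle in the first patch reached. Returns (survived, steps_moved).
    """
    rng = random.Random(seed)
    steps = 0
    density = local_density
    while steps < max_steps:
        if strategy == "informed":
            p_move = max(0.0, (density - capacity) / capacity)  # density-dependent
        else:
            p_move = fixed_rate if steps == 0 else 0.0          # one-shot decision
        if rng.random() >= p_move:
            return True, steps                                  # settled, alive
        if rng.random() < step_mortality:
            return False, steps                                 # died in the matrix
        steps += 1
        density = rng.uniform(0.5, 1.5) * capacity              # next patch density
    return True, steps

# Average survival and distance over many toy individuals
runs = [dispersal_outcome("informed", 150, seed=i) for i in range(1000)]
print(sum(ok for ok, _ in runs) / 1000, sum(s for _, s in runs) / 1000)
```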

  16. Seismic analysis of offshore wind turbines on bottom-fixed support structures.

    PubMed

    Alati, Natale; Failla, Giuseppe; Arena, Felice

    2015-02-28

    This study investigates the seismic response of a horizontal axis wind turbine on two bottom-fixed support structures for transitional water depths (30-60 m), a tripod and a jacket, both resting on pile foundations. Fully coupled, nonlinear time-domain simulations on full system models are carried out under combined wind-wave-earthquake loadings, for different load cases, considering fixed and flexible foundation models. It is shown that earthquake loading may cause a significant increase of stress resultant demands, even for moderate peak ground accelerations, and that fully coupled nonlinear time-domain simulations on full system models are essential to capture relevant information on the moment demand in the rotor blades, which cannot be predicted by analyses on simplified models allowed by existing standards. A comparison with some typical design load cases substantiates the need for an accurate seismic assessment in sites at risk from earthquakes. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  17. Product component genealogy modeling and field-failure prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Caleb; Hong, Yili; Meeker, William Q.

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.

  18. Product component genealogy modeling and field-failure prediction

    DOE PAGES

    King, Caleb; Hong, Yili; Meeker, William Q.

    2016-04-13

    Many industrial products consist of multiple components that are necessary for system operation. There is an abundance of literature on modeling the lifetime of such components through competing risks models. During the life-cycle of a product, it is common for there to be incremental design changes to improve reliability, to reduce costs, or due to changes in availability of certain part numbers. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating this information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, thus yielding more accurate field-failure predictions. This paper presents methods for estimating parameters and predictions for this generational model and a comparison with existing methods through the use of simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures.

  19. Enterprise Networks for Competences Exchange: A Simulation Model

    NASA Astrophysics Data System (ADS)

    Remondino, Marco; Pironti, Marco; Pisano, Paola

    A business process is a set of logically related tasks performed to achieve a defined business outcome, and process innovation is concerned with improving such organizational processes. Process innovation can happen at various levels: incremental improvement, redesign of existing processes, or creation of new processes. The knowledge behind process innovation can be shared, acquired, changed and increased by the enterprises inside a network. An enterprise can decide to exploit innovative processes it owns, thus potentially gaining competitive advantage but risking, in turn, that other players reach the same technological level. Or it can decide to share them in exchange for other competencies or money. These activities can be the basis for network formation and/or affect the topology of an existing network. In this work an agent-based model (E3) is introduced, aiming to explore how a process innovation can facilitate network formation, affect its topology, induce new players to enter the market, and spread through the network by being shared or developed by new players.

  20. Simulating the influence of snow surface processes on soil moisture dynamics and streamflow generation in an alpine catchment

    NASA Astrophysics Data System (ADS)

    Wever, Nander; Comola, Francesco; Bavay, Mathias; Lehning, Michael

    2017-08-01

    The assessment of flood risks in alpine, snow-covered catchments requires an understanding of the linkage between the snow cover, soil and discharge in the stream network. Here, we apply the comprehensive, distributed model Alpine3D to investigate the role of soil moisture in the predisposition of the Dischma catchment in Switzerland to high flows from rainfall and snowmelt. The recently updated soil module of the physics-based multilayer snow cover model SNOWPACK, which solves the surface energy and mass balance in Alpine3D, is verified against soil moisture measurements at seven sites and various depths inside and in close proximity to the Dischma catchment. Measurements and simulations in such terrain are difficult, and consequently soil moisture was simulated with varying degrees of success. Differences between simulated and measured soil moisture mainly arise from an overestimation of soil freezing and the absence of a groundwater description in the Alpine3D model; both were found to influence the soil moisture measurements. Using the Alpine3D simulation as the surface scheme for a spatially explicit hydrologic response model that uses a travel time distribution approach for interflow and baseflow, streamflow simulations were performed for the discharge from the catchment. The streamflow simulations provided a closer agreement with observed streamflow when driving the hydrologic response model with soil water fluxes at 30 cm depth in the Alpine3D model. Performance decreased when using the 2 cm soil water flux, thereby mostly ignoring soil processes. This illustrates that the role of soil moisture is important to take into account when understanding the relationship between snowpack runoff or rainfall and catchment discharge in high alpine terrain. However, using the soil water flux at 60 cm depth to drive the hydrologic response model also decreased its performance, indicating that an optimal soil depth to include in surface simulations exists and that the runoff dynamics are controlled by only a shallow soil layer. Runoff coefficients (i.e. the ratio of discharge to rainfall) based on measurements for high rainfall and snowmelt events were found to depend on the simulated initial soil moisture state at the onset of an event, further illustrating the important role of soil moisture for the hydrological processes in the catchment. The runoff coefficients using simulated discharge were found to reproduce this dependency, which shows that the Alpine3D model framework can be successfully applied to assess the predisposition of the catchment to flood risks from both snowmelt and rainfall events.
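
    The travel-time-distribution idea used to route soil water flux to the stream can be illustrated with a simple convolution. An exponential unit hydrograph is assumed here purely for illustration and is not the response model used in the study; all numbers are placeholders.

```python
import numpy as np

def route_runoff(soil_flux_mm_h, mean_travel_time_h=24.0, dt_h=1.0):
    """Route soil water flux to the outlet with an assumed exponential
    travel-time distribution (a stand-in for the paper's response model)."""
    t = np.arange(0, 10 * mean_travel_time_h, dt_h)
    h = np.exp(-t / mean_travel_time_h) / mean_travel_time_h   # unit hydrograph
    h /= h.sum()                                               # conserve mass
    return np.convolve(soil_flux_mm_h, h)[: len(soil_flux_mm_h)]

# Example: a 6-hour pulse of snowmelt-driven percolation at 30 cm depth
flux = np.zeros(240); flux[10:16] = 5.0                        # mm/h, illustrative
discharge = route_runoff(flux)
print(round(discharge.max(), 3), "mm/h peak at hour", int(discharge.argmax()))
```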

  1. High-Resolution WRF Forecasts of Lightning Threat

    NASA Technical Reports Server (NTRS)

    Goodman, S. J.; McCaul, E. W., Jr.; LaCasse, K.

    2007-01-01

    Tropical Rainfall Measuring Mission (TRMM) lightning and precipitation observations have confirmed the existence of a robust relationship between lightning flash rates and the amount of large precipitating ice hydrometeors in storms. This relationship is exploited, in conjunction with the capabilities of the Weather Research and Forecast (WRF) model, to forecast the threat of lightning from convective storms using the output fields from the model forecasts. The simulated vertical flux of graupel at -15C is used in this study as a proxy for charge separation processes and their associated lightning risk. Initial experiments using 6-h simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. The WRF has been initialized on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity and reflectivity fields, and METAR and ACARS data. An array of subjective and objective statistical metrics is employed to document the utility of the WRF forecasts. The simulation results are also compared to other more traditional means of forecasting convective storms, such as those based on inspection of the convective available potential energy field.

  2. Bivariate discrete beta Kernel graduation of mortality data.

    PubMed

    Mazza, Angelo; Punzo, Antonio

    2015-07-01

    Various parametric/nonparametric techniques have been proposed in the literature to graduate mortality data as a function of age. Nonparametric approaches, such as kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidth selection. Using simulation studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make the simulations realistic, a bivariate dataset, based on probabilities of dying recorded for US males, is used. The simulations confirmed the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.
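
    For orientation, a minimal exposure-weighted bivariate kernel smoother over age and calendar year is sketched below. It uses the Epanechnikov kernel mentioned as a competitor, not the discrete beta kernel itself, and all bandwidths are placeholder values.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, zero outside |u| <= 1."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def graduate(ages, years, deaths, exposures, bw_age=5.0, bw_year=5.0):
    """Exposure-weighted Nadaraya-Watson graduation of crude rates q = D/E
    over age and calendar year (sketch; inputs are 1-D arrays on a grid)."""
    crude = deaths / exposures
    smoothed = np.empty_like(crude, dtype=float)
    for i, (a0, y0) in enumerate(zip(ages, years)):
        w = (epanechnikov((ages - a0) / bw_age) *
             epanechnikov((years - y0) / bw_year) * exposures)
        smoothed[i] = np.sum(w * crude) / np.sum(w)
    return smoothed
```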

  3. Computational Planning in Facial Surgery.

    PubMed

    Zachow, Stefan

    2015-10-01

    This article reflects the research of the last two decades in computational planning for cranio-maxillofacial surgery. Model-guided and computer-assisted surgery planning has tremendously developed due to ever increasing computational capabilities. Simulators for education, planning, and training of surgery are often compared with flight simulators, where maneuvers are also trained to reduce a possible risk of failure. Meanwhile, digital patient models can be derived from medical image data with astonishing accuracy and thus can serve for model surgery to derive a surgical template model that represents the envisaged result. Computerized surgical planning approaches, however, are often still explorative, meaning that a surgeon tries to find a therapeutic concept based on his or her expertise using computational tools that are mimicking real procedures. Future perspectives of an improved computerized planning may be that surgical objectives will be generated algorithmically by employing mathematical modeling, simulation, and optimization techniques. Planning systems thus act as intelligent decision support systems. However, surgeons can still use the existing tools to vary the proposed approach, but they mainly focus on how to transfer objectives into reality. Such a development may result in a paradigm shift for future surgery planning. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  4. Improving the home health acute-care hospitalization quality measure.

    PubMed

    Schade, Charles P; Brehm, John G

    2010-06-01

    (1) To demonstrate average length of service (ALOS) bias in the currently used acute-care hospitalization (ACH) home health quality measure, limiting comparability across agencies, and (2) to propose alternative ACH measures. Secondary analysis of Medicare home health service data 2004-2007; convenience sample of Medicare fee-for-service hospital discharges. Cross-sectional analysis and patient-level simulation. We aggregated outcome and ALOS data from 2,347 larger Medicare-certified home health agencies (HHAs) in the United States between 2004 and 2007, and calculated risk-adjusted monthly ACH rates. We used multiple regression to identify agency characteristics associated with ACH. We simulated ACH during and immediately after home health care using patient and agency characteristics similar to those in the actual data, comparing the existing measure with alternative fixed-interval measures. Of the agency characteristics studied, ALOS had by far the highest partial correlation with the current ACH measure (r² = 0.218, p < .0001). We replicated the correlation between ACH and ALOS in the patient-level simulation. We found no correlation between ALOS and the alternative measures. Alternative measures do not exhibit ALOS bias and would be appropriate for comparing HHA ACH rates with one another or over time.

  5. Risk of introducing exotic fruit flies, Ceratitis capitata, Ceratitis cosyra, and Ceratitis rosa (Diptera: Tephritidae), into southern China.

    PubMed

    Li, Baini; Ma, Jun; Hu, Xuenan; Liu, Haijun; Wu, Jiajiao; Chen, Hongjun; Zhang, Runjie

    2010-08-01

    Exotic fruit flies (Ceratitis spp.) are often serious agricultural pests. Here, we used pathway analysis and Monte Carlo simulations to assess the risk of introduction of Ceratitis capitata (Wiedemann), Ceratitis cosyra (Walker), and Ceratitis rosa Karsch into southern China with fruit consignments and incoming travelers. Historical data, expert opinions, relevant literature, and archives were used to set appropriate parameters in the pathway analysis. Based on the ongoing quarantine/inspection strategies of China, as well as the interception records, we estimated the annual number of each fruit fly species entering Guangdong province undetected with commercially imported fruit, and the associated risk. We also estimated the gross number of pests arriving at Guangdong ports with incoming travelers and the associated risk. Sensitivity analysis was also performed to test the impact of parameter changes and to assess how the risk could be reduced. Results showed that the risk of introduction of the three fruit fly species into southern China with fruit consignments, which are mostly transported by ship, exists but is relatively low. In contrast, the risk of introduction with incoming travelers is high and hence deserves intensive attention. Sensitivity analysis indicated that either ensuring all shipments meet current phytosanitary requirements or increasing the proportion of fruit imports sampled for inspection could substantially reduce the risk associated with commercial imports. Sensitivity analysis also provided justification for banning importation of fresh fruit by international travelers. Thus, inspection and quarantine in conjunction with intensive detection are important mitigation measures to reduce the risk of Ceratitis spp. being introduced into China.
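
    A pathway-analysis Monte Carlo of this kind can be sketched in a few lines. Every distribution and parameter below is an illustrative placeholder, not a value elicited in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def annual_undetected_entries(n_iter=100_000):
    """Draw the yearly number of flies entering undetected along one pathway
    (commercial fruit consignments); all distributions are assumed."""
    consignments = rng.poisson(5_000, n_iter)          # fruit consignments / year
    p_infested = rng.beta(2, 200, n_iter)              # proportion infested
    flies_per = rng.gamma(2.0, 5.0, n_iter)            # flies per infested lot
    p_detect = rng.beta(20, 10, n_iter)                # inspection detection prob.
    return consignments * p_infested * flies_per * (1 - p_detect)

entries = annual_undetected_entries()
print(np.percentile(entries, [5, 50, 95]))             # risk summary
```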

  6. Human In Silico Drug Trials Demonstrate Higher Accuracy than Animal Models in Predicting Clinical Pro-Arrhythmic Cardiotoxicity.

    PubMed

    Passini, Elisa; Britton, Oliver J; Lu, Hua Rong; Rohrbacher, Jutta; Hermans, An N; Gallacher, David J; Greig, Robert J H; Bueno-Orovio, Alfonso; Rodriguez, Blanca

    2017-01-01

    Early prediction of cardiotoxicity is critical for drug development. Current animal models raise ethical and translational questions, and have limited accuracy in clinical risk prediction. Human-based computer models constitute a fast, cheap and potentially effective alternative to experimental assays, also facilitating translation to human. Key challenges include consideration of inter-cellular variability in drug responses and integration of computational and experimental methods in safety pharmacology. Our aim is to evaluate the ability of in silico drug trials in populations of human action potential (AP) models to predict clinical risk of drug-induced arrhythmias based on ion channel information, and to compare simulation results against experimental assays commonly used for drug testing. A control population of 1,213 human ventricular AP models in agreement with experimental recordings was constructed. In silico drug trials were performed for 62 reference compounds at multiple concentrations, using pore-block drug models (IC50/Hill coefficient). Drug-induced changes in AP biomarkers were quantified, together with occurrence of repolarization/depolarization abnormalities. Simulation results were used to predict clinical risk based on reports of Torsade de Pointes arrhythmias, and further evaluated in a subset of compounds through comparison with electrocardiograms from rabbit wedge preparations and Ca2+-transient recordings in human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). Drug-induced changes in silico vary in magnitude depending on the specific ionic profile of each model in the population, thus making it possible to identify cell sub-populations at higher risk of developing abnormal AP phenotypes. Models with low repolarization reserve (increased Ca2+/late Na+ currents and Na+/Ca2+ exchanger, reduced Na+/K+ pump) are highly vulnerable to drug-induced repolarization abnormalities, while those with reduced inward current density (fast/late Na+ and Ca2+ currents) exhibit high susceptibility to depolarization abnormalities. Repolarization abnormalities in silico predict clinical risk for all compounds with 89% accuracy. Drug-induced changes in biomarkers are in overall agreement across different assays: in silico AP duration changes reflect the ones observed in rabbit QT interval and hiPS-CMs Ca2+ transient, and simulated upstroke velocity captures variations in rabbit QRS complex. Our results demonstrate that human in silico drug trials constitute a powerful methodology for prediction of clinical pro-arrhythmic cardiotoxicity, ready for integration in the existing drug safety assessment pipelines.
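
    The pore-block drug model mentioned above reduces a channel's conductance by a simple IC50/Hill-coefficient factor. A minimal sketch, with units and example values assumed for illustration:

```python
def pore_block_scaling(drug_conc, ic50, hill=1.0):
    """Fraction of channel conductance remaining under the simple pore-block
    model g_drug / g_control = 1 / (1 + ([D]/IC50)^h), used to rescale ionic
    currents in AP models (drug_conc and ic50 in the same units)."""
    return 1.0 / (1.0 + (drug_conc / ic50) ** hill)

# Example: a channel blocked at 3x its IC50 with an assumed Hill coefficient of 1
print(pore_block_scaling(drug_conc=3.0, ic50=1.0))   # ~0.25 of conductance left
```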

  7. Assessing urban potential flooding risk and identifying effective risk-reduction measures.

    PubMed

    Cherqui, Frédéric; Belmeziti, Ali; Granger, Damien; Sourdril, Antoine; Le Gauffre, Pascal

    2015-05-01

    Flood protection is one of the traditional functions of any drainage system, and it remains a major issue in many cities because of its economic and health impacts. Heavy rain flooding has been well studied, and existing simulation software can be used to predict and improve the level of protection. However, simulating minor flooding remains highly complex, owing to the numerous possible causes related to operational deficiencies or negligent behaviour. According to the literature, causes of blockages vary widely from one case to another: it is impossible to provide utility managers with effective recommendations on how to improve the level of protection. It is therefore vital to analyse each context in order to define an appropriate strategy. Here we propose a method to represent and assess the flooding risk, using GIS and data gathered during operation and maintenance. Our method also identifies potential management responses. The approach proposed aims to provide decision makers with clear and comprehensible information. Our method has been successfully applied to the Urban Community of Bordeaux (France) on 4895 interventions related to flooding recorded during the 2009-2011 period. Results have shown the relative importance of different issues, such as human behaviour (grease, etc.) or operational deficiencies (roots, etc.), and have led to the identification of corrective and proactive measures. This study also confirms that blockages are not always directly due to the network itself and its deterioration. Many causes depend on environmental and operating conditions on the network and often require collaboration between municipal departments in charge of roads, green spaces, etc. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. SINERGIA laparoscopic virtual reality simulator: didactic design and technical development.

    PubMed

    Lamata, Pablo; Gómez, Enrique J; Sánchez-Margallo, Francisco M; López, Oscar; Monserrat, Carlos; García, Verónica; Alberola, Carlos; Florido, Miguel Angel Rodríguez; Ruiz, Juan; Usón, Jesús

    2007-03-01

    VR laparoscopic simulators have demonstrated their validity in recent studies, and research should now be directed towards high training effectiveness and efficacy. To this end, an insight into simulators' didactic design and technical development is provided by describing the methodology followed in building the SINERGIA simulator. It starts from a clear analysis of training needs driven by a surgical training curriculum. Existing solutions and validation studies are an important reference for the definition of specifications, which are described with a suitable use of simulation technologies. Five new didactic exercises are proposed to train some of the basic laparoscopic skills. Simulator construction has required existing algorithms as well as the development of a particle-based biomechanical model, called PARSYS, and a collision handling solution based on a multi-point strategy. The resulting VR laparoscopic simulator includes new exercises and enhanced simulation technologies, and is finding very good acceptance among surgeons.

  9. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.

  10. Monte Carlo simulation of single accident airport risk profile

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A computer simulation model was developed for estimating the potential economic impacts of a carbon fiber release upon facilities within an 80 kilometer radius of a major airport. The model simulated the possible range of release conditions and the resulting dispersion of the carbon fibers. Each iteration of the model generated a specific release scenario, which would cause a specific amount of dollar loss to the surrounding community. By repeated iterations, a risk profile was generated, showing the probability distribution of losses from one accident. Using accident probability estimates, the risk profile for annual losses was derived. The mechanics of the simulation model, the required input data, and the risk profiles generated for the 26 large hub airports are described.
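
    A minimal sketch of the iteration-and-risk-profile logic described above; the release, dispersion and damage distributions, and the accident probability, are all assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def single_accident_losses(n_iter=50_000):
    """Each iteration draws one release scenario and the resulting community
    loss in dollars (all distributions are illustrative placeholders)."""
    fibers_released = rng.lognormal(mean=2.0, sigma=1.0, size=n_iter)    # kg
    dispersion = rng.uniform(0.1, 1.0, n_iter)        # fraction reaching facilities
    damage_per_kg = rng.gamma(2.0, 10_000.0, n_iter)  # dollars per exposed kg
    return fibers_released * dispersion * damage_per_kg

losses = single_accident_losses()
annual_accident_prob = 1e-3                           # assumed accident probability
print("95th percentile loss per accident:", np.percentile(losses, 95))
print("P(loss > $1M | accident):", np.mean(losses > 1e6))
print("Expected annual loss:", annual_accident_prob * losses.mean())
```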

  11. Facilitating hydrological data analysis workflows in R: the RHydro package

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges of the RHydro package, including integration with big data technologies, web technologies, and emerging data models in hydrology.

  12. An Entropy Approach to Disclosure Risk Assessment: Lessons from Real Applications and Simulated Domains

    PubMed Central

    Airoldi, Edoardo M.; Bai, Xue; Malin, Bradley A.

    2011-01-01

    We live in an increasingly mobile world, which leads to the duplication of information across domains. Though organizations attempt to obscure the identities of their constituents when sharing information for worthwhile purposes, such as basic research, the uncoordinated nature of such environments can lead to privacy vulnerabilities. For instance, disparate healthcare providers can collect information on the same patient. Federal policy requires that such providers share “de-identified” sensitive data, such as biomedical (e.g., clinical and genomic) records. But at the same time, such providers can share identified information, devoid of sensitive biomedical data, for administrative functions. On a provider-by-provider basis, the biomedical and identified records appear unrelated; however, links can be established when multiple providers’ databases are studied jointly. The problem, known as trail disclosure, is a generalized phenomenon and occurs because an individual’s location access pattern can be matched across the shared databases. Due to technical and legal constraints, it is often difficult to coordinate between providers and thus it is critical to assess the disclosure risk in distributed environments, so that we can develop techniques to mitigate such risks. Research on privacy protection has so far focused on developing technologies to suppress or encrypt identifiers associated with sensitive information. There is a growing body of work on the formal assessment of the disclosure risk of database entries in publicly shared databases, but less attention has been paid to the distributed setting. In this research, we review the trail disclosure problem in several domains with known vulnerabilities and show that disclosure risk is influenced by the distribution of how people visit service providers. Based on empirical evidence, we propose an entropy metric for assessing such risk in shared databases prior to their release. This metric assesses risk by leveraging the statistical characteristics of a visit distribution, as opposed to person-level data. It is computationally efficient and superior to existing risk assessment methods, which rely on ad hoc assessments that are often computationally expensive and unreliable. We evaluate our approach on a range of location access patterns in simulated environments. Our results demonstrate that the approach is effective at estimating trail disclosure risks and that the amount of self-information contained in a distributed system is one of the main driving factors. PMID:21647242
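
    The entropy of a provider-visit distribution, which the proposed metric builds on, can be computed directly from visit counts. The sketch below is illustrative only and is not the authors' full risk metric.

```python
import numpy as np

def visit_entropy(visit_counts):
    """Shannon entropy (bits) of a provider-visit distribution; the paper
    relates trail disclosure risk to the self-information of this
    distribution (this function computes only the entropy itself)."""
    p = np.asarray(visit_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

# Example: 1000 patients spread evenly over providers vs concentrated at one
print(visit_entropy([250, 250, 250, 250]), visit_entropy([970, 10, 10, 10]))
```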

  13. Risk of portfolio with simulated returns based on copula model

    NASA Astrophysics Data System (ADS)

    Razak, Ruzanna Ab; Ismail, Noriszura

    2015-02-01

    The commonly used tool for measuring the risk of a portfolio with equally weighted stocks is the variance-covariance method. Under extreme circumstances, this method leads to significant underestimation of actual risk due to its assumption of multivariate normality for the joint distribution of stocks. The purpose of this research is to compare the actual risk of a portfolio with the simulated risk of a portfolio in which the joint distribution of two return series is predetermined. The data used are daily stock prices from the ASEAN market for the period January 2000 to December 2012. The copula approach is applied to capture the time-varying dependence among the return series. The results show that the chosen copula families are not suitable to represent the dependence structures of each bivariate return series, except for the Philippines-Thailand pair, for which the t copula distribution appears to be the appropriate choice to depict its dependence. Assuming that the t copula distribution is the joint distribution of each paired series, simulated returns are generated and value-at-risk (VaR) is then applied to evaluate the risk of each portfolio consisting of two simulated return series. The VaR estimates were found to be symmetrical due to the simulation of returns via the elliptical copula-GARCH approach. By comparison, it is found that the actual risks are underestimated for all pairs of portfolios except for Philippines-Thailand. This study shows that disregarding the non-normal dependence structure of two series results in underestimation of the actual risk of the portfolio.
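
    A hedged sketch of the simulation step: draw dependent uniforms from a t copula, map them through assumed normal marginals (the paper fits GARCH marginals), and read off the portfolio VaR. The correlation, degrees of freedom and volatilities below are placeholders, not estimates from the ASEAN data.

```python
import numpy as np
from scipy import stats

def simulate_t_copula_var(n=100_000, rho=0.6, df=5, alpha=0.05,
                          mu=(0.0, 0.0), sigma=(0.012, 0.015), seed=7):
    """One-day VaR of an equally weighted two-stock portfolio whose
    dependence follows a t copula (all parameters are illustrative)."""
    rng = np.random.default_rng(seed)
    corr = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], corr, size=n)
    w = rng.chisquare(df, size=n) / df
    t_samples = z / np.sqrt(w)[:, None]            # multivariate t draws
    u = stats.t.cdf(t_samples, df)                 # copula (uniform) margins
    returns = stats.norm.ppf(u) * np.array(sigma) + np.array(mu)
    portfolio = returns.mean(axis=1)               # equally weighted portfolio
    return -np.quantile(portfolio, alpha)          # 95% VaR as a positive loss

print(simulate_t_copula_var())
```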

  14. Assessment and improvement of biotransfer models to cow's milk and beef used in exposure assessment tools for organic pollutants.

    PubMed

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2015-11-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models widely used in regulatory assessment. New regressions based on the simulated rate of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants. Copyright © 2015. Published by Elsevier Ltd.

  15. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    The analysis of normal traffic flow usually relies on static or dynamic models based on fluid mechanics for numerical analysis. However, this approach involves extensive modelling and data handling, and its accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics and computer technology, and it has been widely applied in various domains such as engineering. Based on existing traffic flow theory, ITS, and developments in FEM, an FEM-based simulation approach to the problems existing in traffic flow analysis is put forward. Based on this theory and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analysed with fluid mechanics and dynamics. The heavy data processing burden of manual modelling and numerical analysis is thereby addressed, and the realism of the simulation is enhanced.

  16. Exploration of Force Transition in Stability Operations Using Multi-Agent Simulation

    DTIC Science & Technology

    2006-09-01

    Force transition in stability operations is explored by weighing risks, including mission failure risk, and time in the context of the operational threat environment. The Pythagoras multi-agent simulation and data farming techniques are used to investigate force levels in this context. Subject terms: Stability Operations, Peace Operations, Data Farming, Pythagoras, Agent-Based Model, Multi-Agent Simulation.

  17. 75 FR 68392 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... Numerical Simulations Risk Management Methodology, November 1, 2010. I. Introduction. On August 25, 2010, The Options Clearing Corporation (OCC) proposed a rule change concerning its ... Analysis and Numerical Simulations ("STANS") risk management methodology. The rule change alters ... collateral within the STANS Monte Carlo simulations. OCC believes the approach currently used to ...

  18. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem being addressed, but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large amount of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution that is being evaluated to address this computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
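
    As an illustration of the surrogate idea, the sketch below trains a Gaussian-process surrogate on a handful of runs of a stand-in "expensive" function. The physics and parameter names are invented placeholders, not RISMC models or codes.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in for an expensive simulation code: a scalar figure of merit as a
# function of two input parameters. Purely illustrative, not real physics.
def expensive_simulation(x):
    power, recovery_time = x
    return 600.0 + 4.0 * power + 12.0 * np.sqrt(recovery_time)

# A handful of full-code runs used as training data for the surrogate
X_train = np.array([[p, t] for p in (80, 100, 120) for t in (1, 4, 9, 16)])
y_train = np.array([expensive_simulation(x) for x in X_train])

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=[20.0, 5.0]),
                                     normalize_y=True).fit(X_train, y_train)

# The surrogate now answers new queries almost instantly instead of via code runs
mean, std = surrogate.predict(np.array([[110.0, 6.0]]), return_std=True)
print(float(mean[0]), float(std[0]))
```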

  19. Use of High-Resolution WRF Simulations to Forecast Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., Jr.; LaCasse, K.; Goodman, S. J.; Cecil, D. J.

    2008-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors aloft in storms. This relationship is exploited, in conjunction with the capabilities of cloud-resolving forecast models such as WRF, to forecast explicitly the threat of lightning from convective storms using selected output fields from the model forecasts. The simulated vertical flux of graupel at -15C and the shape of the simulated reflectivity profile are tested in this study as proxies for charge separation processes and their associated lightning risk. Our lightning forecast method differs from others in that it is entirely based on high-resolution simulation output, without reliance on any climatological data. Short (6-8 h) simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity fields, and METAR and ACARS data yields satisfactory simulations. Analyses of the lightning threat fields suggest that both the graupel flux and reflectivity profile approaches, when properly calibrated, can yield reasonable lightning threat forecasts, although an ensemble approach is probably desirable in order to reduce the tendency for misplacement of modeled storms to hurt the accuracy of the forecasts. Our lightning threat forecasts are also compared to other more traditional means of forecasting thunderstorms, such as those based on inspection of the convective available potential energy field.

  20. Low Cost Simulator for Heart Surgery Training

    PubMed Central

    Silva, Roberto Rocha e; Lourenção, Artur; Goncharov, Maxim; Jatene, Fabio B.

    2016-01-01

    Objective: To introduce a low-cost, easy-to-obtain simulator, without biological material, so that any institution may offer extensive cardiovascular surgery training both in a hospital setting and at home without a large budget. Methods: A transparent plastic box is placed in a wooden frame, held by the edges using elastic bands, with the bottom turned upwards, where an oval opening is made, "simulating" a thoracotomy. For basic exercises on the aorta, the model presented by our service at the 2015 Brazilian Congress of Cardiovascular Surgery is used: a silicone ice tray, on which one can train aortic purse-string sutures, aortotomy, aortorrhaphy and proximal and distal anastomoses. Simulators for the training of valve replacement and valvoplasty, atrial septal defect repair and aortic diseases were added. These simulators are based on sewage pipes obtained in construction material stores, and the silicone trays and ethyl vinyl acetate tissue were obtained in utility stores, all at a very low cost. Results: The models were manufactured using inert materials easily found in regular stores and do not present a contamination risk. They may be used in any environment and may be stored without any difficulty. This training enabled young surgeons to familiarize themselves with and train different surgical techniques, including procedures for aortic diseases. In a subjective assessment, these surgeons reported that the training period led to improved surgical technique in the surgical field. Conclusion: The model described in this protocol is effective and low-cost when compared to existing simulators, enabling a large array of cardiovascular surgery training. PMID:28076623

  1. Mesothelioma and asbestos.

    PubMed

    Gibbs, Graham W; Berry, Geoffrey

    2008-10-01

    The current state of knowledge concerning mesothelioma risk estimates is reviewed. Estimates of the risk of mesothelioma exist for the commercial asbestos fiber types chrysotile, amosite and crocidolite. Data also exist on which to assess risks for winchite (sodic tremolite) and anthophyllite asbestos. Uncertainty in estimates is primarily related to limitations in measurements of exposure. Differences in the dimensions of the various fiber types and of the same fiber types at different stages of processing add a further complication. Nevertheless, in practical terms, crocidolite presents the highest asbestos-related mesothelioma risk. The risk associated with sodic tremolite (winchite) appears to be similar. In chrysotile miners and millers, the mesothelioma risk has been linked with exposure to asbestiform tremolite. Exposure to chrysotile in a pure form seems likely to present a very low risk, if any, of mesothelioma. While the majority of mesothelial tumors result from exposure to the asbestos minerals, there are other well-established and suspected etiological agents. While a practical threshold seems to exist for exposure to chrysotile, it is unlikely to exist for the amphibole asbestos minerals, especially for crocidolite. To date there is no indication of an increased risk of mesothelioma resulting from non-commercial fiber exposure in the taconite industry.

  2. Building Inventory Database on the Urban Scale Using GIS for Earthquake Risk Assessment

    NASA Astrophysics Data System (ADS)

    Kaplan, O.; Avdan, U.; Guney, Y.; Helvaci, C.

    2016-12-01

    In most developing countries, the majority of existing buildings are not safe against earthquakes. Before a devastating earthquake occurs, existing buildings need to be assessed and the vulnerable ones must be identified. Determining the seismic performance of existing buildings, which usually involves collecting the attributes of the buildings, performing the analysis and the necessary queries, and producing the result maps, is a hard and complicated procedure that can be simplified with a Geographic Information System (GIS). The aim of this study is to produce a building inventory database using GIS for assessing the earthquake risk of existing buildings. In this paper, a building inventory database for 310 buildings located in Eskisehir, Turkey, was produced in order to assess the earthquake risk of the buildings. The results from this study show that 26% of the buildings have high earthquake risk, 33% have medium earthquake risk and 41% have low earthquake risk. The produced building inventory database can be very useful, especially for governments, in dealing with the problem of identifying seismically vulnerable buildings in large existing building stocks. With the help of such methods, the identification of buildings that may collapse and cause life and property losses during a possible future earthquake will be very quick, cheap and reliable.

  3. Disease dynamics during wildlife translocations: disruptions to the host population and potential consequences for transmission in desert tortoise contact networks

    USGS Publications Warehouse

    Aiello, Christina M.; Nussear, Kenneth E.; Walde, Andrew D.; Esque, Todd C.; Emblidge, Patrick G.; Sah, Pratha; Bansal, S.; Hudson, Peter J.

    2014-01-01

    Wildlife managers consider animal translocation a means of increasing the viability of a local population. However, augmentation may disrupt existing resident disease dynamics and initiate an outbreak that would effectively offset any advantages the translocation may have achieved. This paper examines fundamental concepts of disease ecology and identifies the conditions that will increase the likelihood of a disease outbreak following translocation. We highlight the importance of susceptibility to infection, population size and population connectivity – a characteristic likely affected by translocation but not often considered in risk assessments – in estimating outbreak risk due to translocation. We then explore these features in a species of conservation concern often translocated in the presence of infectious disease, the Mojave Desert tortoise, and use data from experimental tortoise translocations to detect changes in population connectivity that may influence pathogen transmission. Preliminary analyses comparing contact networks inferred from spatial data at control and translocation plots and infection simulation results through these networks suggest increased outbreak risk following translocation due to dispersal-driven changes in contact frequency and network structure. We outline future research goals to test these concepts and aid managers in designing effective risk assessment and intervention strategies that will improve translocation success.

  4. Addressing Risk in the Valuation of Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Hammerstrom, Donald J.; Woodward, James T.

    2017-06-26

    Valuation is a mechanism by which the potential worth of a transaction between two or more parties can be evaluated. Examples include the valuation of transactive energy systems such as electric power systems and building energy systems. Uncertainties can manifest while exercising a valuation methodology in the form of lack of knowledge, or be inherently embedded in the valuation process. Uncertainty can also exist in the temporal dimension while planning for long-term growth. This paper discusses risk considerations associated with valuation studies in support of decision-making in the presence of such uncertainties. It is often important to have foresight of uncertain entities that can impact real-world deployments, such as the comparison or ranking of two valuation studies to determine cost-benefit impacts to multiple stakeholders. The research proposes to address this challenge through simulation and sensitivity analyses to support ‘what-if’ analysis of well-defined future scenarios. This paper describes the foundational value of diagrammatic representation techniques such as the unified modeling language for understanding the implications of not addressing some of the risk elements encountered during the valuation process. The paper includes examples from generation resource adequacy assessment studies (e.g. loss of load) to illustrate the principles of risk in valuation.

  5. Tempo in electronic gaming machines affects behavior among at-risk gamblers.

    PubMed

    Mentzoni, Rune A; Laberg, Jon Christian; Brunborg, Geir Scott; Molde, Helge; Pallesen, Ståle

    2012-09-01

    Background and aims: Electronic gaming machines (EGM) may be a particularly addictive form of gambling, and gambling speed is believed to contribute to the addictive potential of such machines. The aim of the current study was to generate more knowledge concerning speed as a structural characteristic in gambling by comparing the effects of three different bet-to-outcome intervals (BOI) on gamblers' bet sizes, game evaluations, and illusion of control during gambling on a computer-simulated slot machine. Furthermore, we investigated whether problem gambling moderates effects of BOI on gambling behavior and cognitions. Methods: 62 participants played a computerized slot machine with either fast (400 ms), medium (1700 ms) or slow (3000 ms) BOI. The SOGS-R was used to measure pre-existing gambling problems. Mean bet size, game evaluations, and illusion of control comprised the dependent variables. Results: Gambling speed had no overall effect on mean bet size, game evaluations, or illusion of control, but in the 400 ms condition, at-risk gamblers (SOGS-R score > 0) employed higher bet sizes compared to no-risk (SOGS-R score = 0) gamblers. Conclusions: The findings corroborate and elaborate on previous studies and indicate that restrictions on gambling speed may serve as a harm-reducing measure for at-risk gamblers.

  6. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    NASA Astrophysics Data System (ADS)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty estimates. The method is demonstrated on a Danish field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the co-simulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
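    The decoupled Monte Carlo idea behind the method can be illustrated with a short sketch: independent realizations of flow and concentration across a discretized control plane are multiplied cell by cell and summed, and the ensemble of sums gives the mass discharge distribution. The geometry, distributions, and parameter values below are illustrative assumptions, not the paper's calibrated geostatistical model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative control-plane discretization (not the site geometry)
n_real, n_cells = 5000, 200
cell_area = 0.25  # m^2 per cell (assumed)

# Hypothetical stand-ins for the geostatistical realizations:
# specific discharge q [m/d] and concentration c [g/m^3], both lognormal here.
q = rng.lognormal(mean=np.log(0.05), sigma=0.6, size=(n_real, n_cells))
c = rng.lognormal(mean=np.log(2.0), sigma=1.0, size=(n_real, n_cells))

# Mass discharge per realization: sum over cells of q * c * area  [g/d]
md = (q * c * cell_area).sum(axis=1)

# Summarize the resulting mass discharge probability distribution
print(f"median = {np.median(md):.1f} g/d, "
      f"90% interval = [{np.percentile(md, 5):.1f}, {np.percentile(md, 95):.1f}] g/d")
```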

  7. SIMulation of Medication Error induced by Clinical Trial drug labeling: the SIMME-CT study.

    PubMed

    Dollinger, Cecile; Schwiertz, Vérane; Sarfati, Laura; Gourc-Berthod, Chloé; Guédat, Marie-Gabrielle; Alloux, Céline; Vantard, Nicolas; Gauthier, Noémie; He, Sophie; Kiouris, Elena; Caffin, Anne-Gaelle; Bernard, Delphine; Ranchon, Florence; Rioufol, Catherine

    2016-06-01

    To assess the impact of investigational drug labels on the risk of medication error in drug dispensing. A simulation-based learning program focusing on investigational drug dispensing was conducted. The study was undertaken in an Investigational Drugs Dispensing Unit of a University Hospital of Lyon, France. Sixty-three pharmacy workers (pharmacists, residents, technicians or students) were enrolled. Ten risk factors were selected concerning label information or the risk of confusion with another clinical trial. Each risk factor was scored independently out of 5: the higher the score, the greater the risk of error. From 400 labels analyzed, two groups were selected for the dispensing simulation: 27 labels with high risk (score ≥3) and 27 with low risk (score ≤2). Each question in the learning program was displayed as a simulated clinical trial prescription. Medication error was defined as at least one erroneous answer (i.e. error in drug dispensing). For each question, response times were collected. High-risk investigational drug labels correlated with medication error and slower response time. Error rates were significantly higher (5.5-fold) for the high-risk series. Error frequency was not significantly affected by occupational category or experience in clinical trials. SIMME-CT is the first simulation-based learning tool to focus on investigational drug labels as a risk factor for medication error. SIMME-CT was also used as a training tool for staff involved in clinical research, to develop medication error risk awareness and to validate competence in continuing medical education. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  8. Laparoscopic cholecystectomy poses physical injury risk to surgeons: analysis of hand technique and standing position.

    PubMed

    Youssef, Yassar; Lee, Gyusung; Godinez, Carlos; Sutton, Erica; Klein, Rosemary V; George, Ivan M; Seagull, F Jacob; Park, Adrian

    2011-07-01

    This study compares surgical techniques and surgeon's standing position during laparoscopic cholecystectomy (LC), investigating each with respect to surgeons' learning, performance, and ergonomics. Little homogeneity exists in LC performance and training. Variations in standing position (side-standing technique vs. between-standing technique) and hand technique (one-handed vs. two-handed) exist. Thirty-two LC procedures performed on a virtual reality simulator were video-recorded and analyzed. Each subject performed four different procedures: one-handed/side-standing, one-handed/between-standing, two-handed/side-standing, and two-handed/between-standing. Physical ergonomics were evaluated using Rapid Upper Limb Assessment (RULA). Mental workload assessment was acquired with the National Aeronautics and Space Administration-Task Load Index (NASA-TLX). Virtual reality (VR) simulator-generated performance evaluation and a subjective survey were analyzed. RULA scores were consistently lower (indicating better ergonomics) for the between-standing technique and higher (indicating worse ergonomics) for the side-standing technique, regardless of whether one- or two-handed. Anatomical scores overall showed side-standing to have a detrimental effect on the upper arms and trunk. The NASA-TLX showed significant association between the side-standing position and high physical demand, effort, and frustration (p<0.05). The two-handed technique in the side-standing position required more effort than the one-handed (p<0.05). No difference in operative time or complication rate was demonstrated among the four procedures. The two-handed/between-standing method was chosen as the best procedure to teach and standardize. Laparoscopic cholecystectomy poses a risk of physical injury to the surgeon. As LC is currently commonly performed in the United States, the left side-standing position may lead to increased physical demand and effort, resulting in ergonomically unsound conditions for the surgeon. Though further investigations should be conducted, adopting the between-standing position deserves serious consideration as it may be the best short-term ergonomic alternative.

  9. Adapting water treatment design and operations to the impacts of global climate change

    NASA Astrophysics Data System (ADS)

    Clark, Robert M.; Li, Zhiwei; Buchberger, Steven G.

    2011-12-01

    It is anticipated that global climate change will adversely impact source water quality in many areas of the United States and will therefore potentially impact the design and operation of current and future water treatment systems. The USEPA has initiated an effort called the Water Resources Adaptation Program (WRAP), which is intended to develop tools and techniques that can assess the impact of global climate change on urban drinking water and wastewater infrastructure. A three-step approach for assessing climate change impacts on water treatment operation and design is being pursued in this effort. The first step is the stochastic characterization of source water quality, the second step is the application of the USEPA Water Treatment Plant (WTP) model, and the third step is the application of cost algorithms to provide a metric that can be used to assess the cost impact of climate change. A model has been validated using data collected from Cincinnati's Richard Miller Water Treatment Plant for the USEPA Information Collection Rule (ICR) database. An analysis of the water treatment processes in response to assumed perturbations in raw water quality identified TOC, pH, and bromide as the three most important parameters affecting performance of the Miller WTP. The Miller Plant was simulated using the EPA WTP model to examine the impact of these parameters on selected regulated water quality parameters. Uncertainty in influent water quality was analyzed to estimate the risk of violating drinking water maximum contaminant levels (MCLs). Water quality changes in the Ohio River were projected for 2050 using Monte Carlo simulation, and the WTP model was used to evaluate the effects of water quality changes on design and operation. Results indicate that the existing Miller WTP might not meet Safe Drinking Water Act MCL requirements under certain extreme future conditions. However, it was found that the risk of MCL violations under future conditions could be controlled by enhancing existing WTP design and operation or by process retrofitting and modification.
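    The final step described above, estimating the risk of MCL violations from uncertain influent quality, amounts to a Monte Carlo exceedance calculation. The sketch below illustrates that calculation only; the influent distribution, the linear treatment response, and the use of the 80 µg/L total-trihalomethane limit are stand-in assumptions, not the EPA WTP model or the Miller plant data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal Monte Carlo sketch of MCL-violation risk under uncertain influent quality.
n_draws = 10000
influent_toc = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n_draws)   # mg/L (assumed)

# Stand-in treatment response: finished-water disinfection byproducts rise with TOC
finished_dbp = 20 + 12 * influent_toc + rng.normal(0, 5, n_draws)         # ug/L
mcl = 80.0                                                                 # ug/L (total THM limit)

risk = np.mean(finished_dbp > mcl)
print(f"Estimated probability of exceeding the MCL: {risk:.1%}")
```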

  10. Bayesian Approach for Flexible Modeling of Semicompeting Risks Data

    PubMed Central

    Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.

    2016-01-01

    Summary Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445

  11. A Robust Statistics Approach to Minimum Variance Portfolio Optimization

    NASA Astrophysics Data System (ADS)

    Yang, Liusha; Couillet, Romain; McKay, Matthew R.

    2015-12-01

    We study the design of portfolios under a minimum risk criterion. The performance of the optimized portfolio relies on the accuracy of the estimated covariance matrix of the portfolio asset returns. For large portfolios, the number of available market returns is often of similar order to the number of assets, so that the sample covariance matrix performs poorly as a covariance estimator. Additionally, financial market data often contain outliers which, if not correctly handled, may further corrupt the covariance estimation. We address these shortcomings by studying the performance of a hybrid covariance matrix estimator based on Tyler's robust M-estimator and on Ledoit-Wolf's shrinkage estimator while assuming samples with heavy-tailed distribution. Employing recent results from random matrix theory, we develop a consistent estimator of (a scaled version of) the realized portfolio risk, which is minimized by optimizing online the shrinkage intensity. Our portfolio optimization method is shown via simulations to outperform existing methods both for synthetic and real market data.
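    The two ingredients of the approach, a shrinkage-regularized covariance estimate and the closed-form minimum variance weights w = Σ⁻¹1 / (1ᵀΣ⁻¹1), can be sketched as follows. For simplicity the sketch uses plain linear shrinkage toward a scaled identity with a fixed intensity, not the authors' Tyler/Ledoit-Wolf hybrid with online-optimized intensity; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_assets, n_obs = 50, 60                    # n_obs ~ n_assets: sample covariance is poor
returns = rng.standard_normal((n_obs, n_assets)) * 0.01

S = np.cov(returns, rowvar=False)            # sample covariance
rho = 0.5                                    # illustrative shrinkage intensity (assumed)
target = np.trace(S) / n_assets * np.eye(n_assets)
sigma = (1 - rho) * S + rho * target         # linear shrinkage, stand-in for the hybrid estimator

ones = np.ones(n_assets)
inv_sigma_1 = np.linalg.solve(sigma, ones)
w = inv_sigma_1 / (ones @ inv_sigma_1)       # minimum variance weights, sum to 1

print("realized portfolio variance under the estimate:", w @ sigma @ w)
```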

  12. Simulating human behavior for national security human interactions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.

    2007-01-01

    This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the "Simulating Human Behavior for National Security Human Interactions" project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.

  13. The Role of Pre-Existing Diabetes Mellitus on Hepatocellular Carcinoma Occurrence and Prognosis: A Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Bray, Freddie; Gao, Shan; Gao, Jing; Li, Hong-Lan; Xiang, Yong-Bing

    2011-01-01

    Background: The impact of pre-existing diabetes mellitus (DM) on hepatocellular carcinoma (HCC) occurrence and prognosis is complex and unclear. The aim of this meta-analysis is to evaluate the association between pre-existing diabetes mellitus and hepatocellular carcinoma occurrence and prognosis. Methods: We searched PubMed, Embase and the Cochrane Library from their inception to January 2011 for prospective epidemiological studies assessing the effect of pre-existing diabetes mellitus on hepatocellular carcinoma occurrence, mortality outcomes, cancer recurrence, and treatment-related complications. Study-specific risk estimates were combined by using fixed effect or random effect models. Results: The database search generated a total of 28 prospective studies that met the inclusion criteria. Among these studies, 14 reported the risk of HCC incidence and 6 studies reported risk of HCC-specific mortality. Six studies provided a total of 8 results for all-cause mortality in HCC patients. Four studies documented HCC recurrence risks and 2 studies reported risks for hepatic decompensation in HCC patients. Meta-analysis indicated that pre-existing diabetes mellitus (DM) was significantly associated with an increased risk of HCC incidence [meta-relative risk (RR) = 1.87, 95% confidence interval (CI): 1.15–2.27] and HCC-specific mortality (meta-RR = 1.88, 95% CI: 1.39–2.55) compared with their non-DM counterparts. HCC patients with pre-existing DM had a 38% increased (95% CI: 1.13–1.48) risk of death from all causes and a 91% increased (95% CI: 1.41–2.57) risk of hepatic decompensation compared to those without DM. In DM patients, the meta-RR for HCC recurrence-free survival was 1.93 (95% CI: 1.12–3.33) compared with non-diabetic patients. Conclusion: The findings from the current meta-analysis suggest that DM may be associated with elevated risks of both HCC incidence and mortality. Furthermore, HCC patients with pre-existing diabetes have a poorer prognosis relative to their non-diabetic counterparts. PMID:22205924
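    The pooling step mentioned in the Methods (combining study-specific risk estimates with a fixed effect model) is standard inverse-variance weighting on the log scale. The sketch below shows that calculation with made-up study estimates, not the meta-analysis data.

```python
import numpy as np

# Fixed-effect (inverse-variance) pooling of study relative risks.
# The RRs and confidence intervals below are illustrative only.
rr = np.array([1.5, 2.1, 1.8])
ci_low = np.array([1.1, 1.4, 1.2])
ci_high = np.array([2.0, 3.2, 2.7])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from a 95% CI
w = 1 / se**2                                          # inverse-variance weights

pooled = np.exp(np.sum(w * log_rr) / np.sum(w))
pooled_se = np.sqrt(1 / np.sum(w))
ci = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled RR = {pooled:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```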

  14. Can small island mountains provide relief from the Subtropical Precipitation Decline? Simulating future precipitation regimes for small island nations using high resolution Regional Climate Models.

    NASA Astrophysics Data System (ADS)

    Bowden, J.; Terando, A. J.; Misra, V.; Wootten, A.

    2017-12-01

    Small island nations are vulnerable to changes in the hydrologic cycle because of their limited water resources. This risk to water security is likely even higher in sub-tropical regions where anthropogenic forcing of the climate system is expected to lead to a drier future (the so-called `dry-get-drier' pattern). However, high-resolution numerical modeling experiments have also shown an enhancement of existing orographically-influenced precipitation patterns on islands with steep topography, potentially mitigating subtropical drying on windward mountain sides. Here we explore the robustness of the near-term (25-45 years) subtropical precipitation decline (SPD) across two island groupings in the Caribbean, Puerto Rico and the U.S. Virgin Islands. These islands, forming the boundary between the Greater and Lesser Antilles, significantly differ in size, topographic relief, and orientation to prevailing winds. Two 2-km horizontal resolution regional climate model simulations are used to downscale a total of three different GCMs under the RCP8.5 emissions scenario. Results indicate some possibility for modest increases in precipitation at the leading edge of the Luquillo Mountains in Puerto Rico, but consistent declines elsewhere. We conclude with a discussion of potential explanations for these patterns and the attendant risks to water security that subtropical small island nations could face as the climate warms.

  15. Magnetic navigation behavior and the oceanic ecology of young loggerhead sea turtles.

    PubMed

    Putman, Nathan F; Verley, Philippe; Endres, Courtney S; Lohmann, Kenneth J

    2015-04-01

    During long-distance migrations, animals navigate using a variety of sensory cues, mechanisms and strategies. Although guidance mechanisms are usually studied under controlled laboratory conditions, such methods seldom allow for navigation behavior to be examined in an environmental context. Similarly, although realistic environmental models are often used to investigate the ecological implications of animal movement, explicit consideration of navigation mechanisms in such models is rare. Here, we used an interdisciplinary approach in which we first conducted lab-based experiments to determine how hatchling loggerhead sea turtles (Caretta caretta) respond to magnetic fields that exist at five widely separated locations along their migratory route, and then studied the consequences of the observed behavior by simulating it within an ocean circulation model. Magnetic fields associated with two geographic regions that pose risks to young turtles (due to cold wintertime temperatures or potential displacement from the migratory route) elicited oriented swimming, whereas fields from three locations where surface currents and temperature pose no such risk did not. Additionally, at locations with fields that elicited oriented swimming, simulations indicate that the observed behavior greatly increases the likelihood of turtles advancing along the migratory pathway. Our findings suggest that the magnetic navigation behavior of sea turtles is intimately tied to their oceanic ecology and is shaped by a complex interplay between ocean circulation and geomagnetic dynamics. © 2015. Published by The Company of Biologists Ltd.

  16. Driver responses to differing urban work zone configurations.

    PubMed

    Morgan, J F; Duley, A R; Hancock, P A

    2010-05-01

    This study reports the results of a simulator-based assessment of driver response to two different urban highway work zone configurations. One configuration represented an existing design which was contrasted with a second configuration that presented a reduced taper length prototype work zone design. Twenty-one drivers navigated the two different work zones in two different conditions, one with and one without a lead vehicle; in this case a bus. Measures of driver speed, braking, travel path, and collision frequency were recorded. Drivers navigated significantly closer to the boundary of the work area in the reduced taper length design. This proximity effect was moderated by the significant interaction between lead vehicle and taper length and such interactive effects were also observed for driver speed at the end of the work zone and the number of collisions observed within the work zone itself. These results suggest that reduced taper length poses an increase in risk to both drivers and work zone personnel, primarily when driver anticipation is reduced by foreshortened viewing distances. Increase in such risk is to a degree offset by the reduction of overall exposure to the work zone that a foreshortened taper creates. The benefits and limitations to a simulation-based approach to the assessment and prediction of driver behavior in different work zone configurations are also discussed. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  17. Simulated Driving Assessment (SDA) for Teen Drivers: Results from a Validation Study

    PubMed Central

    McDonald, Catherine C.; Kandadai, Venk; Loeb, Helen; Seacrist, Thomas S.; Lee, Yi-Ching; Winston, Zachary; Winston, Flaura K.

    2015-01-01

    Background Driver error and inadequate skill are common critical reasons for novice teen driver crashes, yet few validated, standardized assessments of teen driving skills exist. The purpose of this study was to evaluate the construct and criterion validity of a newly developed Simulated Driving Assessment (SDA) for novice teen drivers. Methods The SDA's 35-minute simulated drive incorporates 22 variations of the most common teen driver crash configurations. Driving performance was compared for 21 inexperienced teens (age 16–17 years, provisional license ≤90 days) and 17 experienced adults (age 25–50 years, license ≥5 years, drove ≥100 miles per week, no collisions or moving violations ≤3 years). SDA driving performance (Error Score) was based on driving safety measures derived from simulator and eye-tracking data. Negative driving outcomes included simulated collisions or run-off-the-road incidents. A professional driving evaluator/instructor reviewed videos of SDA performance (DEI Score). Results The SDA demonstrated construct validity: 1.) Teens had a higher Error Score than adults (30 vs. 13, p=0.02); 2.) For each additional error committed, the relative risk of a participant's propensity for a simulated negative driving outcome increased by 8% (95% CI: 1.05–1.10, p<0.01). The SDA demonstrated criterion validity: Error Score was correlated with DEI Score (r=−0.66, p<0.001). Conclusions This study supports the concept of validated simulated driving tests like the SDA to assess novice driver skill in complex and hazardous driving scenarios. The SDA, as a standard protocol to evaluate teen driver performance, has the potential to facilitate screening and assessment of teen driving readiness and could be used to guide targeted skill training. PMID:25740939

  18. Engineering design and integration simulation utilization manual

    NASA Technical Reports Server (NTRS)

    Hirsch, G. N.

    1976-01-01

    A description of the Engineering Design Integration (EDIN) Simulation System as it exists at Johnson Space Center is provided. A discussion of the EDIN Simulation System capabilities and applications is presented.

  19. Risk management in medical product development process using traditional FMEA and fuzzy linguistic approach: a case study

    NASA Astrophysics Data System (ADS)

    Kirkire, Milind Shrikant; Rane, Santosh B.; Jadhav, Jagdish Rajaram

    2015-12-01

    The medical product development (MPD) process is highly multidisciplinary in nature, which increases its complexity and the associated risks. Managing the risks during the MPD process is crucial. The objective of this research is to explore risks during MPD in a dental product manufacturing company and propose a model for risk mitigation during the MPD process to minimize failure events. A case study approach is employed. The existing MPD process is mapped onto the five phases of a customized phase-gate process. The activities during each phase of development and the risks associated with each activity are identified and categorized based on the source of occurrence. The risks are analyzed using traditional failure mode and effects analysis (FMEA) and fuzzy FMEA. A comparison of the two methods shows that the fuzzy approach avoids duplication of risk priority numbers (RPNs) and is better at converting expert judgment into information for estimating risk factors. Critical, moderate, low-level, and negligible risks are identified based on criticality; risk treatments and a mitigation model are proposed. During the initial phases of MPD the risks are less severe, but as the process progresses the severity of the risks increases. The MPD process should be critically designed and simulated to minimize the number of risk events and their severity. To successfully develop products and devices within manufacturing companies, process risk management is essential. A systematic approach to managing risks during the MPD process will lead to the development of medical products with the expected quality and reliability. This is the first research of its kind focusing on MPD process risks and their management. The methodology adopted in this paper will help developers, managers, and researchers gain a competitive edge over other companies by managing risks during the development process.
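    In traditional FMEA, each failure mode is scored by the product of Severity, Occurrence, and Detection ratings (RPN = S × O × D) and the modes are ranked by RPN; tied (duplicate) RPNs are exactly what the fuzzy variant is said to avoid. A minimal sketch with hypothetical failure modes follows; the modes and ratings are illustrative, not the case study's data.

```python
# Minimal traditional-FMEA sketch: RPN = Severity * Occurrence * Detection.
# Failure modes and ratings are hypothetical placeholders.
failure_modes = [
    # (name, severity, occurrence, detection), each rated 1-10
    ("wrong material grade", 8, 3, 4),
    ("sterilization failure", 9, 2, 5),
    ("dimension out of tolerance", 6, 5, 3),
]

ranked = sorted(
    ((s * o * d, name) for name, s, o, d in failure_modes),
    reverse=True,
)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
```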

  20. The economics of mitigation and remediation measures - preliminary results

    NASA Astrophysics Data System (ADS)

    Wiedemann, Carsten; Flegel, Sven Kevin; Vörsmann, Peter; Gelhaus, Johannes; Moeckel, Marek; Braun, Vitali; Kebschull, Christopher; Metz, Manuel

    2012-07-01

    Today there exists a high spatial density of orbital debris objects at about 800 km altitude. The control of the debris population in this region is important for the long-term evolution of the debris environment. The future debris population is investigated by simulations using the software tool LUCA (Long-Term Orbit Utilization Collision Analysis). It is likely that more catastrophic collisions will occur in the future. Debris objects generated during such events may in turn trigger further catastrophic collisions. Current simulations have revealed that the number of debris objects will increase in the future. In a long-term perspective, catastrophic collisions may become the dominant mechanism generating orbital debris. This study investigates when the situation will become unstable. To prevent this instability it is necessary to implement mitigation and maybe even remediation measures. It is investigated how these measures affect the future debris environment, whether the growth in the number of debris objects can be interrupted, and how much this may cost. Different mitigation scenarios are considered. Furthermore, one remediation measure, the active removal of high-risk objects, is also simulated. Cost drivers for the different measures are identified, and the costs associated with the selected measures are investigated. The goal is to find out which economic benefits may result from mitigation or remediation. First results of a cost-benefit analysis are presented.

  1. Flight simulator for hypersonic vehicle and a study of NASP handling qualities

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.; Park, Eui H.; Deeb, Joseph M.; Kim, Jung H.

    1992-01-01

    The research goal of the Human-Machine Systems Engineering Group was to study the existing handling quality studies in aircraft with sonic to supersonic speeds and power in order to understand information requirements needed for a hypersonic vehicle flight simulator. This goal falls within the NASA task statements: (1) develop flight simulator for hypersonic vehicle; (2) study NASP handling qualities; and (3) study effects of flexibility on handling qualities and on control system performance. Following the above statement of work, the group has developed three research strategies. These are: (1) to study existing handling quality studies and the associated aircraft and develop flight simulation data characterization; (2) to develop a profile for flight simulation data acquisition based on objective statement no. 1 above; and (3) to develop a simulator and an embedded expert system platform which can be used in handling quality experiments for hypersonic aircraft/flight simulation training.

  2. Preliminary validation of a new methodology for estimating dose reduction protocols in neonatal chest computed radiographs

    NASA Astrophysics Data System (ADS)

    Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.

    2006-03-01

    The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference in grey-scale-value means or standard deviations between simulated and original entire-chest ROIs. The observer performance suggests that an exposure ≥0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the simulation tool is promising for achieving ALARA exposures in children.
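    The core of such a dose-reduction simulation is injecting noise consistent with a lower quantum count. The sketch below shows a generic Poisson-resampling version of that idea; the photon budget and image values are assumptions, and this is not the authors' calibrated CR noise model.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_lower_dose(image, dose_fraction, photons_full_dose=10000.0):
    """Crude low-dose simulation: rescale the image to an assumed photon count per
    pixel, resample with Poisson statistics at the reduced dose, and rescale back."""
    counts = image * photons_full_dose * dose_fraction
    noisy = rng.poisson(counts)
    return noisy / (photons_full_dose * dose_fraction)

# Illustrative "relatively noise-free" image with values in [0, 1]
high_dose = np.clip(rng.normal(0.5, 0.1, size=(256, 256)), 0, 1)
low_dose = simulate_lower_dose(high_dose, dose_fraction=0.01)   # ~0.1 mR from 10 mR
print("relative increase in pixel noise:", low_dose.std() / high_dose.std())
```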

  3. Integrating Existing Simulation Components into a Cohesive Simulation System

    NASA Technical Reports Server (NTRS)

    McLaughlin, Brian J.; Barrett, Larry K.

    2012-01-01

    A tradition of leveraging the re-use of components to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises from the continual shift of technology over a long mission, or multi-mission, lifecycle. As the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.

  4. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, proving high-fidelity patient-specific models to complement residency surgical learning.

  5. Existence and numerical simulation of periodic traveling wave solutions to the Casimir equation for the Ito system

    NASA Astrophysics Data System (ADS)

    Abbasbandy, S.; Van Gorder, R. A.; Hajiketabi, M.; Mesrizadeh, M.

    2015-10-01

    We consider traveling wave solutions to the Casimir equation for the Ito system (a two-field extension of the KdV equation). These traveling waves are governed by a nonlinear initial value problem with an interesting nonlinearity (which actually amplifies in magnitude as the size of the solution becomes small). The nonlinear problem is parameterized by two initial constant values, and we demonstrate that the existence of solutions is strongly tied to these parameter values. For our interests, we are concerned with positive, bounded, periodic wave solutions. We are able to classify parameter regimes which admit such solutions in full generality, thereby obtaining a nice existence result. Using the existence result, we are then able to numerically simulate the positive, bounded, periodic solutions. We elect to employ a group preserving scheme in order to numerically study these solutions, and an outline of this approach is provided. The numerical simulations serve to illustrate the properties of these solutions predicted analytically through the existence result. Physically, these results demonstrate the existence of a type of space-periodic structure in the Casimir equation for the Ito model, which propagates as a traveling wave.

  6. Peer Passenger Norms and Pressure: Experimental Effects on Simulated Driving Among Teenage Males.

    PubMed

    Bingham, C Raymond; Simons-Morton, Bruce G; Pradhan, Anuj K; Li, Kaigang; Almani, Farideh; Falk, Emily B; Shope, Jean T; Buckley, Lisa; Ouimet, Marie Claude; Albert, Paul S

    2016-08-01

    Serious crashes are more likely when teenage drivers have teenage passengers. One likely source of this increased risk is social influences on driving performance. This driving simulator study experimentally tested the effects of peer influence (i.e., risk-accepting compared to risk-averse peer norms reinforced by pressure) on the driving risk behavior (i.e., risky driving behavior and inattention to hazards) of male teenagers. It was hypothesized that peer presence would result in greater driving risk behavior (i.e., increased driving risk and reduced latent hazard anticipation), and that the effect would be greater when the peer was risk-accepting. Fifty-three 16- and 17-year-old male participants holding a provisional U.S., State of Michigan driver license were randomized to either a risk-accepting or risk-averse condition. Each participant operated a driving simulator while alone and separately with a confederate peer passenger. The simulator world included scenarios designed to elicit variation in driving risk behavior with a teen passenger present in the vehicle. Significant interactions of passenger presence (passenger present vs. alone) by risk condition (risk-accepting vs. risk-averse) were observed for variables measuring: failure to stop at yellow light intersections (Incident Rate Ratio (IRR)=2.16; 95% Confidence Interval [95CI]=1.06, 4.43); higher probability of overtaking (IRR=10.17; 95CI=1.43, 73.35); shorter left turn latency (IRR=0.43; 95CI=0.31,0.60); and, failure to stop at an intersection with an occluded stop sign (IRR=7.90; 95CI=2.06,30.35). In all cases, greater risky driving by participants was more likely with a risk-accepting passenger versus a risk-averse passenger present and a risk-accepting passenger present versus driving alone. Exposure of male teenagers to a risk-accepting confederate peer passenger who applied peer influence increased simulated risky driving behavior compared with exposure to a risk-averse confederate peer passenger or driving alone. These results are consistent with the contention that variability in teenage risky driving is in part explained by social influences.

  7. Peer Passenger Norms and Pressure: Experimental Effects on Simulated Driving Among Teenage Males

    PubMed Central

    Bingham, C. Raymond; Simons-Morton, Bruce G.; Pradhan, Anuj K.; Li, Kaigang; Almani, Farideh; Falk, Emily B.; Shope, Jean T.; Buckley, Lisa; Ouimet, Marie Claude; Albert, Paul S.

    2016-01-01

    Objective Serious crashes are more likely when teenage drivers have teenage passengers. One likely source of this increased risk is social influences on driving performance. This driving simulator study experimentally tested the effects of peer influence (i.e., risk-accepting compared to risk-averse peer norms reinforced by pressure) on the driving risk behavior (i.e., risky driving behavior and inattention to hazards) of male teenagers. It was hypothesized that peer presence would result in greater driving risk behavior (i.e., increased driving risk and reduced latent hazard anticipation), and that the effect would be greater when the peer was risk-accepting. Methods Fifty-three 16- and 17-year-old male participants holding a provisional U.S., State of Michigan driver license were randomized to either a risk-accepting or risk-averse condition. Each participant operated a driving simulator while alone and separately with a confederate peer passenger. The simulator world included scenarios designed to elicit variation in driving risk behavior with a teen passenger present in the vehicle. Results Significant interactions of passenger presence (passenger present vs. alone) by risk condition (risk-accepting vs. risk-averse) were observed for variables measuring: failure to stop at yellow light intersections (Incident Rate Ratio (IRR)=2.16; 95% Confidence Interval [95CI]=1.06, 4.43); higher probability of overtaking (IRR=10.17; 95CI=1.43, 73.35); shorter left turn latency (IRR=0.43; 95CI=0.31,0.60); and, failure to stop at an intersection with an occluded stop sign (IRR=7.90; 95CI=2.06,30.35). In all cases, greater risky driving by participants was more likely with a risk-accepting passenger versus a risk-averse passenger present and a risk-accepting passenger present versus driving alone. Conclusions Exposure of male teenagers to a risk-accepting confederate peer passenger who applied peer influence increased simulated risky driving behavior compared with exposure to a risk-averse confederate peer passenger or driving alone. These results are consistent with the contention that variability in teenage risky driving is in part explained by social influences. PMID:27818610

  8. Obstetric simulation as a risk control strategy: course design and evaluation.

    PubMed

    Gardner, Roxane; Walzer, Toni B; Simon, Robert; Raemer, Daniel B

    2008-01-01

    Patient safety initiatives aimed at reducing medical errors and adverse events are being implemented in Obstetrics. The Controlled Risk Insurance Company (CRICO), Risk Management Foundation (RMF) of the Harvard Medical Institutions pursued simulation as an anesthesia risk control strategy. Encouraged by their success, CRICO/RMF promoted simulation-based team training as a risk control strategy for obstetrical providers. We describe the development, implementation, and evaluation of an obstetric simulation-based team training course grounded in crisis resource management (CRM) principles. We pursued systematic design of course development, implementation, and evaluation in 3 phases, including a 1-year or more posttraining follow-up with self-assessment questionnaires. The course was highly rated overall by participants immediately after the course and 1-year or more after the course. Most survey responders reported having experienced a critical clinical event since the course and that various aspects of their teamwork had significantly or somewhat improved as a result of the course. Most (86%) reported CRM principles as useful for obstetric faculty and most (59%) recommended repeating the simulation course every 2 years. A simulation-based team-training course for obstetric clinicians was developed and is a central component of CRICO/RMF's obstetric risk management incentive program that provides a 10% reduction in annual obstetrical malpractice premiums. The course was highly regarded immediately and 1 year or more after completing the course. Most survey responders reported improved teamwork and communication in managing a critical obstetric event in the interval since taking the course. Simulation-based CRM training can serve as a strategy for mitigating adverse perinatal events.

  9. A new risk assessment approach for the prioritization of 500 classical and emerging organic microcontaminants as potential river basin specific pollutants under the European Water Framework Directive.

    PubMed

    von der Ohe, Peter Carsten; Dulio, Valeria; Slobodnik, Jaroslav; De Deckere, Eric; Kühne, Ralph; Ebert, Ralf-Uwe; Ginebreda, Antoni; De Cooman, Ward; Schüürmann, Gerrit; Brack, Werner

    2011-05-01

    Given the huge number of chemicals released into the environment and existing time and budget constraints, there is a need to prioritize chemicals for risk assessment and monitoring in the context of the European Union Water Framework Directive (EU WFD). This study is the first to assess the risk of 500 organic substances based on observations in the four European river basins of the Elbe, Scheldt, Danube and Llobregat. A decision tree is introduced that first classifies chemicals into six categories depending on the information available, which allows water managers to focus on the next steps (e.g. derivation of Environmental Quality Standards (EQS), improvement of analytical methods, etc.). The priority within each category is then evaluated based on two indicators, the Frequency of Exceedance and the Extent of Exceedance of Predicted No-Effect Concentrations (PNECs). These two indicators are based on maximum environmental concentrations (MEC), rather than the commonly used statistically based averages (Predicted Effect Concentration, PEC), and compared to the lowest acute-based (PNEC(acute)) or chronic-based thresholds (PNEC(chronic)). For 56% of the compounds, PNECs were available from existing risk assessments, and the majority of these PNECs were derived from chronic toxicity data or simulated ecosystem studies (mesocosm) with rather low assessment factors. The limitations of this concept for risk assessment purposes are discussed. For the remainder, provisional PNECs (P-PNECs) were established from read-across models for acute toxicity to the standard test organisms Daphnia magna, Pimephales promelas and Selenastrum capricornutum. On the one hand, the prioritization revealed that about three-quarters of the 44 substances with MEC/PNEC ratios above ten were pesticides. On the other hand, based on the monitoring data used in this study, no risk with regard to the water phase could be found for eight of the 41 priority substances, indicating a first success of the implementation of the WFD in the investigated river basins. Copyright © 2011 Elsevier B.V. All rights reserved.
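    The two prioritization indicators are straightforward to compute from monitoring data: the Frequency of Exceedance is the share of sites where the maximum environmental concentration exceeds the PNEC, and the Extent of Exceedance is the largest MEC/PNEC ratio. The sketch below uses hypothetical concentrations for a single substance, not values from the four river basins.

```python
import numpy as np

# Hypothetical monitoring data: maximum environmental concentrations (MEC, ug/L)
# per site for one substance, and its PNEC (ug/L). Values are illustrative only.
mec_per_site = np.array([0.02, 0.15, 0.40, 0.01, 0.90, 0.05])
pnec = 0.10

ratios = mec_per_site / pnec
frequency_of_exceedance = np.mean(ratios > 1)   # share of sites with MEC > PNEC
extent_of_exceedance = ratios.max()             # worst-case MEC/PNEC ratio

print(f"Frequency of exceedance: {frequency_of_exceedance:.0%}")
print(f"Extent of exceedance (max MEC/PNEC): {extent_of_exceedance:.1f}")
```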

  10. Potential effects of existing and proposed groundwater withdrawals on water levels and natural groundwater discharge in Snake Valley and surrounding areas, Utah and Nevada

    USGS Publications Warehouse

    Masbruch, Melissa D.; Brooks, Lynette E.

    2017-04-14

    Several U.S. Department of Interior (DOI) agencies are concerned about the cumulative effects of groundwater development on groundwater resources managed by, and other groundwater resources of interest to, these agencies in Snake Valley and surrounding areas. The new water uses that potentially concern the DOI agencies include 12 water-right applications filed in 2005, totaling approximately 8,864 acre-feet per year. To date, only one of these applications has been approved and partially developed. In addition, the DOI agencies are interested in the potential effects of three new water-right applications (UT 18-756, UT 18-758, and UT 18-759) and one water-right change application (UT a40687), which were the subject of a water-right hearing on April 19, 2016. This report presents a hydrogeologic analysis of areas in and around Snake Valley to assess potential effects of existing and future groundwater development on groundwater resources, specifically groundwater discharge sites, of interest to the DOI agencies. A previously developed steady-state numerical groundwater-flow model was modified to transient conditions with respect to well withdrawals and used to quantify drawdown and capture (withdrawals that result in depletion) of natural discharge from existing and proposed groundwater withdrawals. The original steady-state model simulates and was calibrated to 2009 conditions. To investigate the potential effects of existing and proposed groundwater withdrawals on the groundwater resources of interest to the DOI agencies, 10 withdrawal scenarios were simulated. All scenarios were simulated for periods of 5, 10, 15, 30, 55, and 105 years from the start of 2010; additionally, all scenarios were simulated to a new steady state to determine the ultimate long-term effects of the withdrawals. Capture maps were also constructed as part of this analysis. The simulations used to develop the capture maps test the response of the system, specifically the reduction of natural discharge, to future stresses at a point in the area represented by the model. In this way, these maps can be used as a tool to determine the source of water to, and potential effects at specific areas from, future well withdrawals. Downward trends in water levels measured in wells indicate that existing groundwater withdrawals in Snake Valley are affecting water levels. The numerical model simulates similar downward trends in water levels; simulated drawdowns in the model, however, are generally less than observed water-level declines. At the groundwater discharge sites of interest to the DOI agencies, simulated drawdowns from existing well withdrawals (projected into the future) range from 0 to about 50 feet. Following the addition of the proposed withdrawals, simulated drawdowns at some sites increase by 25 feet. Simulated drawdown resulting from the proposed withdrawals began in as few as 5 years after 2014 at several of the sites. At the groundwater discharge sites of interest to the DOI agencies, simulated capture of natural discharge resulting from the existing withdrawals ranged from 0 to 87 percent. Following the addition of the proposed withdrawals, simulated capture at several of the sites reached 100 percent, indicating that groundwater discharge at that site would cease. Simulated capture following the addition of the proposed withdrawals increased in as few as 5 years after 2014 at several of the sites.

  11. A preliminary approach to quantifying the overall environmental risks posed by development projects during environmental impact assessment

    PubMed Central

    Nicol, Sam; Chadès, Iadine

    2017-01-01

    Environmental impact assessment (EIA) is used globally to manage the impacts of development projects on the environment, so there is an imperative to demonstrate that it can effectively identify risky projects. However, despite the widespread use of quantitative predictive risk models in areas such as toxicology, ecosystem modelling and water quality, the use of predictive risk tools to assess the overall expected environmental impacts of major construction and development proposals is comparatively rare. A risk-based approach has many potential advantages, including improved prediction and attribution of cause and effect; sensitivity analysis; continual learning; and optimal resource allocation. In this paper we investigate the feasibility of using a Bayesian belief network (BBN) to quantify the likelihood and consequence of non-compliance of new projects based on the occurrence probabilities of a set of expert-defined features. The BBN incorporates expert knowledge and continually improves its predictions based on new data as it is collected. We use simulation to explore the trade-off between the number of data points and the prediction accuracy of the BBN, and find that the BBN could predict risk with 90% accuracy using approximately 1000 data points. Although a further pilot test with real project data is required, our results suggest that a BBN is a promising method to monitor overall risks posed by development within an existing EIA process given a modest investment in data collection. PMID:28686651

  12. A preliminary approach to quantifying the overall environmental risks posed by development projects during environmental impact assessment.

    PubMed

    Nicol, Sam; Chadès, Iadine

    2017-01-01

    Environmental impact assessment (EIA) is used globally to manage the impacts of development projects on the environment, so there is an imperative to demonstrate that it can effectively identify risky projects. However, despite the widespread use of quantitative predictive risk models in areas such as toxicology, ecosystem modelling and water quality, the use of predictive risk tools to assess the overall expected environmental impacts of major construction and development proposals is comparatively rare. A risk-based approach has many potential advantages, including improved prediction and attribution of cause and effect; sensitivity analysis; continual learning; and optimal resource allocation. In this paper we investigate the feasibility of using a Bayesian belief network (BBN) to quantify the likelihood and consequence of non-compliance of new projects based on the occurrence probabilities of a set of expert-defined features. The BBN incorporates expert knowledge and continually improves its predictions based on new data as it is collected. We use simulation to explore the trade-off between the number of data points and the prediction accuracy of the BBN, and find that the BBN could predict risk with 90% accuracy using approximately 1000 data points. Although a further pilot test with real project data is required, our results suggest that a BBN is a promising method to monitor overall risks posed by development within an existing EIA process given a modest investment in data collection.
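    A toy version of the idea, scoring a project's probability of non-compliance from the occurrence probabilities of expert-defined features, can be written in a few lines. The network structure, probability table, and feature probabilities below are hypothetical placeholders, not the expert-elicited BBN from the study.

```python
# Toy Bayesian-network-style sketch (hypothetical structure and numbers): two binary
# project features influence the probability of non-compliance; a new project is
# scored by marginalizing over its feature occurrence probabilities.

# Expert-style conditional probability table: P(non-compliance | f1, f2)
p_noncompliance = {
    (0, 0): 0.05,
    (0, 1): 0.20,
    (1, 0): 0.30,
    (1, 1): 0.70,
}

# Occurrence probabilities of the two features for a new project proposal
p_f1, p_f2 = 0.4, 0.6

# Marginalize over feature states to get the overall risk of non-compliance
risk = sum(
    p_noncompliance[(f1, f2)]
    * (p_f1 if f1 else 1 - p_f1)
    * (p_f2 if f2 else 1 - p_f2)
    for f1 in (0, 1)
    for f2 in (0, 1)
)
print(f"P(non-compliance) = {risk:.2f}")
```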

  13. 2007 Lunar Regolith Simulant Workshop Overview

    NASA Technical Reports Server (NTRS)

    McLemore, Carole A.; Fikes, John C.; Howell, Joe T.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) vision has as a cornerstone, the establishment of an Outpost on the Moon. This Lunar Outpost will eventually provide the necessary planning, technology development, and training for a manned mission to Mars in the future. As part of the overall activity, NASA is conducting Earth-based research and advancing technologies to a Technology Readiness Level (TRL) 6 maturity under the Exploration Technology Development Program that will be incorporated into the Constellation Project as well as other projects. All aspects of the Lunar environment, including the Lunar regolith and its properties, are important in understanding the long-term impacts to hardware, scientific instruments, and humans prior to returning to the Moon and living on the Moon. With the goal of reducing risk to humans and hardware and increasing mission success on the Lunar surface, it is vital that terrestrial investigations including both development and verification testing have access to Lunar-like environments. The Marshall Space Flight Center (MSFC) is supporting this endeavor by developing, characterizing, and producing Lunar simulants in addition to analyzing existing simulants for appropriate applications. A Lunar Regolith Simulant Workshop was conducted by MSFC in Huntsville, Alabama, in October 2007. The purpose of the Workshop was to bring together simulant developers, simulant users, and program and project managers from ETDP and Constellation with the goals of understanding users' simulant needs and their applications. A status of current simulant developments such as the JSC-1A (Mare Type Simulant) and the NASA/U.S. Geological Survey Lunar Highlands-Type Pilot Simulant (NU-LHT-1M) was provided. The method for evaluating simulants, performed via Figures of Merit (FoMs) algorithms, was presented and a demonstration was provided. The four FoM properties currently being assessed are: size, shape, density, and composition. Some of the Workshop findings include: simulant developers must understand simulant users' needs and applications; higher fidelity simulants are needed and needed in larger quantities now; simulants must be characterized to allow "apples-to-apples" comparison of test results; simulant users should confer with simulant experts to assist them in the selection of simulants; safety precautions should be taken in the handling and use of simulants; shipping, storing, and preparation of simulants have important implications; and most importantly, close communications among the simulant community must be maintained and will be continued via telecoms, meetings, and an annual Lunar Regolith Simulant Workshop.

  15. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  16. Planning a Study for Testing the Rasch Model given Missing Values due to the use of Test-booklets.

    PubMed

    Yanagida, Takuya; Kubinger, Klaus D; Rasch, Dieter

    2015-01-01

    Though calibration of an achievement test in psychological and educational contexts is very often carried out with the Rasch model, data sampling is rarely designed according to statistical foundations. However, Kubinger, Rasch, and Yanagida (2009, 2011) suggested an approach for determining the sample size according to a given Type-I and Type-II risk and a certain effect of model contradiction when testing the Rasch model. The approach uses a three-way analysis of variance design with mixed classification. So far, their simulation studies have dealt with complete data, meaning every examinee is administered all of the items of an item pool. The simulation study presented in this paper deals with the practically relevant case, in particular for large-scale assessments, in which items are presented in several test-booklets. As a consequence, there are missing values by design. The question considered is therefore whether the approach works in this case as well. Besides the fact that the data are not normally distributed but dichotomous (an examinee either solves an item or fails to solve it), at most a single entry exists for each cell of the given three-way analysis of variance design because of the missing values. Hence, the obligatory test statistic's distribution may not be retained, in contrast to the case of having no missing values. The result of our simulation study, although it applies only to a very specific scenario, is that the approach does indeed work: whether test-booklets are used or every examinee is administered all of the items changes nothing with respect to the actual Type-I risk or the power of the test, given almost the same amount of information from examinees per item. However, as the results are limited to a specific scenario, we currently recommend that interested researchers simulate the appropriate scenario for their own study in advance.

  17. Methodology for Evaluation of Technology Impacts in Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advances in space technologies could benefit future NASA missions, and it supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage and will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and used its capabilities for the EPS model, and also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions; we also had to define the simulation space and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution for this framework.
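    As a rough illustration of the kind of framework described here (technology characteristics expressed as probability distributions and propagated through a Monte Carlo simulation to give a probabilistic assessment), the sketch below samples hypothetical technology parameters and reports percentiles of a simple figure of merit. The parameter names, distributions, and the toy system model are invented for illustration and do not reproduce the GRC tool or the @Risk models.

```python
# Minimal Monte Carlo sketch for technology impact on an electric power system
# figure of merit (specific power, W/kg). Distributions, weights, and the
# aggregation formula are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo samples

# Hypothetical technology characteristics as triangular distributions (min, mode, max).
cell_efficiency     = rng.triangular(0.26, 0.30, 0.34, N)   # solar cell efficiency
array_areal_mass    = rng.triangular(1.2, 1.6, 2.2, N)      # kg/m^2
battery_spec_energy = rng.triangular(120, 160, 220, N)      # Wh/kg

solar_constant = 1361.0   # W/m^2 at 1 AU
eclipse_fraction = 0.35   # assumed fraction of the orbit spent in eclipse

# Simple (illustrative) system model: array power per unit area versus the mass
# of the array plus batteries sized to ride through a 1.5 h eclipse (assumed).
array_power = solar_constant * cell_efficiency                                 # W per m^2
battery_mass = array_power * eclipse_fraction * 1.5 / battery_spec_energy      # kg per m^2
specific_power = array_power / (array_areal_mass + battery_mass)               # W/kg

for q in (5, 50, 95):
    print(f"{q:2d}th percentile specific power: {np.percentile(specific_power, q):6.1f} W/kg")
```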

  18. III: Use of biomarkers as Risk Indicators in Environmental Risk Assessment of oil based discharges offshore.

    PubMed

    Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M

    2017-06-01

    Offshore oil and gas activities are required not to cause adverse environmental effects, and risk-based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters used to monitor the development of risk-affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for management of oil-based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data, and assessment methods. Data show several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways that these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system when linked with whole-organism responses are discussed. This can be a useful approach to integrating biomarkers into probabilistic risk assessment related to oil-based discharges, representing a potential supplement to the information that biomarkers already provide about environmental impact and risk related to these kinds of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. How are flood risk estimates affected by the choice of return-periods?

    NASA Astrophysics Data System (ADS)

    Ward, P. J.; de Moel, H.; Aerts, J. C. J. H.

    2011-12-01

    Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve), and to estimate risk as the area under the curve. However, there is little insight into how the selection of return periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled this to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€ 34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with overestimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor two difference) on risk estimates. Also, the minimum and maximum return period considered in the curve affects the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D and 3-D hydrodynamic models. They also suggest that research into flood risk could benefit from paying more attention to the damage caused by relatively high-probability floods.
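    The risk measure used here, the area under the exceedance probability-loss curve, can be written as the expected annual damage EAD = ∫ D(p) dp over exceedance probabilities p and approximated by trapezoidal integration over the simulated return periods. The sketch below uses hypothetical damage figures (not the Meuse results) to show the calculation; one reason a coarse curve can overestimate, qualitatively in line with the finding above, is that linear interpolation lies above a damage curve that is convex in p.

```python
# Minimal sketch: expected annual damage (EAD) as the area under the exceedance
# probability-loss curve, approximated with the trapezoidal rule. Return periods
# and damages are hypothetical, not the Meuse case-study values.
import numpy as np

def ead(return_periods, damages):
    """Trapezoidal integration of damage over exceedance probability."""
    p = 1.0 / np.asarray(return_periods, dtype=float)   # exceedance probabilities
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                                # integrate from low to high probability
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Hypothetical risk curve: damage (million EUR) for a range of return periods (yr).
rp_full  = [2, 5, 10, 25, 50, 100, 250, 500, 1000, 10000]
dmg_full = [0, 5, 20, 60, 110, 180, 300, 420, 560, 900]

rp_coarse  = [10, 100, 1000]      # only three return periods
dmg_coarse = [20, 180, 560]

print(f"EAD, 10 return periods: {ead(rp_full, dmg_full):.1f} M EUR/yr")
print(f"EAD,  3 return periods: {ead(rp_coarse, dmg_coarse):.1f} M EUR/yr")
```

    With these illustrative numbers, the three-point curve yields a higher EAD than the ten-point curve, mirroring the direction of the effect reported in the record, though the size and even the sign of the difference depend on which return periods are chosen.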

  20. Lava flow risk maps at Mount Cameroon volcano

    NASA Astrophysics Data System (ADS)

    Favalli, M.; Fornaciai, A.; Papale, P.; Tarquini, S.

    2009-04-01

    Mount Cameroon, in southwest Cameroon, is one of the most active volcanoes in Africa. Rising 4095 m asl, it has erupted nine times since the beginning of the past century, most recently in 1999 and 2000. Documented eruptions at Mount Cameroon comprise moderate explosive and effusive eruptions from both summit and flank vents. A 1922 SW-flank eruption produced a lava flow that reached the Atlantic coast near the village of Biboundi, and a lava flow from a 1999 south-flank eruption stopped only 200 m from the sea, threatening the villages of Bakingili and Dibunscha. More than 450,000 people live or work around the volcano, making the risk from lava flow invasion a great concern. In this work we propose both conventional hazard and risk maps and novel quantitative risk maps that relate vent locations to the expected total damage to existing buildings. These maps are based on lava flow simulations starting from 70,000 different vent locations, a probability distribution of vent opening, a law for the maximum length of lava flows, and a database of buildings. The simulations were run over the SRTM Digital Elevation Model (DEM) using DOWNFLOW, a fast DEM-driven model able to compute detailed invasion areas of lava flows from each vent. We present three different types of risk maps (90-m pixels) for buildings around Mount Cameroon volcano: (1) a conventional risk map that assigns a probability of devastation by lava flows to each pixel containing buildings; (2) a reversed risk map in which each pixel expresses the total damage expected as a consequence of a vent opening in that pixel (the damage is expressed as the total surface of urbanized areas invaded); and (3) maps of the lava catchments of the main towns around the volcano, in which, within every catchment, the pixels are classified according to the expected impact on the respective town should a vent open in that pixel. Maps of types (1) and (3) are useful for long-term planning. Maps of types (2) and (3) are useful at the onset of a new eruption, when a vent forms. The combined use of these maps provides an efficient tool for lava flow risk assessment at Mount Cameroon.
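    A compact way to read the first two map types is as different aggregations of the same vent-by-vent simulation output: with P(v) the probability that the next vent opens at cell v and I(v, x) indicating whether a flow from v reaches cell x, the conventional map gives each building cell x the value sum_v P(v)·I(v, x), while the reversed map gives each candidate vent cell v the urbanized area sum_x A(x)·I(v, x) it would invade. The sketch below illustrates only this bookkeeping on a synthetic grid; the vent probabilities, flow footprints, and building map are random placeholders, and the DOWNFLOW flow-routing model itself is not reproduced.

```python
# Minimal sketch of the risk-map bookkeeping: combine per-vent lava inundation
# footprints with a vent-opening probability map and a building map.
# The inundation footprints are synthetic; DOWNFLOW itself is not reproduced.
import numpy as np

rng = np.random.default_rng(1)
n_cells = 100    # flattened grid cells
n_vents = 500    # candidate vent locations (cells)

p_vent = rng.dirichlet(np.ones(n_vents))                        # probability of vent opening at each vent cell
buildings = (rng.random(n_cells) < 0.2).astype(float)           # cells containing buildings (1 cell = 1 area unit)
invaded = (rng.random((n_vents, n_cells)) < 0.05).astype(float) # I(v, x): flow from vent v reaches cell x (synthetic)

# (1) Conventional risk map: probability that a building cell is invaded by the next flow.
risk_map = (p_vent @ invaded) * (buildings > 0)

# (2) Reversed risk map: urbanized area damaged if a vent opens at cell v (units: building cells).
reversed_map = invaded @ buildings

print("max invasion probability on a building cell:", round(float(risk_map.max()), 4))
print("worst-case vent damages", int(reversed_map.max()), "building cells")
```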

  1. River flood risk in Jakarta under scenarios of future change

    NASA Astrophysics Data System (ADS)

    Budiyono, Yus; Aerts, Jeroen C. J. H.; Tollenaar, Daniel; Ward, Philip J.

    2016-03-01

    Given the increasing impacts of flooding in Jakarta, methods for assessing current and future flood risk are required. In this paper, we use the Damagescanner-Jakarta risk model to project changes in future river flood risk under scenarios of climate change, land subsidence, and land use change. Damagescanner-Jakarta is a simple flood risk model that estimates flood risk in terms of annual expected damage, based on input maps of flood hazard, exposure, and vulnerability. We estimate baseline flood risk at USD 186 million p.a. Combining all future scenarios, we simulate a median increase in risk of +180 % by 2030. The single driver with the largest contribution to that increase is land subsidence (+126 %). We simulated the impacts of climate change by combining two scenarios of sea level rise with simulations of changes in 1-day extreme precipitation totals from five global climate models (GCMs) forced by the four Representative Concentration Pathways (RCPs). The results are highly uncertain; the median change in risk due to climate change alone by 2030 is a decrease of 46 %, but we simulate an increase in risk under 12 of the 40 GCM-RCP-sea level rise combinations. Hence, we developed probabilistic risk scenarios to account for this uncertainty. If land use change by 2030 takes place according to the official Jakarta Spatial Plan 2030, risk could be reduced by 12 %. However, if land use change in the future continues at the same rate as the last 30 years, large increases in flood risk will take place. Finally, we discuss the relevance of the results for flood risk management in Jakarta.
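    For orientation, a Damagescanner-style calculation combines, cell by cell, a maximum damage value for the land-use class with a depth-damage fraction taken from a vulnerability curve; annual expected damage then follows by integrating the resulting event damages over exceedance probabilities, as in the return-period entry above. The sketch below shows only the per-event damage step on a tiny synthetic grid; the land-use classes, maximum damages, and depth-damage curve are hypothetical and are not the Damagescanner-Jakarta values.

```python
# Minimal sketch of a Damagescanner-style damage calculation: damage per cell =
# maximum damage for the land-use class x depth-damage fraction(depth).
# Land-use classes, maximum damages, and the depth-damage curve are hypothetical.
import numpy as np

max_damage = {0: 0.0, 1: 100.0, 2: 400.0}   # USD/m^2 by land-use class (0=open, 1=residential, 2=commercial)

def depth_damage_fraction(depth_m):
    """Assumed piecewise-linear vulnerability curve: 0 at 0 m, 1 at 3 m and above."""
    return np.clip(np.asarray(depth_m, dtype=float) / 3.0, 0.0, 1.0)

def flood_damage(depth, landuse, cell_area_m2=100.0):
    """Total damage (USD) for one inundation map."""
    maxd = np.vectorize(max_damage.get)(landuse)
    return float(np.sum(maxd * depth_damage_fraction(depth) * cell_area_m2))

# Tiny synthetic example: a 3x3 inundation map (m) and land-use map.
depth = np.array([[0.0, 0.5, 1.0],
                  [0.2, 1.5, 2.5],
                  [0.0, 0.0, 3.5]])
landuse = np.array([[0, 1, 1],
                    [1, 2, 2],
                    [0, 1, 2]])
print(f"event damage: {flood_damage(depth, landuse):,.0f} USD")
```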

  2. On the Need for Multidimensional Stirling Simulations

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Given the cost and complication of simulating Stirling convertors, do we really need multidimensional modeling when one-dimensional capabilities exist? This paper provides a comprehensive description of when and why multidimensional simulation is needed.

  3. Surrogate safety measures from traffic simulation models

    DOT National Transportation Integrated Search

    2003-01-01

    This project investigates the potential for deriving surrogate measures of safety from existing microscopic traffic simulation models for intersections. The process of computing the measures in the simulation, extracting the required data, and summar...

  4. A review of training research and virtual reality simulators for the da Vinci surgical system.

    PubMed

    Liu, May; Curet, Myriam

    2015-01-01

    PHENOMENON: Virtual reality simulators are the subject of several recent studies of skills training for robot-assisted surgery. Yet no consensus exists regarding what a core skill set comprises or how to measure skill performance. Defining a core skill set and relevant metrics would help surgical educators evaluate different simulators. This review draws from published research to propose a core technical skill set for using the da Vinci surgeon console. Publications on three commercial simulators were used to evaluate the simulators' content addressing these skills and associated metrics. An analysis of published research suggests that a core technical skill set for operating the surgeon console includes bimanual wristed manipulation, camera control, master clutching to manage hand position, use of third instrument arm, activating energy sources, appropriate depth perception, and awareness of forces applied by instruments. Validity studies of three commercial virtual reality simulators for robot-assisted surgery suggest that all three have comparable content and metrics. However, none have comprehensive content and metrics for all core skills. INSIGHTS: Virtual reality simulation remains a promising tool to support skill training for robot-assisted surgery, yet existing commercial simulator content is inadequate for performing and assessing a comprehensive basic skill set. The results of this evaluation help identify opportunities and challenges that exist for future developments in virtual reality simulation for robot-assisted surgery. Specifically, the inclusion of educational experts in the development cycle alongside clinical and technological experts is recommended.

  5. Analogs of microgravity: head-down tilt and water immersion.

    PubMed

    Watenpaugh, Donald E

    2016-04-15

    This article briefly reviews the fidelity of ground-based methods used to simulate human existence in weightlessness (spaceflight). These methods include horizontal bed rest (BR), head-down tilt bed rest (HDT), head-out water immersion (WI), and head-out dry immersion (DI; immersion with an impermeable elastic cloth barrier between subject and water). Among these, HDT has become by far the most commonly used method, especially for longer studies. DI is less common but well accepted for long-duration studies. Very few studies exist that attempt to validate a specific simulation mode against actual microgravity. Many fundamental physical, and thus physiological, differences exist between microgravity and our methods to simulate it, and between the different methods. Also, although weightlessness is the salient feature of spaceflight, several ancillary factors of space travel complicate Earth-based simulation. In spite of these discrepancies and complications, the analogs duplicate many responses to 0 G reasonably well. As we learn more about responses to microgravity and spaceflight, investigators will continue to fine-tune simulation methods to optimize accuracy and applicability. Copyright © 2016 the American Physiological Society.

  6. The Impact of Distraction on the Driving Performance of Adolescents with and without Attention Deficit Hyperactivity Disorder

    PubMed Central

    Narad, Megan; Garner, Annie A.; Brassell, Anne A.; Saxby, Dyani; Antonini, Tanya N.; O'Brien, Kathleen M.; Tamm, Leanne; Matthews, Gerald; Epstein, Jeffery N.

    2013-01-01

    Importance: This study extends the literature regarding Attention-Deficit/Hyperactivity Disorder (ADHD)-related driving impairments to a newly licensed, adolescent population. Objective: To investigate the combined risks of adolescence, ADHD, and distracted driving (cell phone conversation and text messaging) on driving performance. Design: Adolescents with and without ADHD engaged in a simulated drive under three conditions (no distraction, cell phone conversation, texting). During each condition, one unexpected event (e.g., a car suddenly merging into the driver's lane) was introduced. Setting: Driving simulator. Participants: Adolescents aged 16–17 with ADHD (n=28) and controls (n=33). Interventions/Main Exposures: Cell phone conversation, texting, and no distraction while driving. Outcome Measures: Self-report of driving history; average speed, standard deviation of speed, standard deviation of lateral position, and braking reaction time during driving simulation. Results: Adolescents with ADHD reported fewer months of driving experience and a higher proportion of driving violations than controls. After controlling for months of driving history, adolescents with ADHD demonstrated more variability in speed and lane position than controls. There were no group differences for braking reaction time. Further, texting negatively impacted the driving performance of all participants as evidenced by increased variability in speed and lane position. Conclusions: This study, one of the first to investigate distracted driving in adolescents with ADHD, adds to a growing body of literature documenting that individuals with ADHD are at increased risk for negative driving outcomes. Furthermore, texting significantly impairs the driving performance of all adolescents and increases existing driving-related impairment in adolescents with ADHD, highlighting the need for education and enforcement of regulations against texting for this age group. PMID:23939758

  7. A Medical Interviewing Curriculum Intervention for Medical Students' Assessment of Suicide Risk

    ERIC Educational Resources Information Center

    Fiedorowicz, Jess G.; Tate, Jodi; Miller, Anthony C.; Franklin, Ellen M.; Gourley, Ryan; Rosenbaum, Marcy

    2013-01-01

    Objective: Effective communication strategies are required to assess suicide risk. The authors determined whether a 2-hour simulated-patient activity during a psychiatry clerkship improved self-assessment of medical interviewing skills relevant to suicide risk-assessment. Methods: In the 2-hour simulated-patient intervention, at least one…

  8. Simulating the Risk of Investment in Human Capital

    ERIC Educational Resources Information Center

    Hartog, Joop; Van Ophem, Hans; Bajdechi, Simona Maria

    2007-01-01

    The risk of investment in schooling has largely been ignored. We mimic the investment decision facing a student and simulate risky earnings profiles in alternative options, with parameters taken from the very limited evidence. The distribution of rates of return appears positively skewed. Our best estimate of "ex ante" risk in university education…

  9. 33 CFR 83.07 - Risk of collision (Rule 7).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... exists. If there is any doubt such risk shall be deemed to exist. (b) Radar. Proper use shall be made of radar equipment if fitted and operational, including long-range scanning to obtain early warning of risk of collision and radar plotting or equivalent systematic observation of detected objects. (c) Scanty...

  10. 33 CFR 83.07 - Risk of collision (Rule 7).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exists. If there is any doubt such risk shall be deemed to exist. (b) Radar. Proper use shall be made of radar equipment if fitted and operational, including long-range scanning to obtain early warning of risk of collision and radar plotting or equivalent systematic observation of detected objects. (c) Scanty...

  11. 33 CFR 83.07 - Risk of collision (Rule 7).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... exists. If there is any doubt such risk shall be deemed to exist. (b) Radar. Proper use shall be made of radar equipment if fitted and operational, including long-range scanning to obtain early warning of risk of collision and radar plotting or equivalent systematic observation of detected objects. (c) Scanty...

  12. 33 CFR 83.07 - Risk of collision (Rule 7).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... exists. If there is any doubt such risk shall be deemed to exist. (b) Radar. Proper use shall be made of radar equipment if fitted and operational, including long-range scanning to obtain early warning of risk of collision and radar plotting or equivalent systematic observation of detected objects. (c) Scanty...

  13. 33 CFR 83.07 - Risk of collision (Rule 7).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... exists. If there is any doubt such risk shall be deemed to exist. (b) Radar. Proper use shall be made of radar equipment if fitted and operational, including long-range scanning to obtain early warning of risk of collision and radar plotting or equivalent systematic observation of detected objects. (c) Scanty...

  14. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets, and very advanced methods are needed to measure and model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, "Greek" ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
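    To make the valuation side concrete, the sketch below prices a European call option both with the closed-form Black-Scholes formula and with a plain Monte Carlo simulation of the terminal stock price under the risk-neutral measure; the two estimates agree to within the Monte Carlo sampling error. The parameter values are arbitrary illustrative choices.

```python
# Minimal sketch: European call priced with Black-Scholes and with Monte Carlo
# under the risk-neutral measure. Parameters are arbitrary illustrative values.
import numpy as np
from math import log, sqrt, exp
from statistics import NormalDist

S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.25, 1.0   # spot, strike, rate, volatility, maturity

# Closed-form Black-Scholes price.
d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
d2 = d1 - sigma * sqrt(T)
N = NormalDist().cdf
bs_price = S0 * N(d1) - K * exp(-r * T) * N(d2)

# Monte Carlo price: simulate terminal prices of geometric Brownian motion.
rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * z)
mc_price = exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

print(f"Black-Scholes: {bs_price:.4f}   Monte Carlo: {mc_price:.4f}")
```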

  15. Simulation of flashing signal operations.

    DOT National Transportation Integrated Search

    1982-01-01

    Various guidelines that have been proposed for the operation of traffic signals in the flashing mode were reviewed. The use of existing traffic simulation procedures to evaluate flashing signals was examined and a study methodology for simulating and...

  16. Assessment on the pedestrian risk during floods based on numerical simulation - A case study in Jinan City

    NASA Astrophysics Data System (ADS)

    Cheng, T.; Xu, Z.; Hong, S.

    2017-12-01

    Flood disasters have frequently struck the urban area of Jinan City in past years, and the city faces severe road flooding that greatly threatens pedestrians' safety. It is therefore of great significance to investigate pedestrian risk during floods under the specific topographic conditions. In this study, a model coupling hydrological and hydrodynamic processes is developed for the study area to simulate the flood routing process on roads during the "7.18" rainstorm and is validated against post-disaster damage survey information. Pedestrian risk is then estimated with a flood risk assessment model. The results show that the coupled model reproduces the rainstorm flood process well. On the basis of the simulation results, areas of extreme, medium, and mild risk are identified. Regions with high risk are generally located near the mountain-front area with steep slopes. This study provides scientific support for flood control and disaster reduction in Jinan City.
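    The record does not state how pedestrian risk is classified in the Jinan model, but a common convention in flood-hazard work is to grade hazard by the product of water depth and flow velocity. The sketch below uses that convention with assumed thresholds purely as an illustration of how simulated depth and velocity fields can be turned into mild/medium/extreme risk classes.

```python
# Minimal sketch of a pedestrian flood-hazard classification using the common
# depth x velocity hazard product. Thresholds are illustrative and are not the
# criteria used in the Jinan study.
import numpy as np

def pedestrian_risk_class(depth_m, velocity_ms):
    """Classify cells as 0=mild, 1=medium, 2=extreme risk from the D*V product."""
    dv = np.asarray(depth_m) * np.asarray(velocity_ms)
    return np.digitize(dv, bins=[0.4, 0.8])   # thresholds in m^2/s (assumed)

depth = np.array([0.1, 0.3, 0.6, 1.0])       # simulated water depth (m)
velocity = np.array([0.5, 1.0, 1.5, 2.0])    # simulated flow velocity (m/s)
print(pedestrian_risk_class(depth, velocity))  # -> [0 0 2 2]
```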

  17. Application of wildfire simulation models for risk analysis

    Treesearch

    Alan A. Ager; Mark A. Finney

    2009-01-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of...

  18. Multi-Agent Social Simulation

    NASA Astrophysics Data System (ADS)

    Noda, Itsuki; Stone, Peter; Yamashita, Tomohisa; Kurumatani, Koichi

    While ambient intelligence and smart environments (AISE) technologies are expected to have large impacts on human lives and social activities, it is generally difficult to demonstrate the utility and effects of these technologies on societies. AISE technologies are not only methods to improve the performance and functionality of existing services in society, but also frameworks for introducing new systems and services to society. For example, no one anticipated beforehand what the Internet or mobile phones would bring to our social activities and services, yet they changed our social systems and patterns of behavior drastically and gave rise to new services (and, unfortunately, new risks). The main reason for this difficulty is that the actual effects of IT systems appear only when enough people in the society use the technologies.

  19. Hydrogen sulfide emission in sewer networks: a two-phase modeling approach to the sulfur cycle.

    PubMed

    Yongsiri, C; Vollertsen, J; Hvitved-Jacobsen, T

    2004-01-01

    Wherever transport of anaerobic wastewater occurs, potential problems associated with hydrogen sulfide in relation to odor nuisance, health risk and corrosion exist. Improved understanding of prediction of hydrogen sulfide emission into the sewer atmosphere is needed for better evaluation of such problems in sewer networks. A two-phase model for emission of hydrogen sulfide along stretches of gravity sewers is presented to estimate the occurrence of both sulfide in the water phase and hydrogen sulfide in the sewer atmosphere. The model takes into account air-water mass transfer of hydrogen sulfide and interactions with other processes in the sulfur cycle. Various emission scenarios are simulated to illustrate the release characteristics of hydrogen sulfide.
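    The air-water transfer term that such a model accounts for is commonly written in two-film form, with the emission flux driven by the departure of the dissolved sulfide concentration from equilibrium with the sewer atmosphere. The sketch below integrates just that single process along a gravity-sewer stretch using assumed round-number coefficients; sulfide production, oxidation, and the other sulfur-cycle processes in the full model are omitted.

```python
# Minimal sketch of two-film air-water transfer of H2S along a gravity sewer
# stretch (forward Euler in time-of-travel). Coefficients are assumed round
# numbers; the other sulfur-cycle processes in the full model are omitted.

KLa = 1.0          # overall mass-transfer coefficient x specific area (1/h), assumed
H_cc = 0.39        # dimensionless Henry's constant for H2S (gas/liquid), approximate
Vg_over_Vl = 2.0   # ratio of headspace volume to water volume, assumed

c_l, c_g = 2.0, 0.0   # dissolved and gas-phase H2S (g S/m3), initial values (assumed)
dt = 0.01             # time step (h)
n_steps = 200         # 2 h of travel time

for _ in range(n_steps):
    flux = KLa * (c_l - c_g / H_cc)    # g S per m3 of water per hour
    c_l -= flux * dt
    c_g += flux * dt / Vg_over_Vl      # mass is conserved between the two phases

print(f"after {n_steps * dt:.1f} h: dissolved {c_l:.2f}, gas phase {c_g:.2f} g S/m3")
```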

  20. Visualizing the Heterogeneity of Effects in the Analysis of Associations of Multiple Myeloma with Glyphosate Use. Comments on Sorahan, T. Multiple Myeloma and Glyphosate Use: A Re-Analysis of US Agricultural Health Study (AHS) Data. Int. J. Environ. Res. Public Health 2015, 12, 1548-1559.

    PubMed

    Burstyn, Igor; De Roos, Anneclaire J

    2016-12-22

    We address a methodological issue of the evaluation of the difference in effects in epidemiological studies that may arise, for example, from stratum-specific analyses or differences in analytical decisions during data analysis. We propose a new simulation-based method to quantify the plausible extent of such heterogeneity, rather than testing a hypothesis about its existence. We examine the contribution of the method to the debate surrounding risk of multiple myeloma and glyphosate use and propose that its application contributes to a more balanced weighting of evidence.
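    The record does not spell out the simulation procedure, so the sketch below is only one plausible reading of it: draw plausible values of two stratum-specific log rate ratios from their point estimates and standard errors, and summarize the resulting distribution of their ratio instead of testing whether heterogeneity exists. The numbers are hypothetical and are not the AHS estimates.

```python
# Minimal sketch (an assumed reading of the approach, not the authors' code):
# simulate plausible values of two stratum-specific log rate ratios from their
# estimates and standard errors, and summarize the distribution of their ratio.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 100_000

# Hypothetical stratum-specific results: log rate ratio and standard error.
log_rr_a, se_a = np.log(1.2), 0.25
log_rr_b, se_b = np.log(2.0), 0.45

draws_a = rng.normal(log_rr_a, se_a, n_sim)
draws_b = rng.normal(log_rr_b, se_b, n_sim)
ratio_of_rr = np.exp(draws_b - draws_a)   # plausible ratio of the two rate ratios

lo, med, hi = np.percentile(ratio_of_rr, [2.5, 50, 97.5])
print(f"ratio of rate ratios: median {med:.2f}, 95% interval {lo:.2f}-{hi:.2f}")
```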

  1. Visualizing the Heterogeneity of Effects in the Analysis of Associations of Multiple Myeloma with Glyphosate Use. Comments on Sorahan, T. Multiple Myeloma and Glyphosate Use: A Re-Analysis of US Agricultural Health Study (AHS) Data. Int. J. Environ. Res. Public Health 2015, 12, 1548–1559

    PubMed Central

    Burstyn, Igor; De Roos, Anneclaire J.

    2016-01-01

    We address a methodological issue of the evaluation of the difference in effects in epidemiological studies that may arise, for example, from stratum-specific analyses or differences in analytical decisions during data analysis. We propose a new simulation-based method to quantify the plausible extent of such heterogeneity, rather than testing a hypothesis about its existence. We examine the contribution of the method to the debate surrounding risk of multiple myeloma and glyphosate use and propose that its application contributes to a more balanced weighting of evidence. PMID:28025514

  2. Diagnostic evaluation of the Community Earth System Model in simulating mineral dust emission with insight into large-scale dust storm mobilization in the Middle East and North Africa (MENA)

    NASA Astrophysics Data System (ADS)

    Parajuli, Sagar Prasad; Yang, Zong-Liang; Lawrence, David M.

    2016-06-01

    Large amounts of mineral dust are injected into the atmosphere during dust storms, which are common in the Middle East and North Africa (MENA) where most of the global dust hotspots are located. In this work, we present simulations of dust emission using the Community Earth System Model Version 1.2.2 (CESM 1.2.2) and evaluate how well it captures the spatio-temporal characteristics of dust emission in the MENA region with a focus on large-scale dust storm mobilization. We explicitly focus our analysis on the model's two major input parameters that affect the vertical mass flux of dust: surface winds and the soil erodibility factor. We analyze dust emissions in simulations with both prognostic CESM winds and with CESM winds that are nudged towards ERA-Interim reanalysis values. Simulations with three existing erodibility maps and a new observation-based erodibility map are also conducted. We compare the simulated results with MODIS satellite data, MACC reanalysis data, AERONET station data, and CALIPSO 3-D aerosol profile data. The dust emission simulated by CESM, when driven by nudged reanalysis winds, compares reasonably well with observations on daily to monthly time scales despite CESM being a global General Circulation Model. However, considerable bias exists around known high dust source locations in northwest/northeast Africa and over the Arabian Peninsula where recurring large-scale dust storms are common. The new observation-based erodibility map, which can represent anthropogenic dust sources that are not directly represented by existing erodibility maps, shows improved performance in terms of the simulated dust optical depth (DOD) and aerosol optical depth (AOD) compared to existing erodibility maps, although the performance of different erodibility maps varies by region.

  3. Effect of Blast-Induced Vibration from New Railway Tunnel on Existing Adjacent Railway Tunnel in Xinjiang, China

    NASA Astrophysics Data System (ADS)

    Liang, Qingguo; Li, Jie; Li, Dewu; Ou, Erfeng

    2013-01-01

    The vibrations of existing service tunnels induced by blast-excavation of adjacent tunnels have attracted much attention from both academics and engineers during recent decades in China. The blasting vibration velocity (BVV) is the most widely used controlling index for in situ monitoring and safety assessment of existing lining structures. Although numerous in situ tests and simulations had been carried out to investigate blast-induced vibrations of existing tunnels due to excavation of new tunnels (mostly by bench excavation method), research on the overall dynamical response of existing service tunnels in terms of not only BVV but also stress/strain seemed limited for new tunnels excavated by the full-section blasting method. In this paper, the impacts of blast-induced vibrations from a new tunnel on an existing railway tunnel in Xinjiang, China were comprehensively investigated by using laboratory tests, in situ monitoring and numerical simulations. The measured data from laboratory tests and in situ monitoring were used to determine the parameters needed for numerical simulations, and were compared with the calculated results. Based on the results from in situ monitoring and numerical simulations, which were consistent with each other, the original blasting design and corresponding parameters were adjusted to reduce the maximum BVV, which proved to be effective and safe. The effect of both the static stress before blasting vibrations and the dynamic stress induced by blasting on the total stresses in the existing tunnel lining is also discussed. The methods and related results presented could be applied in projects with similar ground and distance between old and new tunnels if the new tunnel is to be excavated by the full-section blasting method.

  4. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can adversely affect the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions, and contrails. To ensure that these new models are beneficial to the larger climate research community, their outputs are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.

  5. Reconstructing gravitational wave source parameters via direct comparisons to numerical relativity I: Method

    NASA Astrophysics Data System (ADS)

    Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Shoemaker, Deirdre; Lovelace, Geoffrey; Scheel, Mark; Ossokine, Serguei

    2016-03-01

    In this talk, we describe a procedure to reconstruct the parameters of sufficiently massive coalescing compact binaries via direct comparison with numerical relativity simulations. For sufficiently massive sources, existing numerical relativity simulations are long enough to cover the observationally accessible part of the signal. Due to the signal's brevity, the posterior parameter distribution it implies is broad, simple, and easily reconstructed from information gained by comparing to only the sparse sample of existing numerical relativity simulations. We describe how followup simulations can corroborate and improve our understanding of a detected source. Since our method can include all physics provided by full numerical relativity simulations of coalescing binaries, it provides a valuable complement to alternative techniques which employ approximations to reconstruct source parameters. Supported by NSF Grant PHY-1505629.

  6. BeeSim: Leveraging Wearable Computers in Participatory Simulations with Young Children

    ERIC Educational Resources Information Center

    Peppler, Kylie; Danish, Joshua; Zaitlen, Benjamin; Glosson, Diane; Jacobs, Alexander; Phelps, David

    2010-01-01

    New technologies have enabled students to become active participants in computational simulations of dynamic and complex systems (called Participatory Simulations), providing a "first-person" perspective on complex systems. However, most existing Participatory Simulations have targeted older children, teens, and adults, assuming that such concepts…

  7. Digital Simulation and Modelling.

    ERIC Educational Resources Information Center

    Hawthorne, G. B., Jr.

    A basically tutorial point of view is taken in this general discussion. The author examines the basic concepts and principles of simulation and modelling and the application of digital computers to these tasks. Examples of existing simulations, a discussion of the applicability and feasibility of simulation studies, a review of simulation…

  8. Measurement of the influences relating to anthropization on the temporal evolution of the gravitational risks and the vulnerability.

    NASA Astrophysics Data System (ADS)

    Lebourg, T.; Llop, R.; Provitolo, D.; Allignol, F.; Zerathe, S.

    2009-04-01

    The objective of this paper is to show the impact of instrumentation of an urban area on the principle of landslide risk prevention, and thus to contribute to decreasing vulnerability for long-term urban development. We show that analysis, through instrumentation, of the triggering factors that characterize the risk (by quantifying the evolution over time of mechanical properties under weathering processes) suggests that a relation exists between landslide susceptibility and urban development. The evolution of the stakes over time is at once a factor in the evolution of susceptibility and a triggering factor in the evolution of the vulnerability of urban areas. The scientific goal concerns modelling the vulnerability and resilience of urban systems with respect to landslide processes, in support of risk prevention. Indeed, implementing an effective risk prevention policy relies on a good evaluation of the intensity, return period, and expansion zone of the phenomena, but also on identifying the sectors exposed to the risks, their vulnerability, and their resilience. A risk prevention strategy generally relies on building defences to protect society, but it can also be founded on the concept of resilience. This alternative approach does not oppose the risk but proposes to reduce its impacts. The anthroposysteme concept makes it possible to take into account the determining role played by human society in the evolution of the spatial system, that is, the natural and social systems associated with a given territory. Studying a spatial system then requires identifying the components of the physical (natural) world and the living (social) world, these two components being integral parts of society. In conclusion, this study applies to the Mediterranean coastline anthroposystemes (northern bank), where urban growth, saturation of the littoral, construction in danger zones, and the dynamics of risks and vulnerabilities strongly overlap. The town of Grasse in the Alpes-Maritimes (France) was selected for this study. This choice is not trivial: the studied sector combines two important characteristics, (I) urbanization has taken place on slopes steeper than 10-20° in an unfavourable geological context (urbanization, risk, and vulnerability are thus in interaction), and (II) an important demographic expansion, both past and to come. A Geographical Information System (ArcGIS) is the common support for this study, bringing together the simulations, observations, and results of the instrumentation carried out on test sites, as well as the landslide simulation models.

  9. Enhancing Earth Observation and Modeling for Tsunami Disaster Response and Management

    NASA Astrophysics Data System (ADS)

    Koshimura, Shunichi; Post, Joachim

    2017-04-01

    In the aftermath of catastrophic natural disasters, such as earthquakes and tsunamis, our society has experienced significant difficulties in assessing disaster impact in a limited amount of time. In recent years, the quality of satellite sensors and access to and use of satellite imagery and services have greatly improved. More and more space agencies have embraced data-sharing policies that facilitate access to archived and up-to-date imagery. Tremendous progress has been achieved through the continuous development of powerful algorithms and software packages to manage and process geospatial data and to disseminate imagery and geospatial datasets in near-real time via geo-web-services, which can be used in disaster-risk management and emergency response efforts. Satellite Earth observations now offer consistent coverage and scope to provide a synoptic overview of large areas, repeated regularly. These can be used to compare risk across different countries, day and night, in all weather conditions, and in trans-boundary areas. At the same time, with the use of modern computing power and advanced sensor networks, great advances in real-time simulation have been achieved. The data and information derived from satellite Earth observations, integrated with in situ information and simulation modeling, provide unique value and the necessary complement to socio-economic data. Emphasis also needs to be placed on ensuring that space-based data and information are used in existing and planned national and local disaster risk management systems, together with other data and information sources, as a way to strengthen the resilience of communities. Through case studies of the 2011 Great East Japan earthquake and tsunami disaster, we aim to discuss how earth observations and modeling, in combination with local, in situ data and information sources, can support the decision-making process before, during, and after a disaster strikes.

  10. Wildfire air pollution hazard during the 21st century

    NASA Astrophysics Data System (ADS)

    Knorr, Wolfgang; Dentener, Frank; Lamarque, Jean-François; Jiang, Leiwen; Arneth, Almut

    2017-07-01

    Wildfires pose a significant risk to human livelihoods and are a substantial health hazard due to emissions of toxic smoke. Previous studies have shown that climate change, increasing atmospheric CO2, and human demographic dynamics can lead to substantially altered wildfire risk in the future, with fire activity increasing in some regions and decreasing in others. The present study re-examines these results from the perspective of air pollution risk, focussing on emissions of airborne particulate matter (PM2.5), combining an existing ensemble of simulations using a coupled fire-dynamic vegetation model with current observation-based estimates of wildfire emissions and simulations with a chemical transport model. Currently, wildfire PM2.5 emissions exceed those from anthropogenic sources in large parts of the world. We further analyse two extreme sets of future wildfire emissions in a socio-economic, demographic climate change context and compare them to anthropogenic emission scenarios reflecting current and ambitious air pollution legislation. In most regions of the world, ambitious reductions of anthropogenic air pollutant emissions have the potential to limit mean annual pollutant PM2.5 levels to comply with World Health Organization (WHO) air quality guidelines for PM2.5. Worst-case future wildfire emissions are not likely to interfere with these annual goals, largely due to fire seasonality, as well as a tendency of wildfire sources to be situated in areas of intermediate population density, as opposed to anthropogenic sources that tend to be highest at the highest population densities. However, during the high-fire season, we find many regions where future PM2.5 pollution levels can reach dangerous levels even for a scenario of aggressive reduction of anthropogenic emissions.

  11. Mechanical characteristics of plastic base Ports and impact on flushing efficacy.

    PubMed

    Guiffant, Gérard; Flaud, Patrice; Royon, Laurent; Burnet, Espérie; Merckx, Jacques

    2017-01-01

    Three types of totally implantable venous access devices (Ports) are currently in use: titanium, plastic (polyoxymethylene, POM), and mixed (a titanium base with a POM shell). Physics theory suggests that the interaction between a non-coring needle (NCN, made of stainless steel) and a plastic base would lead to the stronger material (steel) altering the more malleable material (plastic). The aim was to investigate whether needle impacts can alter a plastic base's surface, thus potentially reducing flushing efficacy. A Port made of POM was punctured 200 times with a 19-gauge NCN. Following the existing guidelines, the needle tip pricked the base with each puncture. The Port's base was then examined using a two-dimensional optical instrument, and a two-dimensional numerical simulation using COMSOL® was performed to investigate potential surface irregularities and their impact on fluid flow. Each needle impact created a hole (mean depth, 0.12 mm) with a small bump beside it (mean height, 0.02 mm); the Reynolds number was Re_k ≈ 10. A numerical simulation of one hole/bump set showed that the flushing efficacy was 60% of that of flushing along a flat surface. In clinical practice, the number of times a Port is punctured depends on patient and treatment characteristics, but each needle impact on the plastic base may increase the risk of decreased flushing effectiveness. Therefore, the more a plastic Port is accessed, the greater the risk of accumulation of microorganisms, blood products, and medication. Multiple needle impacts created an irregular surface on the Port's base, which decreased flushing efficacy. Clinical investigation is needed to determine whether plastic base Ports are associated with an increased risk of Port infection and occlusion compared to titanium base Ports.

  12. Assessment of potential environmental risks of transgene flow in smallholder farming systems in Asia: Brassica napus as a case study in Korea.

    PubMed

    Zhang, Chuan-Jie; Yook, Min-Jung; Park, Hae-Rim; Lim, Soo-Hyun; Kim, Jin-Won; Nah, Gyoungju; Song, Hae-Ryong; Jo, Beom-Ho; Roh, Kyung Hee; Park, Suhyoung; Kim, Do-Soon

    2018-06-02

    The cultivation of genetically modified (GM) crops has raised many questions regarding their environmental risks, particularly about their ecological impact on non-target organisms, such as their closely-related relative species. Although evaluations of transgene flow from GM crops to their conventional crops has been conducted under large-scale farming system worldwide, in particular in North America and Australia, few studies have been conducted under smallholder farming systems in Asia with diverse crops in co-existence. A two-year field study was conducted to assess the potential environmental risks of gene flow from glufosinate-ammonium resistant (GR) Brassica napus to its conventional relatives, B. napus, B. juncea, and Raphanus sativus under simulated smallholder field conditions in Korea. Herbicide resistance and simple sequence repeat (SSR) markers were used to identify the hybrids. Hybridization frequency of B. napus × GR B. napus was 2.33% at a 2 m distance, which decreased to 0.007% at 75 m. For B. juncea, it was 0.076% at 2 m and decreased to 0.025% at 16 m. No gene flow was observed to R. sativus. The log-logistic model described hybridization frequency with increasing distance from GR B. napus to B. napus and B. juncea and predicted that the effective isolation distances for 0.01% gene flow from GR B. napus to B. napus and B. juncea were 122.5 and 23.7 m, respectively. Results suggest that long-distance gene flow from GR B. napus to B. napus and B. juncea is unlikely, but gene flow can potentially occur between adjacent fields where the smallholder farming systems exist. Copyright © 2018. Published by Elsevier B.V.
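    The record names the fitted model family (a log-logistic decline of hybridization frequency with distance) and the isolation distances predicted from it. The sketch below shows one common parameterization of such a curve and how an isolation distance for a target frequency can be solved from it; the parameter values are invented for illustration and are not the fitted values from this field trial.

```python
# Minimal sketch of a log-logistic gene-flow-with-distance model and the
# distance at which hybridization drops to a target frequency. The parameters
# below are illustrative, not the fitted values from the Korean field trial.
import numpy as np

def hybridization(d, f0, d50, b):
    """Log-logistic decline: frequency f0 as d -> 0, halved at d = d50."""
    return f0 / (1.0 + (np.asarray(d, dtype=float) / d50) ** b)

def isolation_distance(target, f0, d50, b):
    """Distance at which the hybridization frequency falls to `target`."""
    return d50 * (f0 / target - 1.0) ** (1.0 / b)

f0, d50, b = 2.33, 4.0, 1.6          # frequency (%) near 0 m, half-distance (m), slope: all assumed
for d in (2, 16, 75):
    print(f"{d:3d} m -> {hybridization(d, f0, d50, b):.3f} %")
print(f"isolation distance for 0.01 %: {isolation_distance(0.01, f0, d50, b):.0f} m")
```

    Inverting the fitted curve in this way is how an "effective isolation distance" for a chosen gene-flow threshold (here 0.01%) can be read off once the curve parameters are estimated from field data.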

  13. A pre-dam-removal assessment of sediment transport for four dams on the Kalamazoo River between Plainwell and Allegan, Michigan

    USGS Publications Warehouse

    Syed, Atiq U.; Bennett, James P.; Rachol, Cynthia M.

    2005-01-01

    Four dams on the Kalamazoo River between the cities of Plainwell and Allegan, Mich., are in varying states of disrepair. The Michigan Department of Environmental Quality (MDEQ) and U.S. Environmental Protection Agency (USEPA) are considering removing these dams to restore the river channels to pre-dam conditions. This study was initiated to identify sediment characteristics, monitor sediment transport, and predict sediment resuspension and deposition under varying hydraulic conditions. The mathematical model SEDMOD was used to simulate streamflow and sediment transport using three modeling scenarios: (1) sediment transport simulations for 730 days (Jan. 2001 to Dec. 2002), with existing dam structures, (2) sediment transport simulations based on flows from the 1947 flood at the Kalamazoo River with existing dam structures, and (3) sediment transport simulations based on flows from the 1947 flood at the Kalamazoo River with dams removed. Sediment transport simulations based on the 1947 flood hydrograph provide an estimate of sediment transport rates under maximum flow conditions. These scenarios can be used as an assessment of the sediment load that may erode from the study reach at this flow magnitude during a dam failure. The model was calibrated using suspended sediment as a calibration parameter and root mean squared error (RMSE) as an objective function. Analyses of the calibrated model show a slight bias in the model results at flows higher than 75 m3/s; this means that the model-simulated suspended-sediment transport rates are higher than the observed rates; however, the overall calibrated model results show close agreement between simulated and measured values of suspended sediment. Simulation results show that the Kalamazoo River sediment transport mechanism is in a dynamic equilibrium state. Model results during the 730-day simulations indicate significant sediment erosion from the study reach at flow rates higher than 55 m3/s. Similarly, significant sediment deposition occurs during low to average flows (monthly mean flows between 25.49 m3/s and 50.97 m3/s) after a high-flow event. If the flow continues to stay in the low to average range the system shifts towards equilibrium, resulting in a balancing effect between sediment deposition and erosion rates. The 1947 flood-flow simulations show approximately 30,000 m3 more instream sediments erosion for the first 21 days of the dams removed scenario than for the existing-dams scenario, with the same initial conditions for both scenarios. Application of a locally weighted regression smoothing (LOWESS) function to simulation results of the dams removed scenario indicates a steep downtrend with high sediment transport rates during the first 21 days. In comparison, the LOWESS curve for the existing-dams scenario shows a smooth transition of sediment transport rates in response to the change in streamflow. The high erosion rates during the dams-removed scenario are due to the absence of the dams; in contrast, the presence of dams in the existing-dams scenario helps reduce sediment erosion to some extent. The overall results of 60-day simulations for the 1947 flood show no significant difference in total volume of eroded sediment between the two scenarios, because the dams in the study reach have low heads and no control gates. 
It is important to note that the existing-dams and dams-removed scenario simulations are run for only 60 days; therefore, the simulations take into account the changes in sediment erosion and deposition rates only during that time period. Over an extended period, more erosion of instream sediments would be expected to occur if the dams are not properly removed than under the existing conditions. On the basis of model simulations, removal of dams would further lower the head in all the channels. This lowering of head could produce higher flow velocities in the study reach, which ultimately would result in accelerated erosion rates.
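    The calibration described above, with suspended sediment as the calibration parameter and RMSE as the objective function, amounts to choosing model parameters that minimize the root mean squared error between simulated and measured suspended-sediment concentrations. The sketch below shows that objective with a fabricated placeholder in place of a SEDMOD run; the flows, observations, and rating-curve form are illustrative only.

```python
# Minimal sketch of the calibration objective described above: root mean squared
# error (RMSE) between simulated and observed suspended-sediment concentrations.
# `run_sedmod` is a placeholder; SEDMOD itself is not reproduced here.
import numpy as np

def rmse(simulated, observed):
    """Root mean squared error used as the calibration objective."""
    s, o = np.asarray(simulated, float), np.asarray(observed, float)
    return float(np.sqrt(np.mean((s - o) ** 2)))

def run_sedmod(erodibility_coeff, flows):
    """Placeholder for a SEDMOD run: fabricated rating-curve response."""
    return erodibility_coeff * flows ** 1.4

flows = np.array([20.0, 35.0, 55.0, 80.0])        # streamflow (m3/s), illustrative
observed = np.array([12.0, 28.0, 60.0, 115.0])    # observed suspended sediment (mg/L), illustrative

best = min((rmse(run_sedmod(k, flows), observed), k) for k in np.linspace(0.01, 0.5, 50))
print(f"best-fit coefficient: {best[1]:.3f}  RMSE: {best[0]:.2f} mg/L")
```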

  14. Numerical simulation of groundwater flow for the Yakima River basin aquifer system, Washington

    USGS Publications Warehouse

    Ely, D.M.; Bachmann, M.P.; Vaccaro, J.J.

    2011-01-01

    Five applications (scenarios) of the model were completed to obtain a better understanding of the relation between pumpage and surface-water resources and groundwater levels. For the first three scenarios, the calibrated transient model was used to simulate conditions without: (1) pumpage from all hydrogeologic units, (2) pumpage from basalt hydrogeologic units, and (3) exempt-well pumpage. The simulation results indicated potential streamflow capture by the existing pumpage from 1960 through 2001. The quantity of streamflow capture generally was inversely related to the total quantity of pumpage eliminated in the model scenarios. For the fourth scenario, the model simulated 1994 through 2001 under existing conditions with additional pumpage estimated for pending groundwater applications. The differences between the calibrated model streamflow and this scenario indicated additional decreases in streamflow of 91 cubic feet per second in the model domain. Existing conditions representing 1994 through 2001 were projected through 2025 for the fifth scenario and indicated additional streamflow decreases of 38 cubic feet per second and groundwater-level declines.

  15. Inspections of Interstate Commercial Vehicles 1994

    DOT National Transportation Integrated Search

    1974-01-01

    The objective of this effort was to complete the development of the computer simulation model SCOT (Simulation of Corridor Traffic) designed to represent traffic flow on an urban grid-freeway integrated highway system by simulating an existing system...

  16. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    NASA Astrophysics Data System (ADS)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and holding a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.

  17. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations at which our reality exists. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, four key assumptions are made about the nature of the original computer of dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that verifying whether local reality is a grand simulation is feasible with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.
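
    The abstract does not reproduce its summations, so the following is only an illustrative sketch, under the assumption that each nested level can command a fixed fraction of its parent's resources, of how an infinite regress of simulations can nonetheless converge to a finite total (a geometric series); it is not the paper's actual derivation.

```latex
% Illustrative only: if each nested simulation at depth k can devote at most a
% fraction r < 1 of its parent's computational resources C, the total resource
% demand of an infinite regress of simulations remains finite:
\sum_{k=0}^{\infty} C\, r^{k} \;=\; \frac{C}{1-r}, \qquad 0 < r < 1 .
```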

  18. Dr.LiTHO: a development and research lithography simulator

    NASA Astrophysics Data System (ADS)

    Fühner, Tim; Schnattinger, Thomas; Ardelean, Gheorghe; Erdmann, Andreas

    2007-03-01

    This paper introduces Dr.LiTHO, a research and development oriented lithography simulation environment developed at Fraunhofer IISB to flexibly integrate our simulation models into one coherent platform. We propose a lightweight approach to a lithography simulation environment: the use of a scripting (batch) language as an integration platform. Of the great variety of scripting languages, Python proved superior in many ways: it has a gentle learning curve, it is efficient, it is available on virtually any platform, and it provides sophisticated integration mechanisms for existing programs. In this paper, we will describe the steps required to provide Python bindings for existing programs and to finally generate an integrated simulation environment. In addition, we will give a short introduction to selected software design demands associated with the development of such a framework. We will especially focus on testing and (both technical and user-oriented) documentation issues. Dr.LiTHO Python files contain not only all simulation parameter settings but also the simulation flow, providing maximum flexibility. In addition to relatively simple batch jobs, repetitive tasks can be pooled in libraries. And as Python is a full-blown programming language, users can add virtually any functionality, which is especially useful for simulation studies or optimization tasks that often require large numbers of evaluations. Furthermore, we will give a short overview of the numerous existing Python packages. Several examples demonstrate the feasibility and productiveness of integrating Python packages into custom Dr.LiTHO scripts.
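
    To make the "simulation flow as a Python script" idea concrete, here is a minimal sketch of what such a batch file can look like. The actual Dr.LiTHO API is not shown in the abstract, so the function below and its parameters are placeholders; only the pattern (parameters, flow, and sweeps expressed as ordinary Python) is the point.

```python
# Hypothetical batch script in the spirit of a scripting-language-driven
# lithography simulator; simulate_aerial_image() is a placeholder, not the
# real Dr.LiTHO API.
import numpy as np

def simulate_aerial_image(pitch_nm, wavelength_nm=193.0, na=1.35):
    """Toy stand-in for an imaging step: returns a 1-D intensity profile."""
    x = np.linspace(0.0, pitch_nm, 200)
    contrast = min(1.0, wavelength_nm / (2.0 * na * pitch_nm))  # crude proxy only
    return 0.5 + 0.5 * contrast * np.cos(2.0 * np.pi * x / pitch_nm)

# Because the flow lives in an ordinary Python file, repetitive studies
# (e.g., a pitch sweep) become plain loops rather than fixed batch jobs.
results = {}
for pitch in (90, 110, 130):
    image = simulate_aerial_image(pitch)
    results[pitch] = image.max() - image.min()   # simple image-contrast metric

for pitch, contrast in results.items():
    print(f"pitch {pitch} nm: contrast {contrast:.3f}")
```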

  19. The Generation of a Stochastic Flood Event Catalogue for Continental USA

    NASA Astrophysics Data System (ADS)

    Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.

    2017-12-01

    Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large scale, even global, flood hazard layers which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability to assess risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale; highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows; and presents future developments to the modelling approach.
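
    The workflow described (sample spatially dependent synthetic event footprints, look up depths, apply depth-damage functions, aggregate losses) can be sketched generically. The dependence model, hazard layers and damage curves below are toy stand-ins, not the authors' datasets or fitted model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_events, n_sites = 10_000, 5
asset_value = np.array([2.0e6, 5.0e5, 1.2e6, 8.0e5, 3.0e6])   # illustrative values

# Stand-in for the model-based spatial dependence: correlated flood depths (m).
corr = 0.6 * np.ones((n_sites, n_sites)) + 0.4 * np.eye(n_sites)
latent = rng.multivariate_normal(np.zeros(n_sites), corr, size=n_events)
depth = np.clip(np.exp(0.8 * latent) - 1.0, 0.0, None)        # only positive depths flood

def damage_fraction(depth_m):
    """Illustrative depth-damage curve: no damage at 0 m, total loss near 3 m."""
    return np.clip(depth_m / 3.0, 0.0, 1.0)

event_loss = (damage_fraction(depth) * asset_value).sum(axis=1)

# Expected loss per event and an exceedance probability for a rare threshold.
print("expected loss per event:", event_loss.mean())
print("P(loss > 5e6):", (event_loss > 5.0e6).mean())
```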

  20. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package, the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  1. Analysis of the Effectiveness of F-15E Risk Management during Peacetime Operations

    DTIC Science & Technology

    2015-06-18

    ...of aircraft or life. These results were compared with existing risk management programs in the form of unit worksheet assessments. This study found ... Force risk management program across all fixed-wing aircraft. Rotary wing aircraft will have their own unique challenges. However, for all its ...

  2. Hierarchical Task Network Prototyping In Unity3d

    DTIC Science & Technology

    2016-06-01

    ...visually debug. Here we present a solution for prototyping HTNs by extending an existing commercial implementation of Behavior Trees within the Unity3D game engine prior to building the HTN in COMBATXXI. Existing HTNs were emulated within ... HTN, dynamic behaviors, behavior prototyping, agent-based simulation, entity-level combat model, game engine, discrete event simulation, virtual ...

  3. The mechanical design and simulation of a scaled H⁻ Penning ion source.

    PubMed

    Rutter, T; Faircloth, D; Turner, D; Lawrie, S

    2016-02-01

    The existing ISIS Penning H(-) source is unable to produce the beam parameters required for the front end test stand and so a new, high duty factor, high brightness scaled source is being developed. This paper details first the development of an electrically biased aperture plate for the existing ISIS source and second, the design, simulation, and development of a prototype scaled source.

  4. The mechanical design and simulation of a scaled H- Penning ion source

    NASA Astrophysics Data System (ADS)

    Rutter, T.; Faircloth, D.; Turner, D.; Lawrie, S.

    2016-02-01

    The existing ISIS Penning H- source is unable to produce the beam parameters required for the front end test stand and so a new, high duty factor, high brightness scaled source is being developed. This paper details first the development of an electrically biased aperture plate for the existing ISIS source and second, the design, simulation, and development of a prototype scaled source.

  5. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
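
    A minimal sketch of the kernel-machine idea, under simplifying assumptions: a gene-level linear kernel from genotypes and a variance-component-style score statistic Q = (y - mean(y))' K (y - mean(y)). Significance is assessed here by permutation for simplicity; the authors' tests use a generalized linear mixed model with an analytic null distribution, which this sketch does not reproduce.

```python
import numpy as np

def linear_kernel(G):
    """Gene-level similarity matrix from a genotype matrix (samples x SNPs)."""
    return G @ G.T

def score_statistic(y, G):
    """Q = r' K r with r the residuals under an intercept-only null model."""
    r = y - y.mean()
    return float(r @ linear_kernel(G) @ r)

def permutation_pvalue(y, G, n_perm=2000, seed=0):
    """Permutation p-value for the score statistic (stand-in for the analytic null)."""
    rng = np.random.default_rng(seed)
    q_obs = score_statistic(y, G)
    q_null = np.array([score_statistic(rng.permutation(y), G) for _ in range(n_perm)])
    return (1 + (q_null >= q_obs).sum()) / (n_perm + 1)

# Illustrative data: 200 subjects, 10 SNPs in one gene, binary case-control status.
rng = np.random.default_rng(1)
G = rng.integers(0, 3, size=(200, 10)).astype(float)
y = rng.integers(0, 2, size=200).astype(float)
print("permutation p-value:", permutation_pvalue(y, G))
```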

  6. Simulating emissions of 1,3-dichloropropene after soil fumigation under field conditions.

    PubMed

    Yates, S R; Ashworth, D J

    2018-04-15

    Soil fumigation is an important agricultural practice used to produce many vegetable and fruit crops. However, fumigating soil can lead to atmospheric emissions which can increase risks to human and environmental health. A complete understanding of the transport, fate, and emissions of fumigants as impacted by soil and environmental processes is needed to mitigate atmospheric emissions. Five large-scale field experiments were conducted to measure emission rates for 1,3-dichloropropene (1,3-D), a soil fumigant commonly used in California. Numerical simulations of these experiments were conducted in predictive mode (i.e., no calibration) to determine if simulation could be used as a substitute for field experimentation to obtain information needed by regulators. The results show that the magnitude of the volatilization rate and the total emissions could be adequately predicted for these experiments, with the exception of a scenario where the field was periodically irrigated after fumigation. In addition, the timing of the daily peak 1,3-D emissions was not accurately predicted for these experiments due to the peak emission rates occurring during the night or early-morning hours. This study revealed that more comprehensive mathematical models (or adjustments to existing models) are needed to fully describe emissions of soil fumigants from field soils under typical agronomic conditions. Published by Elsevier B.V.

  7. A simulation of probabilistic wildfire risk components for the continental United States

    Treesearch

    Mark A. Finney; Charles W. McHugh; Isaac C. Grenfell; Karin L. Riley; Karen C. Short

    2011-01-01

    This simulation research was conducted in order to develop a large-fire risk assessment system for the contiguous land area of the United States. The modeling system was applied to each of 134 Fire Planning Units (FPUs) to estimate burn probabilities and fire size distributions. To obtain stable estimates of these quantities, fire ignition and growth were simulated for...

  8. Population dynamics of obligate cooperators

    PubMed Central

    Courchamp, F.; Grenfell, B.; Clutton-Brock, T.

    1999-01-01

    Obligate cooperative breeding species demonstrate a high rate of group extinction, which may be due to the existence of a critical number of helpers below which the group cannot subsist. Through a simple model, we study the population dynamics of obligate cooperative breeding species, taking into account the existence of a lower threshold below which the instantaneous growth rate becomes negative. The model successively incorporates (i) a distinction between species that need helpers for reproduction, survival or both, (ii) the existence of a migration rate accounting for dispersal, and (iii) stochastic mortality to simulate the effects of random catastrophic events. Our results suggest that the need for a minimum number of helpers increases the risk of extinction for obligate cooperative breeding species. The constraint imposed by this threshold is higher when helpers are needed for reproduction only or for both reproduction and survival. By driving them below this lower threshold, stochastic mortality of lower amplitude and/or lower frequency than for non-cooperative breeders may be sufficient to cause the extinction of obligate cooperative breeding groups. Migration may have a buffering effect only for groups where immigration is higher than emigration; otherwise (when immigrants from nearby groups are not available) it lowers the difference between actual group size and critical threshold, thereby constituting a higher constraint.
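
    The abstract describes a growth rate that turns negative below a critical group size but does not give the equations. The sketch below uses a generic strong-Allee-effect model with a lower threshold A as an assumed stand-in for that behaviour, showing extinction just below the threshold and persistence just above it; it is not the authors' exact formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

def allee_growth(t, n, r=0.5, A=20.0, K=200.0):
    """dN/dt = r N (N/A - 1)(1 - N/K): growth is negative below the threshold A."""
    return r * n * (n / A - 1.0) * (1.0 - n / K)

for n0 in (15.0, 25.0):                       # just below / just above the threshold
    sol = solve_ivp(allee_growth, (0.0, 50.0), [n0])
    print(f"N0 = {n0:5.1f} -> N(50) = {sol.y[0, -1]:7.2f}")
```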

  9. Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin

    USGS Publications Warehouse

    Bar-Massada, A.; Radeloff, V.C.; Stewart, S.I.; Hawbaker, T.J.

    2009-01-01

    The rapid growth of housing in and near the wildland-urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models: Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT) in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions. © 2009 Elsevier B.V.

  10. An investigation into the vertical axis control power requirements for landing VTOL type aircraft onboard nonaviation ships in various sea states

    NASA Technical Reports Server (NTRS)

    Stevens, M. E.; Roskam, J.

    1985-01-01

    The problem of determining the vertical axis control requirements for landing a VTOL aircraft on a moving ship deck in various sea states is examined. Both a fixed-base piloted simulation and a nonpiloted simulation were used to determine the landing performance as influenced by thrust-to-weight ratio, vertical damping, and engine lags. The piloted simulation was run using a fixed-base simulator at Ames Research Center. Simplified versions of an existing AV-8A Harrier model and an existing head-up display format were used. The ship model used was that of a DD963 class destroyer. Simplified linear models of the pilot, aircraft, ship motion, and ship air-wake turbulence were developed for the nonpiloted simulation. A unique aspect of the nonpiloted simulation was the development of a model of the piloting strategy used for shipboard landing. This model was refined during the piloted simulation until it provided a reasonably good representation of observed pilot behavior.

  11. Hydro- and morphodynamic tsunami simulations for the Ambrakian Gulf (Greece) and comparison with geoscientific field traces

    NASA Astrophysics Data System (ADS)

    Röbke, B. R.; Schüttrumpf, H.; Vött, A.

    2018-04-01

    In order to derive local tsunami risks for a particular coast, hydro- and morphodynamic numerical models that are calibrated and compared with sedimentary field data of past tsunami impacts have proven very effective. While this approach has widely been used with regard to recent tsunami events, comparable investigations into pre-/historical tsunami impacts hardly exist, which is the objective of this study focusing on the Ambrakian Gulf in northwestern Greece. The Ambrakian Gulf is located in the most active seismotectonic and by this most tsunamigenic area of the Mediterranean. Accordingly, palaeotsunami field studies have revealed repeated tsunami impacts on the gulf during the past 8000 yr. The current study analyses 151 vibracores of the Ambrakian Gulf coast in order to evaluate tsunami signals in the sedimentary record. Based on a hydro- and morphodynamic numerical model of the study area, various tsunami waves are simulated with the aim of finding scenarios that compare favourably with tsunami deposits detected in the field. Both field data and simulation results suggest a decreasing tsunami influence from the western to the eastern Ambrakian Gulf. Various scenarios are needed to explain tsunami deposits in different parts of the gulf. Whereas shorter period tsunami waves (T = 30 min) from the south and west compare favourably with field data in the western gulf, longer period waves (T = 80 min) from a western direction show the best agreement with tsunami sediments detected on the southwestern Aktio Headland and in the more central parts of the Ambrakian Gulf including Lake Voulkaria. Tsunamis from the southwest generally do not accord with field traces. Besides the spatial sediment distribution, the numerical model accurately reflects the sedimentary composition of the detected event deposits and reproduces a number of essential features typical of tsunamites, which were also observed in the field. These include landward fining and thinning and the marine character of the deposits. By contrast, the simulated thickness of tsunami sediments usually lags behind the observed thickness in the field and some event layers cannot be explained by any of the simulated scenarios. Regarding the frequency of past tsunami events and their spatial dimensions indicated by both field data and simulation results, a high tsunami risk has to be derived for the Ambrakian Gulf.

  12. Mapping and DOWNFLOW simulation of recent lava flow fields at Mount Etna

    NASA Astrophysics Data System (ADS)

    Tarquini, Simone; Favalli, Massimiliano

    2011-07-01

    In recent years, progress in geographic information systems (GIS) and remote sensing techniques has allowed the mapping and studying of lava flows in unprecedented detail. A composite GIS technique is introduced to obtain high resolution boundaries of lava flow fields. This technique is mainly based on the processing of LIDAR-derived maps and digital elevation models (DEMs). The probabilistic code DOWNFLOW is then used to simulate eight large flow fields formed at Mount Etna in the last 25 years. Thanks to the collection of 6 DEMs representing Mount Etna at different times from 1986 to 2007, simulated outputs are obtained by running the DOWNFLOW code over pre-emplacement topographies. Simulation outputs are compared with the boundaries of the actual flow fields obtained here or derived from the existing literature. Although the selected fields formed in accordance with different emplacement mechanisms, flowed on different zones of the volcano over different topographies and were fed by different lava supplies of different durations, DOWNFLOW yields results close to the actual flow fields in all the cases considered. This outcome is noteworthy because DOWNFLOW has been applied by adopting a default calibration, without any specific tuning for the new cases considered here. This extensive testing proves that, if the pre-emplacement topography is available, DOWNFLOW yields a realistic simulation of a future lava flow based solely on knowledge of the vent position. In comparison with deterministic codes, which require accurate knowledge of a large number of input parameters, DOWNFLOW turns out to be simple, fast and undemanding, proving to be ideal for systematic hazard and risk analyses.
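
    Published descriptions of DOWNFLOW have it repeatedly perturbing the DEM by a random elevation noise and accumulating steepest-descent paths from the vent; the toy sketch below mimics that idea on a synthetic cone-shaped DEM. It is not the actual code, and the perturbation amplitude and iteration count are placeholders rather than the calibrated values.

```python
import numpy as np

def steepest_descent_path(dem, start, max_steps=500):
    """Follow the steepest downhill neighbour until a local minimum is reached."""
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
        if dem[nr, nc] >= dem[r, c]:
            break                                   # local minimum: stop routing
        r, c = nr, nc
        path.append((r, c))
    return path

def downflow_like_probability(dem, vent, n_iter=500, dh=2.0, seed=0):
    """Accumulate steepest-descent paths over randomly perturbed topographies."""
    rng = np.random.default_rng(seed)
    hits = np.zeros_like(dem, dtype=float)
    for _ in range(n_iter):
        perturbed = dem + rng.uniform(-dh, dh, size=dem.shape)
        for r, c in steepest_descent_path(perturbed, vent):
            hits[r, c] += 1.0
    return hits / n_iter                            # inundation probability per cell

# Synthetic cone-shaped DEM with the vent near the summit.
y, x = np.mgrid[0:80, 0:80]
dem = 1000.0 - 5.0 * np.hypot(x - 40, y - 40)
probability = downflow_like_probability(dem, vent=(40, 41))
```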

  13. Proximity Operations for Space Situational Awareness Spacecraft Rendezvous and Maneuvering using Numerical Simulations and Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Carrico, T.; Langster, T.; Carrico, J.; Alfano, S.; Loucks, M.; Vallado, D.

    The authors present several spacecraft rendezvous and close-proximity maneuvering techniques modeled with a high-precision numerical integrator using full force models and closed-loop control with a Fuzzy Logic intelligent controller to command the engines. The authors document and compare the maneuvers, fuel use, and other parameters. This paper presents an innovative application of an existing capability to design, simulate and analyze proximity maneuvers, already in use for operational satellites performing other maneuvers. The system has been extended to demonstrate the capability to develop closed-loop control laws to maneuver spacecraft in close proximity to one another, including stand-off, docking, lunar landing and other operations applicable to space situational awareness, space-based surveillance, and operational satellite modeling. The fully integrated end-to-end trajectory ephemerides are available from the authors in electronic ASCII text by request. The benefits of this system include: a realistic physics-based simulation for the development and validation of control laws; a collaborative engineering environment for the design, development and tuning of spacecraft control-law parameters, sizing of actuators (i.e., rocket engines), and sensor suite selection; an accurate simulation and visualization to communicate the complexity, criticality, and risk of spacecraft operations; a precise mathematical environment for research and development of future spacecraft maneuvering engineering tasks, operational planning and forensic analysis; and a closed-loop, knowledge-based control example for proximity operations. This proximity operations modeling and simulation environment will provide a valuable adjunct to programs in military space control, space situational awareness and civil space exploration engineering and decision making processes.

  14. A health economic model to determine the long-term costs and clinical outcomes of raising low HDL-cholesterol in the prevention of coronary heart disease.

    PubMed

    Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C

    2006-12-01

    The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels and to assess the validity of the model. The internet-based, computer simulation model is made up of two decision analytic sub-models, the first utilizing Monte Carlo simulation, and the second applying Markov modeling techniques. Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient, and applying the treatment effects of interventions under investigation. The Markov model then estimates the long-term clinical (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to the perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes from published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
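
    The abstract outlines a two-stage structure: a Monte Carlo stage that assigns each patient a lipid profile and applies the treatment effect, followed by a Markov stage that projects long-term CHD events. The sketch below reproduces only that generic structure; the risk function, treatment effect, state space and horizon are illustrative assumptions, not the Framingham equations or the model's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n_patients, n_years = 5_000, 40

# Stage 1: Monte Carlo baseline cohort -- assign each patient an HDL-C level
# (mg/dL), then apply a hypothetical treatment effect (+5 mg/dL).
hdl = rng.normal(38.0, 6.0, n_patients)
hdl_treated = hdl + 5.0

def annual_chd_probability(hdl_mgdl):
    """Illustrative risk function: lower HDL-C implies higher annual CHD event risk."""
    return np.clip(0.04 - 0.0005 * (hdl_mgdl - 40.0), 0.005, 0.10)

def markov_lifetime_events(hdl_mgdl):
    """Stage 2: simple annual Markov cycle, well -> CHD event (absorbing here)."""
    p_event = annual_chd_probability(hdl_mgdl)
    alive_and_well = np.ones(hdl_mgdl.size, dtype=bool)
    events = np.zeros(hdl_mgdl.size)
    for _ in range(n_years):
        had_event = alive_and_well & (rng.random(hdl_mgdl.size) < p_event)
        events += had_event
        alive_and_well &= ~had_event
    return events.mean()

print("mean lifetime CHD events, untreated:", markov_lifetime_events(hdl))
print("mean lifetime CHD events, treated:  ", markov_lifetime_events(hdl_treated))
```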

  15. Crew Exploration Vehicle (CEV) (Orion) Occupant Protection. [Appendices Part 2

    NASA Technical Reports Server (NTRS)

    Currie-Gregg, Nancy J.; Gernhardt, Michael L.; Lawrence, Charles; Somers, Jeffrey T.

    2016-01-01

    The purpose of this study was to determine the similarity between the response of the THUMS model and the Hybrid III Anthropometric Test Device (ATD) given existing Wright-Patterson (WP) sled tests. There were four tests selected for this comparison with frontal, spinal, rear, and lateral loading. The THUMS was placed in a sled configuration that replicated the WP configuration and the recorded seat acceleration for each test was applied to the model seat. Once the modeling simulations were complete, they were compared to the WP results using two methods. The first was a visual inspection of the sled test videos compared to the THUMS d3plot files. This comparison resulted in an assessment of the overall kinematics of the two results. The other was a comparison of the plotted data recorded for both tests. The metrics selected for comparison were seat acceleration, belt forces, head acceleration and chest acceleration. These metrics were recorded in all WP tests and were outputs of the THUMS model. Once the comparison of the THUMS to the WP tests was complete, the THUMS model output was also examined for possible injuries in these scenarios. These outputs included metrics for injury risk to the head, neck, thorax, lumbar spine and lower extremities. The metrics to evaluate head response were peak head acceleration, HIC15, and HIC36. For the neck, Nij was calculated. The thorax response was evaluated with peak chest acceleration, the Combined Thoracic Index (CTI), sternal deflection, chest deflection, and the chest acceleration 3-ms clip. The lumbar spine response was evaluated with lumbar spine force. Finally the lower extremity response was evaluated by femur and tibia force. The results of the simulation comparisons indicate the THUMS model had a similar response to the Hybrid III dummy given the same input. The primary difference seen between the two was a more flexible response of the THUMS compared to the Hybrid III. This flexibility was most pronounced in the neck flexion, shoulder deflection and chest deflection. Due to the flexibility of the THUMS, the resulting head and chest accelerations tended to lag the Hybrid III acceleration trace and have a lower peak value. The results of the injury metric comparison identified possible injury trends between simulations. Risk of head injury was highest for the lateral simulations. The risk of chest injury was highest for the rear impact. However, neck injury risk was approximately the same for all simulations. The injury metric value for lumbar spine force was highest for the spinal impact. The leg forces were highest for the rear and lateral impacts. The results of this comparison indicate the THUMS model performs in a similar manner as the Hybrid III ATD. The differences in the responses of the model and the ATD are primarily due to the flexibility of the THUMS. This flexibility of the THUMS would be a more human-like response. Based on the similarity between the two models, the THUMS should be used in further testing to assess risk of injury to the occupant.
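
    Of the injury metrics listed, HIC has a compact published definition: the maximum over time windows (15 ms for HIC15) of (t2 - t1) times the mean resultant head acceleration over that window, in g, raised to the power 2.5. The sketch below computes it from a synthetic acceleration pulse; the pulse itself is illustrative, not data from the tests described.

```python
import numpy as np

def hic(time_s, accel_g, max_window_s=0.015):
    """Head Injury Criterion: max over windows <= max_window_s of
    (t2 - t1) * (mean acceleration over [t1, t2]) ** 2.5, with acceleration in g."""
    # Cumulative trapezoidal integral of acceleration for fast window averages.
    cum = np.concatenate(([0.0], np.cumsum(np.diff(time_s) * 0.5 *
                                           (accel_g[:-1] + accel_g[1:]))))
    best = 0.0
    for i in range(len(time_s)):
        for j in range(i + 1, len(time_s)):
            dt = time_s[j] - time_s[i]
            if dt > max_window_s:
                break
            avg = (cum[j] - cum[i]) / dt
            best = max(best, dt * avg ** 2.5)
    return best

# Synthetic half-sine head acceleration pulse: 60 g peak, 20 ms duration.
t = np.linspace(0.0, 0.05, 501)                      # 0.1 ms time steps
a = np.where(t < 0.02, 60.0 * np.sin(np.pi * t / 0.02), 0.0)
print("HIC15 of synthetic pulse:", round(hic(t, a), 1))
```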

  16. Source apportionment of polycyclic aromatic hydrocarbons in Louisiana

    NASA Astrophysics Data System (ADS)

    Han, F.; Zhang, H.

    2017-12-01

    Polycyclic aromatic hydrocarbons (PAHs) in the environment are of significant concern due to their high toxicity, which may result in adverse health effects. PAH measurements at the limited number of air quality monitoring stations alone are insufficient to provide a complete picture of ambient PAH levels. This study simulates the concentrations of PAHs in Louisiana and identifies the major emission sources. Speciation profiles for PAHs were prepared using data assembled from existing emission profile databases. The Sparse Matrix Operator Kernel Emission (SMOKE) model was used to generate the estimated gridded emissions of 16 priority PAH species directly associated with health risks. The estimated emissions were then applied to simulate ambient concentrations of PAHs in Louisiana for January, April, July and October 2011 using the Community Multiscale Air Quality (CMAQ) model (v5.0.1). Through the formation, transport and deposition of PAH species, the concentrations of PAH species in the gas phase and particulate phase were obtained. The spatial and temporal variations were analyzed and contributions of both local and regional major sources were quantified. This study provides important information for the prevention and treatment of PAHs in Louisiana.

  17. A molecular dynamic study on the dissociation mechanism of SI methane hydrate in inorganic salt aqueous solutions.

    PubMed

    Xu, Jiafang; Chen, Zhe; Liu, Jinxiang; Sun, Zening; Wang, Xiaopu; Zhang, Jun

    2017-08-01

    Gas hydrate is not only a potential energy resource, but also almost the biggest challenge in oil/gas flow assurance. Inorganic salts such as NaCl, KCl and CaCl2 are widely used as thermodynamic inhibitors to reduce the risk caused by hydrate formation. However, the inhibition mechanism is still unclear. Therefore, molecular dynamic (MD) simulation was performed to study the dissociation of structure I (SI) methane hydrate in the presence of inorganic salt aqueous solutions on a micro-scale. The simulation results showed that the dissociation became stagnant due to the presence of a liquid film formed by the decomposed water molecules, and more inorganic ions could shorten the stagnation time. The diffusion coefficients of ions and water molecules were the largest in the KCl system. The structures of ion/H2O and H2O/H2O were the most compact in the hydrate/NaCl system. The ionic ability to decompose hydrate cells followed the sequence Ca2+ > 2K+ > 2Cl- > 2Na+. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Identification of androgen receptor antagonists: In vitro investigation and classification methodology for flavonoid.

    PubMed

    Wu, Yang; Doering, Jon A; Ma, Zhiyuan; Tang, Song; Liu, Hongling; Zhang, Xiaowei; Wang, Xiaoxiang; Yu, Hongxia

    2016-09-01

    A tremendous gap exists between the number of potential endocrine disrupting chemicals (EDCs) possibly in the environment and the limitations of traditional regulatory testing. In this study, the anti-androgenic potencies of 21 flavonoids were analyzed in vitro, and another 32 flavonoids from the literature were selected as additional chemicals. Molecular dynamic simulations were employed to obtain four different separation approaches based on the different behaviors of ligands and receptors during the process of interaction. Specifically, the ligand-receptor complex (highlighting the discriminating features of ligand escape or retention via the "mousetrap" mechanism), hydrogen bonds formed during the simulations, ligand stability, and the stability of helix-12 of the receptor were investigated. Together, a methodology was generated by which 87.5% of flavonoids could be discriminated as active versus inactive antagonists, and over 90% of inactive antagonists could be filtered out before QSAR study. This methodology could be used as a "proof of concept" to identify inactive anti-androgenic flavonoids, and could also be beneficial for rapid risk assessment and regulation of multiple new chemicals for androgenicity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. How to simulate pedestrian behaviors in seismic evacuation for vulnerability reduction of existing buildings

    NASA Astrophysics Data System (ADS)

    Quagliarini, Enrico; Bernardini, Gabriele; D'Orazio, Marco

    2017-07-01

    Understanding and representing how individuals behave in earthquake emergencies is essential for assessing the impact of vulnerability reduction strategies on existing buildings in seismic areas. In fact, interactions between individuals and the scenario (as modified by the earthquake) are important for understanding the possible additional risks to people, especially during the evacuation phase. The current approach is based on "qualitative" aspects intended to define best-practice guidelines for Civil Protection and populations. By contrast, a "quantitative" description of human response and evacuation motion under such conditions is urgently needed. Hence, this work defines rules for pedestrian earthquake evacuation in urban scenarios by taking advantage of previous results from real-world evacuation analyses. In particular, a motion law for pedestrians is defined by modifying the Social Force model equation. The proposed model could be used to evaluate individuals' evacuation processes and so to define operative strategies for reducing interference in critical parts of the urban fabric (e.g., interventions on particular buildings, definition of evacuation strategies, projects for parts of the city).
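
    The paper modifies the Social Force model; shown here is only the standard Helbing-style baseline (a relaxation term toward the desired velocity plus exponential repulsion between pedestrians), integrated with explicit Euler. The earthquake-specific modifications introduced by the authors are not reproduced, and all parameter values are generic illustrations.

```python
import numpy as np

def social_force_step(pos, vel, goal, dt=0.05, v0=1.34, tau=0.5, A=2.0, B=0.3):
    """One explicit-Euler step of a basic Social Force model.
    pos, vel: (n, 2) arrays; goal: (2,) target point shared by all pedestrians."""
    to_goal = goal - pos
    desired_dir = to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    force = (v0 * desired_dir - vel) / tau            # driving (relaxation) term

    for i in range(len(pos)):                         # pairwise repulsion A*exp(-d/B)
        diff = pos[i] - np.delete(pos, i, axis=0)
        dist = np.linalg.norm(diff, axis=1, keepdims=True)
        force[i] += (A * np.exp(-dist / B) * diff / dist).sum(axis=0)

    vel = vel + force * dt
    pos = pos + vel * dt
    return pos, vel

# Five pedestrians heading toward an assembly point at (20, 0).
rng = np.random.default_rng(3)
pos = rng.uniform(0.0, 5.0, size=(5, 2))
vel = np.zeros((5, 2))
for _ in range(400):
    pos, vel = social_force_step(pos, vel, goal=np.array([20.0, 0.0]))
```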

  20. Assessment of the risk due to release of carbon fiber in civil aircraft accidents, phase 2

    NASA Technical Reports Server (NTRS)

    Pocinki, L.; Cornell, M. E.; Kaplan, L.

    1980-01-01

    The risk associated with the potential use of carbon fiber composite material in commercial jet aircraft is investigated. A simulation model developed to generate risk profiles for several airports is described. The risk profiles show the probability that the cost due to accidents in any year exceeds a given amount. The computer model simulates aircraft accidents with fire, release of fibers, their downwind transport and infiltration of buildings, equipment failures, and the resulting economic impact. The individual airport results were combined to yield the national risk profile.

  1. A Hardware-in-the-Loop Simulator for Software Development for a Mars Airplane

    NASA Technical Reports Server (NTRS)

    Slagowski, Stefan E.; Vican, Justin E.; Kenney, P. Sean

    2007-01-01

    Draper Laboratory recently developed a Hardware-In-The-Loop Simulator (HILSIM) to provide a simulation of the Aerial Regional-scale Environmental Survey (ARES) airplane executing a mission in the Martian environment. The HILSIM was used to support risk mitigation activities under the Planetary Airplane Risk Reduction (PARR) program. PARR supported NASA Langley Research Center's (LaRC) ARES proposal efforts for the Mars Scout 2011 opportunity. The HILSIM software was a successful integration of two simulation frameworks, Draper's CSIM and NASA LaRC's Langley Standard Real-Time Simulation in C++ (LaSRS++).

  2. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  3. Prototype of a computer method for designing and analyzing heating, ventilating and air conditioning proportional, electronic control systems

    NASA Astrophysics Data System (ADS)

    Barlow, Steven J.

    1986-09-01

    The Air Force needs a better method of designing new and retrofit heating, ventilating and air conditioning (HVAC) control systems. Air Force engineers currently use manual design/predict/verify procedures taught at the Air Force Institute of Technology, School of Civil Engineering, HVAC Control Systems course. These existing manual procedures are iterative and time-consuming. The objectives of this research were to: (1) Locate and, if necessary, modify an existing computer-based method for designing and analyzing HVAC control systems that is compatible with the HVAC Control Systems manual procedures, or (2) Develop a new computer-based method of designing and analyzing HVAC control systems that is compatible with the existing manual procedures. Five existing computer packages were investigated in accordance with the first objective: MODSIM (for modular simulation), HVACSIM (for HVAC simulation), TRNSYS (for transient system simulation), BLAST (for building load and system thermodynamics) and Elite Building Energy Analysis Program. None were found to be compatible or adaptable to the existing manual procedures, and consequently, a prototype of a new computer method was developed in accordance with the second research objective.

  4. Simulability of observables in general probabilistic theories

    NASA Astrophysics Data System (ADS)

    Filippov, Sergey N.; Heinosaari, Teiko; Leppäjärvi, Leevi

    2018-06-01

    The existence of incompatibility is one of the most fundamental features of quantum theory and can be found at the core of many of the theory's distinguishing features, such as Bell inequality violations and the no-broadcasting theorem. A scheme for obtaining new observables from existing ones via classical operations, the so-called simulation of observables, has led to an extension of the notion of compatibility for measurements. We consider the simulation of observables within the operational framework of general probabilistic theories and introduce the concept of simulation irreducibility. While a simulation irreducible observable can only be simulated by itself, we show that any observable can be simulated by simulation irreducible observables, which in the quantum case correspond to extreme rank-1 positive-operator-valued measures. We also consider cases where the set of simulators is restricted in one of two ways: in terms of either the number of simulating observables or their number of outcomes. The former is seen to be closely connected to compatibility and k compatibility, whereas the latter leads to a partial characterization for dichotomic observables. In addition to the quantum case, we further demonstrate these concepts in state spaces described by regular polygons.
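
    As a sketch of what "simulating" an observable typically means in this framework (mixing over which observable to measure and classically post-processing its outcome), the hedged definition below is written in standard notation; the symbols are assumptions of this note rather than reproduced from the paper.

```latex
% An observable B (outcomes y) is simulable by a set {A^{(i)}} if it can be
% produced by randomly choosing which A^{(i)} to measure (probabilities p_i)
% and classically post-processing the outcome x with kernels \nu_i(y|x):
B_y \;=\; \sum_{i} p_i \sum_{x} \nu_i(y \mid x)\, A^{(i)}_{x},
\qquad p_i \ge 0,\ \ \sum_i p_i = 1,\ \ \nu_i(y \mid x) \ge 0,\ \ \sum_y \nu_i(y \mid x) = 1 .
```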

  5. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    PubMed

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to an occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.
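
    The mechanics of folding human-related risk factors into a baseline casualty rate via a logistic-regression-style equation can be sketched as follows. The baseline rate, the odds ratios and the factor list below are illustrative placeholders; they are not the HAZUS rates or the effect measures estimated in the study.

```python
import numpy as np

def adjusted_casualty_probability(base_rate, odds_ratios, exposures):
    """Scale a baseline casualty rate by odds ratios for the risk factors present.
    base_rate: baseline probability from a loss model (e.g., a HAZUS-style rate).
    odds_ratios / exposures: per-factor odds ratio and 0/1 indicator of presence."""
    log_odds = np.log(base_rate / (1.0 - base_rate))
    log_odds += np.sum(np.log(odds_ratios) * exposures)
    return 1.0 / (1.0 + np.exp(-log_odds))

# Illustrative (not the study's) values: age >= 65, female, disability, low SES.
odds_ratios = np.array([1.8, 1.2, 2.5, 1.6])
exposures = np.array([1, 0, 1, 0])          # e.g., an elderly person with a disability
print(adjusted_casualty_probability(base_rate=0.02,
                                    odds_ratios=odds_ratios,
                                    exposures=exposures))
```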

  6. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economical saving and minimal management effort. Although elasticity and flexibility bring tremendous benefits, they still raise many information security issues due to the unique characteristic that allows ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses the characteristics of cloud computing, which the existing models do not.

  7. Simulated transition from RCP8.5 to RCP4.5 through three different Radiation Management techniques

    NASA Astrophysics Data System (ADS)

    Muri, H.; Kristjansson, J. E.; Adakudlu, M.; Grini, A.; Lauvset, S. K.; Otterå, O. H.; Schulz, M.; Tjiputra, J. F.

    2016-12-01

    Scenario studies have shown that in order to limit global warming to 2°C above pre-industrial levels, negative CO2 emissions are required. Currently, no safe and well-established technologies exist for achieving such negative emissions. Hence, although carbon dioxide removal may appear less risky and controversial than Radiation Management (RM) techniques, the latter type of climate engineering (CE) techniques cannot be ruled out as a future policy option. The EXPECT project, funded by the Norwegian Research Council, explores the potential and risks of RM through Earth System Model Simulations. We here describe results from a study that simulates a 21st century transition from an RCP8.5 to a RCP4.5 scenario through Radiation Management. The study uses the Norwegian Earth System Model (NorESM) to compare the results from the following three RM techniques: a) Stratospheric Aerosol Injections (SAI); b) Marine Sky Brightening (MSB); c) Cirrus Cloud Thinning (CCT). All three simulations start from the year 2020 and run until 2100. Whereas both SAI and MSB successfully simulate the desired negative radiative forcing throughout the 21st century, the CCT simulations have a +0.5 W m-2 residual forcing (on top of RCP4.5) at the end of the century. Although all three techniques obtain approximately the same global temperature evolution, precipitation responses are very different. In particular, the CCT simulation has even more globally averaged precipitation at year 2100 than RCP8.5, whereas both SAI and MSB simulate less precipitation than RCP4.5. In addition, there are significant differences in geographical patterns of precipitation. Natural variability in the Earth System also exhibits sensitivity to the choice of RM technique: Both the Atlantic Meridional Overturning Circulation and the Pacific Decadal Oscillation respond differently to the choice of SAI, MSB or CCT. We will present a careful analysis, as well as a physical interpretation of the above results.

  8. The mechanical design and simulation of a scaled H⁻ Penning ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutter, T., E-mail: theo.rutter@stfc.ac.uk; Faircloth, D.; Turner, D.

    2016-02-15

    The existing ISIS Penning H⁻ source is unable to produce the beam parameters required for the front end test stand and so a new, high duty factor, high brightness scaled source is being developed. This paper details first the development of an electrically biased aperture plate for the existing ISIS source and second, the design, simulation, and development of a prototype scaled source.

  9. An Advanced, Interactive, High-Performance Liquid Chromatography Simulator and Instructor Resources

    ERIC Educational Resources Information Center

    Boswell, Paul G.; Stoll, Dwight R.; Carr, Peter W.; Nagel, Megan L.; Vitha, Mark F.; Mabbott, Gary A.

    2013-01-01

    High-performance liquid chromatography (HPLC) simulation software has long been recognized as an effective educational tool, yet many of the existing HPLC simulators are either too expensive, outdated, or lack many important features necessary to make them widely useful for educational purposes. Here, a free, open-source HPLC simulator is…

  10. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, relative evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance on advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to be different from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual spatial perception and stress coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in the context of advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation training combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  11. Grip Strength and Its Relationship to Police Recruit Task Performance and Injury Risk: A Retrospective Cohort Study

    PubMed Central

    Stierli, Michael; Hinton, Benjamin

    2017-01-01

    Suitable grip strength is a police occupational requirement. The aim of this study was to investigate the association between grip strength, task performance and injury risk in a police population. Retrospective data of police recruits (n = 169) who had undergone basic recruit training were provided, including handgrip strength results, occupational task performance measures (consisting of police task simulations [SIM], tactical options [TACOPS] and marksmanship assessments) and injury records. Left hand grip strength (41.91 ± 8.29 kg) measures showed a stronger correlation than right hand grip strength (42.15 ± 8.53 kg) with all outcome measures. Recruits whose grip strength scores were lower were significantly more susceptible to failing the TACOPS occupational task assessment than those with greater grip strength scores, with significant (p ≤ 0.003) weak to moderate, positive correlations found between grip strength and TACOPS performance. A significant (p < 0.0001) correlation was found between grip strength, most notably of the left hand, and marksmanship performance, with those performing better in marksmanship having higher grip strength. Left hand grip strength was significantly associated with injury risk (r = −0.181, p = 0.018) but right hand grip strength was not. A positive association exists between handgrip strength and police recruit task performance (notably TACOPS and marksmanship) with recruits who scored poorly on grip strength being at greatest risk of occupational assessment task failure. PMID:28825688

  12. Grip Strength and Its Relationship to Police Recruit Task Performance and Injury Risk: A Retrospective Cohort Study.

    PubMed

    Orr, Robin; Pope, Rodney; Stierli, Michael; Hinton, Benjamin

    2017-08-21

    Suitable grip strength is a police occupational requirement. The aim of this study was to investigate the association between grip strength, task performance and injury risk in a police population. Retrospective data of police recruits (n = 169) who had undergone basic recruit training were provided, including handgrip strength results, occupational task performance measures (consisting of police task simulations [SIM], tactical options [TACOPS] and marksmanship assessments) and injury records. Left hand grip strength (41.91 ± 8.29 kg) measures showed a stronger correlation than right hand grip strength (42.15 ± 8.53 kg) with all outcome measures. Recruits whose grip strength scores were lower were significantly more susceptible to failing the TACOPS occupational task assessment than those with greater grip strength scores, with significant ( p ≤ 0.003) weak to moderate, positive correlations found between grip strength and TACOPS performance. A significant ( p < 0.0001) correlation was found between grip strength, most notably of the left hand, and marksmanship performance, with those performing better in marksmanship having higher grip strength. Left hand grip strength was significantly associated with injury risk ( r = -0.181, p = 0.018) but right hand grip strength was not. A positive association exists between handgrip strength and police recruit task performance (notably TACOPS and marksmanship) with recruits who scored poorly on grip strength being at greatest risk of occupational assessment task failure.

  13. A hybrid CNN feature model for pulmonary nodule malignancy risk differentiation.

    PubMed

    Wang, Huafeng; Zhao, Tingting; Li, Lihong Connie; Pan, Haixia; Liu, Wanquan; Gao, Haoqi; Han, Fangfang; Wang, Yuehai; Qi, Yifan; Liang, Zhengrong

    2018-01-01

    The malignancy risk differentiation of pulmonary nodules is one of the most challenging tasks in computer-aided diagnosis (CADx). Most recently reported CADx methods or schemes based on texture and shape estimation have shown relatively satisfactory performance in differentiating the risk level of malignancy among nodules detected in lung cancer screening. However, the existing CADx schemes tend to detect and analyze characteristics of pulmonary nodules from a statistical perspective according to local features only. Inspired by the currently prevailing learning ability of the convolutional neural network (CNN), which simulates the human neural network for target recognition, and by our previous research on texture features, we present a hybrid model that takes into consideration both global and local features for pulmonary nodule differentiation using the largest public database, founded by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI). By comparing three types of CNN models, two of which were newly proposed by us, we observed that the multi-channel CNN model yielded the best capacity for differentiating the malignancy risk of nodules based on the projection of distributions of extracted features. Moreover, the CADx scheme using the new multi-channel CNN model outperformed our previously developed CADx scheme using a 3D texture feature analysis method, increasing the computed area under the receiver operating characteristic curve (AUC) from 0.9441 to 0.9702.
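
    The paper's multi-channel architecture is not specified in the abstract, so the PyTorch sketch below shows only the general pattern such models follow: separate convolutional branches (here, a local nodule patch and a wider-context patch) whose features are concatenated before classification. Layer sizes, patch sizes and the two-branch split are illustrative assumptions, not the authors' network.

```python
import torch
import torch.nn as nn

class MultiChannelNoduleNet(nn.Module):
    """Toy two-branch CNN: one branch for a local nodule patch, one for a
    wider-context patch; branch features are concatenated before classification."""
    def __init__(self):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.local_branch = branch()
        self.context_branch = branch()
        self.classifier = nn.Linear(32, 2)          # benign vs. malignant logits

    def forward(self, local_patch, context_patch):
        features = torch.cat([self.local_branch(local_patch),
                              self.context_branch(context_patch)], dim=1)
        return self.classifier(features)

model = MultiChannelNoduleNet()
logits = model(torch.randn(4, 1, 32, 32), torch.randn(4, 1, 64, 64))
```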

  14. The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilch, Martin M.

    Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
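
    The second-order treatment that the abstract contrasts with purely probabilistic propagation can be illustrated with a small nested Monte Carlo sketch: epistemic uncertainty is sampled in an outer loop and aleatory variability in an inner loop, so the two are reported separately. The algebraic model and the distributions below are assumptions for illustration, not the report's problem, and the sketch does not implement the Method of Belief Scales itself.

```python
# Illustrative second-order (nested) Monte Carlo: variability is sampled in an
# inner loop, epistemic uncertainty in an outer loop. The model y = a * x, the
# distributions, and the limit value are assumptions, not the report's problem.
import numpy as np

rng = np.random.default_rng(0)
n_outer, n_inner = 200, 1000
exceed_fractions = []

for _ in range(n_outer):
    a = rng.uniform(0.8, 1.2)                   # uncertain parameter (epistemic)
    x = rng.normal(10.0, 2.0, size=n_inner)     # variable input (aleatory)
    y = a * x
    exceed_fractions.append(np.mean(y > 14.0))  # frequency of exceeding a limit

# Report the spread of the exceedance frequency across the uncertain parameter,
# rather than collapsing variability and uncertainty into one distribution.
print(f"exceedance frequency: 5th-95th percentile = "
      f"{np.percentile(exceed_fractions, 5):.3f}-{np.percentile(exceed_fractions, 95):.3f}")
```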

  15. Agricultural livelihoods in coastal Bangladesh under climate and environmental change--a model framework.

    PubMed

    Lázár, Attila N; Clarke, Derek; Adams, Helen; Akanda, Abdur Razzaque; Szabo, Sylvia; Nicholls, Robert J; Matthews, Zoe; Begum, Dilruba; Saleh, Abul Fazal M; Abedin, Md Anwarul; Payo, Andres; Streatfield, Peter Kim; Hutton, Craig; Mondal, M Shahjahan; Moslehuddin, Abu Zofar Md

    2015-06-01

    Coastal Bangladesh experiences significant poverty and hazards today and is highly vulnerable to climate and environmental change over the coming decades. Coastal stakeholders are demanding information to assist in decision-making processes, including simulation models to explore how different interventions, under different plausible future socio-economic and environmental scenarios, could alleviate environmental risks and promote development. Many existing simulation models neglect the complex interdependencies between the socio-economic and environmental systems of coastal Bangladesh. Here an integrated approach has been proposed to develop a simulation model to support agriculture- and poverty-based analysis and decision-making in coastal Bangladesh. In particular, we show how a simulation model of farmers' livelihoods at the household level can be achieved. An extended version of the FAO's CROPWAT agriculture model has been integrated with a downscaled regional demography model to simulate net agriculture profit. This is used together with a household income-expenses balance and a loans logical tree to simulate the evolution of food security indicators and poverty levels. Modelling identifies salinity and temperature stress as limiting factors to crop productivity, and fertilisation due to atmospheric carbon dioxide concentrations as a reinforcing factor. The crop simulation results compare well with expected outcomes but also reveal some unexpected behaviours. For example, under current model assumptions, temperature is more important than salinity for crop production. The agriculture-based livelihood and poverty simulations highlight the critical significance of debt through informal and formal loans set at such levels as to persistently undermine the well-being of agriculture-dependent households. Simulations also indicate that progressive approaches to agriculture (i.e. diversification) might not provide a clear economic benefit in terms of pricing, owing to greater susceptibility to climate vagaries. The livelihood and poverty results highlight the importance of the holistic consideration of the human-nature system and the careful selection of poverty indicators. Although the simulation model at this stage contains the minimum elements required to simulate the complexity of farmer livelihood interactions in coastal Bangladesh, the crop and socio-economic findings compare well with expected behaviours. The presented integrated model is a first step towards developing a holistic, transferable analytic method and tool for coastal Bangladesh.
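
    A minimal sketch of the household income-expenses balance with a simple loan rule, in the spirit of the integrated livelihood model described above, is given below. The profit series, expense level, interest rate, and poverty indicator are illustrative assumptions, not values or rules from the study.

```python
# A minimal sketch of a household income-expenses balance with a simple loan
# rule and a crude poverty indicator. All numbers are illustrative assumptions.
def simulate_household(yearly_farm_profit, yearly_expenses=900.0,
                       interest=0.25, poverty_line=0.0):
    savings, debt, poor_years = 0.0, 0.0, 0
    for profit in yearly_farm_profit:
        balance = savings + profit - yearly_expenses - interest * debt
        if balance < 0:
            debt += -balance          # shortfall covered by a (formal/informal) loan
            savings = 0.0
        else:
            repay = min(debt, balance)
            debt -= repay
            savings = balance - repay
        if savings - debt < poverty_line:
            poor_years += 1           # crude poverty indicator: net assets below line
    return savings, debt, poor_years

# Five years of net agriculture profit (assumed values)
print(simulate_household([1000, 700, 1200, 600, 1100]))
```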

  16. A virtual reality patient simulation system for teaching emergency response skills to U.S. Navy medical providers.

    PubMed

    Freeman, K M; Thompson, S F; Allely, E B; Sobel, A L; Stansfield, S A; Pugh, W M

    2001-01-01

    Rapid and effective medical intervention in response to civil and military-related disasters is crucial for saving lives and limiting long-term disability. Inexperienced providers may suffer in performance when faced with limited supplies and the demands of stabilizing casualties not generally encountered in the comparatively resource-rich hospital setting. Head trauma and multiple injury cases are particularly complex to diagnose and treat, requiring the integration and processing of complex multimodal data. In this project, collaborators adapted and merged existing technologies to produce a flexible, modular patient simulation system with both three-dimensional virtual reality and two-dimensional flat screen user interfaces for teaching cognitive assessment and treatment skills. This experiential, problem-based training approach engages the user in a stress-filled, high fidelity world, providing multiple learning opportunities within a compressed period of time and without risk. The system simulates both the dynamic state of the patient and the results of user intervention, enabling trainees to watch the virtual patient deteriorate or stabilize as a result of their decision-making speed and accuracy. Systems can be deployed to the field enabling trainees to practice repeatedly until their skills are mastered and to maintain those skills once acquired. This paper describes the technologies and the process used to develop the trainers, the clinical algorithms, and the incorporation of teaching points. We also characterize aspects of the actual simulation exercise through the lens of the trainee.

  17. Development of cost-effective surfactant flooding technology. First annual report for the period, September 30, 1992--September 29, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Sepehrnoori, K.

    1994-08-01

    This research consists of the parallel development of a new chemical flooding simulator and the application of the existing UTCHEM simulation code to model surfactant flooding. The new code is based upon a completely new numerical method that combines for the first time higher order finite difference methods, flux limiters, and implicit algorithms. Early results indicate that this approach has significant advantages in some problems and will likely enable simulation of much larger and more realistic chemical floods once it is fully developed. Additional improvements have also been made to the UTCHEM code and it has been applied for the first time to the study of stochastic reservoirs with and without horizontal wells to evaluate methods to reduce the cost and risk of surfactant flooding. During the first year of this contract, significant progress has been made on both of these tasks. The authors have found that there are indeed significant differences between the performance predictions based upon the traditional layered reservoir description and the more realistic and flexible descriptions using geostatistics. These preliminary studies of surfactant flooding using horizontal wells show that although they have significant potential to greatly reduce project life and thus improve the economics of the process, their use requires accurate reservoir descriptions and simulations to be effective. Much more needs to be done to fully understand and optimize their use and develop reliable design criteria.

  18. Modeling for the allocation of oil spill recovery capacity considering environmental and economic factors.

    PubMed

    Ha, Min-Jae

    2018-01-01

    This study presents a regional oil spill risk assessment and capacities for marine oil spill response in Korea. The oil spill risk assessment is carried out using both causal factors and environmental/economic factors. The weight of each parameter is calculated using the Analytic Hierarchy Process (AHP). Final regional risk degrees of oil spill are estimated by combining the degree and weight of each existing parameter. From these estimated risk levels, oil recovery capacities were determined with reference to the recovery target of 7,500 kl specified in existing standards. The estimates were deemed feasible, and provided a more balanced distribution of resources than existing capacities set according to current standards. Copyright © 2017 Elsevier Ltd. All rights reserved.
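
    The AHP weighting step mentioned above can be sketched as follows: weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The 3x3 comparison matrix is an illustrative assumption, not the study's actual judgments.

```python
# A minimal sketch of AHP weight derivation from a pairwise comparison matrix
# using the principal eigenvector. The comparison matrix is an assumption.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],       # e.g. spill frequency vs. environmental vs. economic
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalized AHP weights

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.58 for n = 3.
n, RI = 3, 0.58
CI = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(w, 3), "CR:", round(CI / RI, 3))
```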

  19. Simulated effects of proposed Arkansas Valley Conduit on hydrodynamics and water quality for projected demands through 2070, Pueblo Reservoir, southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.

    2013-01-01

    The purpose of the Arkansas Valley Conduit (AVC) is to deliver water for municipal and industrial use within the boundaries of the Southeastern Colorado Water Conservancy District. Water supplied through the AVC would serve two needs: (1) to supplement or replace existing poor-quality water to communities downstream from Pueblo Reservoir; and (2) to meet a portion of the AVC participants’ projected water demands through 2070. The Bureau of Reclamation (Reclamation) initiated an Environmental Impact Statement (EIS) to address the potential environmental consequences associated with constructing and operating the proposed AVC, entering into a conveyance contract for the Pueblo Dam north-south outlet works interconnect (Interconnect), and entering into a long-term excess capacity master contract (Master Contract). Operational changes, as a result of implementation of proposed EIS alternatives, could change the hydrodynamics and water-quality conditions in Pueblo Reservoir. An interagency agreement was initiated between Reclamation and the U.S. Geological Survey to accurately simulate hydrodynamics and water quality in Pueblo Reservoir for projected demands associated with four of the seven proposed EIS alternatives. The four alternatives submitted to the USGS for scenario simulation included various combinations (action or no action) of the proposed Arkansas Valley Conduit, Master Contract, and Interconnect options. The four alternatives were the No Action, Comanche South, Joint Use Pipeline North, and Master Contract Only. Additionally, scenario simulations were done that represented existing conditions (Existing Conditions scenario) in Pueblo Reservoir. Water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, and algal biomass (measured as chlorophyll-a) were simulated. Each of the scenarios was simulated for three contiguous water years representing a wet, average, and dry annual hydrologic cycle. Each selected simulation scenario also was evaluated for differences in direct/indirect effects and cumulative effects on a particular scenario. Analysis of the results for the direct/indirect- and cumulative-effects analyses indicated that, in general, the results were similar for most of the scenarios and comparisons in this report focused on results from the direct/indirect-effects analyses. Scenario simulations that represented existing conditions in Pueblo Reservoir were compared to the No Action scenario to assess changes in water quality from current demands (2006) to projected demands in 2070. Overall, comparisons of the results between the Existing Conditions and the No Action scenarios for water-surface elevations, water temperature, and dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, and total iron concentrations indicated that the annual median values generally were similar for all three simulated years. Additionally, algal groups and chlorophyll-a concentrations (algal biomass) were similar for the Existing Conditions and the No Action scenarios at site 7B in the epilimnion for the simulated period (Water Year 2000 through 2002). The No Action scenario also was compared individually to the Comanche South, Joint Use Pipeline North, and Master Contract Only scenarios. 
These comparisons were made to describe changes in the annual median, 85th percentile, or 15th percentile concentration between the No Action scenario and each of the other three simulation scenarios. Simulated water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, algal groups, and chlorophyll-a concentrations in Pueblo Reservoir generally were similar between the No Action scenario and each of the other three simulation scenarios.

  20. Vapor intrusion risk of lead scavengers 1,2-dibromoethane (EDB) and 1,2-dichloroethane (DCA).

    PubMed

    Ma, Jie; Li, Haiyan; Spiese, Richard; Wilson, John; Yan, Guangxu; Guo, Shaohui

    2016-06-01

    Vapor intrusion of synthetic fuel additives represents a critical yet still neglected problem at sites impacted by petroleum fuel releases. This study used an advanced numerical model to simulate the vapor intrusion risk of lead scavengers 1,2-dibromoethane (ethylene dibromide, EDB) and 1,2-dichloroethane (DCA) under different site conditions. We found that simulated EDB and DCA indoor air concentrations can exceed the USEPA screening levels (4.7 × 10⁻³ μg/m³ for EDB and 1.1 × 10⁻¹ μg/m³ for DCA) if the source concentration is high enough (but still within the concentration range found at leaking UST sites). To evaluate the chance that vapor intrusion of EDB might exceed the USEPA screening levels for indoor air, the simulation results were compared to the distribution of EDB at leaking UST sites in the US. If there is no degradation of EDB or only abiotic degradation of EDB, from 15% to 37% of leaking UST sites might exceed the USEPA screening level. This study supports the statements made by USEPA in the Petroleum Vapor Intrusion (PVI) Guidance that the screening criteria for petroleum hydrocarbon may not provide sufficient protectiveness for fuel releases containing EDB and DCA. Based on a thorough literature review, we also compiled previously published data on the EDB and DCA groundwater source concentrations and their degradation rates. These data are valuable in evaluating EDB and DCA vapor intrusion risk. In addition, a set of refined attenuation factors based on site-specific information (e.g., soil types, source depths, and degradation rates) were provided for establishing site-specific screening criteria for EDB and DCA. Overall, this study points out that lead scavengers EDB and DCA may cause vapor intrusion problems. As more field data of EDB and DCA become available, we recommend that USEPA consider including these data in the existing PVI database and possibly revising the PVI Guidance as necessary. Copyright © 2016 Elsevier Ltd. All rights reserved.
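
    A minimal sketch of the screening comparison described above: an indoor air concentration is estimated from a groundwater source via a Henry's law partitioning step and a groundwater-to-indoor-air attenuation factor, then compared against the USEPA screening level quoted in the abstract. The source concentration, Henry's constant, and attenuation factor are assumed values, not results from the study's numerical model.

```python
# Screening-level sketch: groundwater source -> soil gas (Henry's law) ->
# indoor air (attenuation factor) -> comparison with the EDB screening level.
# Source concentration, Henry's constant, and attenuation factor are assumed.
H_EDB = 0.025            # dimensionless Henry's law constant (assumed)
c_groundwater = 50.0     # ug/L EDB at the source (assumed)
alpha = 1e-4             # groundwater-to-indoor-air attenuation factor (assumed)
screening_level = 4.7e-3 # ug/m3, USEPA indoor-air screening level for EDB (from abstract)

c_soil_gas = c_groundwater * H_EDB * 1000.0   # ug/m3 vapor at the water table
c_indoor = alpha * c_soil_gas                 # ug/m3 predicted indoor air
print(f"indoor air {c_indoor:.3e} ug/m3, exceeds screening level: {c_indoor > screening_level}")
```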

  1. Choice of time-scale in Cox's model analysis of epidemiologic cohort data: a simulation study.

    PubMed

    Thiébaut, Anne C M; Bénichou, Jacques

    2004-12-30

    Cox's regression model is widely used for assessing associations between potential risk factors and disease occurrence in epidemiologic cohort studies. Although age is often a strong determinant of disease risk, authors have frequently used time-on-study instead of age as the time-scale, as for clinical trials. Unless the baseline hazard is an exponential function of age, this approach can yield different estimates of relative hazards than using age as the time-scale, even when age is adjusted for. We performed a simulation study in order to investigate the existence and magnitude of bias for different degrees of association between age and the covariate of interest. Age to disease onset was generated from exponential, Weibull or piecewise Weibull distributions, and both fixed and time-dependent dichotomous covariates were considered. We observed no bias upon using age as the time-scale. Upon using time-on-study, we verified the absence of bias for exponentially distributed age to disease onset. For non-exponential distributions, we found that bias could occur even when the covariate of interest was independent from age. It could be severe in case of substantial association with age, especially with time-dependent covariates. These findings were illustrated on data from a cohort of 84,329 French women followed prospectively for breast cancer occurrence. In view of our results, we strongly recommend not using time-on-study as the time-scale for analysing epidemiologic cohort data. 2004 John Wiley & Sons, Ltd.
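
    The simulation idea can be sketched as follows: the same cohort is analysed with time-on-study and with age as the time-scale (the latter with delayed entry). The sketch uses numpy, pandas, and lifelines, assuming a lifelines version whose CoxPHFitter.fit accepts an entry_col argument for left truncation; the Weibull parameters, follow-up length, and age-covariate association are assumptions, not the paper's simulation design.

```python
# Sketch: fit Cox models to the same simulated cohort using age versus
# time-on-study as the time-scale. Distribution parameters are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
entry_age = rng.uniform(40, 65, n)
x = np.clip((entry_age > 55).astype(float) + rng.binomial(1, 0.2, n), 0, 1)  # age-related covariate

# Weibull (non-exponential) age at disease onset; true hazard ratio for x is exp(0.3) = 1.35.
shape, scale = 3.0, 90.0
u = rng.uniform(size=n)
age_onset = scale * (-np.log(u) / np.exp(0.3 * x)) ** (1 / shape)

followup = 10.0
exit_age = np.minimum(age_onset, entry_age + followup)
event = (age_onset <= entry_age + followup).astype(int)

df = pd.DataFrame({"entry": entry_age, "exit": exit_age,
                   "time_on_study": exit_age - entry_age,
                   "event": event, "x": x})
df = df[df["exit"] > df["entry"]]          # drop prevalent cases at enrolment

# Age as the time-scale (with delayed entry) versus time-on-study as the time-scale.
hr_age = CoxPHFitter().fit(df[["entry", "exit", "event", "x"]],
                           duration_col="exit", event_col="event",
                           entry_col="entry").hazard_ratios_["x"]
hr_tos = CoxPHFitter().fit(df[["time_on_study", "event", "x"]],
                           duration_col="time_on_study",
                           event_col="event").hazard_ratios_["x"]
print(f"true HR 1.35, age scale {hr_age:.2f}, time-on-study scale {hr_tos:.2f}")
```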

  2. [Visual abilities of older drivers--review of driving simulator studies].

    PubMed

    Andysz, Aleksandra; Merecz, Dorota

    2012-01-01

    In the member countries of the Organization for Economic Co-operation and Development (OECD), one in four people will reach the age of 65 or more by 2030, and the population aged over 80 will triple by 2050. Changes that occur in the demographic structure of developed countries will affect the traffic environment. Most existing on-road solutions are inadequate for older people with diminished cognitive and motor abilities. In this group, difficulties in driving performance are associated with reduced cognitive efficiency, vision and hearing loss, and general psychomotor slowing. The presented review focuses on studies of the useful field of view, an indicator considered to be a valid predictor of road accidents, divided attention, susceptibility to distraction and visual search strategies. The major questions of these studies were: which vision parameters determine safe driving, what degree of their deterioration causes significant risk and whether there are opportunities for their rehabilitation. The results indicate that older drivers do exhibit vision and attention deficits, but their engagement in a wide range of compensatory behaviors and effective visual search strategies compensates for these deficits. This shows that older drivers cannot be clearly classified as a group at particular risk of causing road traffic accidents. We should not be alarmed by a growing group of active senior drivers. We should rather use the advantages of available methods, including driving simulators, to predict how the traffic environment will look in the near future and how to make it friendly and safe for everyone.

  3. Notes on recent approaches concerning the Kirchhoff-law-Johnson-noise-based secure key exchange

    NASA Astrophysics Data System (ADS)

    Kish, Laszlo B.; Horvath, Tamas

    2009-08-01

    We critically analyze the results and claims in [P.-L. Liu, Phys. Lett. A 373 (2009) 901]. We show that the strong security leak that appeared in the simulations is only an artifact and not caused by “multiple reflections”. Since no wave modes exist at a cable length of 5% of the shortest wavelength of the signal, no wave is present to reflect. In the high wave impedance limit, the conditions used in the simulations are heavily unphysical (requiring cable diameters up to 28000 times greater than the measured size of the known universe) and the results are modeling artifacts due to the unphysical values. At the low cable impedance limit, the observed artifacts are due to violating the recommended (and tested) conditions by neglecting the cable capacitance restrictions and using a cable about 100 times longer than recommended without a cable capacitance compensation arrangement. We implement and analyze the general circuitry of Liu's circulator [P.-L. Liu, Phys. Lett. A 373 (2009) 901] and confirm that they are conceptually secure against passive attacks. We introduce an asymmetric, more robust version without a feedback loop. Then we crack all these systems by an active attack: a circulator-based man-in-the-middle attack. Finally, we analyze the proposed method to increase security by dropping only high-risk bits. We point out the differences between different types of high-risk bits and show the shortcomings of this strategy for some simple key exchange protocols.

  4. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic, reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches to portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
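
    The Monte-Carlo logic behind a probabilistic inundation map can be sketched in a few lines: uncertain inputs are sampled, a hydraulic calculation produces a water surface, and per-cell inundation frequencies become probabilities. The stand-in stage-discharge relation and tiny synthetic terrain below replace a real hydraulic model and are purely illustrative.

```python
# Monte-Carlo sketch of a probabilistic inundation map: sample uncertain
# discharge and roughness, compute a water surface with a stand-in relation,
# and accumulate per-cell inundation frequencies. All values are illustrative.
import numpy as np

rng = np.random.default_rng(7)
terrain = np.array([[1.0, 1.5, 2.5],
                    [0.5, 1.0, 2.0],
                    [0.2, 0.8, 1.8]])          # ground elevation (m), assumed

n_runs = 2000
inundated = np.zeros_like(terrain)
for _ in range(n_runs):
    q = rng.lognormal(mean=5.0, sigma=0.4)     # uncertain design discharge (m3/s)
    n_manning = rng.uniform(0.03, 0.06)        # uncertain roughness
    stage = 0.3 * (n_manning * q) ** 0.4       # stand-in stage-discharge relation (m)
    inundated += (terrain < stage)

prob_map = inundated / n_runs                  # probability of inundation per cell
print(np.round(prob_map, 2))
```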

  5. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  6. Design of the Experimental Exposure Conditions to Simulate Ionizing Radiation Effects on Candidate Replacement Materials for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Smith, L. Montgomery

    1998-01-01

    In this effort, experimental exposure times for monoenergetic electrons and protons were determined to simulate the space radiation environment effects on Teflon components of the Hubble Space Telescope. Although the energy range of the available laboratory particle accelerators was limited, optimal exposure times for 50 keV, 220 keV, 350 keV, and 500 keV electrons were calculated that produced a dose-versus-depth profile that approximated the full spectrum profile, and were realizable with existing equipment. For the case of proton exposure, the limited energy range of the laboratory accelerator restricted simulation of the dose to a depth of 0.5 mil. Also, while optimal exposure times were found for 200 keV, 500 keV and 700 keV protons that simulated the full spectrum dose-versus-depth profile to this depth, they were of such short duration that the existing laboratory equipment could not be controlled to within the required accuracy. In addition to the obvious experimental issues, other areas exist in which the analytical work could be advanced. Improved computer codes for dose prediction, along with improved methodology for data input and output, would accelerate the calculations and make them more accurate. This is particularly true in the case of proton fluxes, where a paucity of available predictive software appears to exist. The dated nature of many of the existing Monte Carlo particle/radiation transport codes raises the issue as to whether existing codes are sufficient for this type of analysis. Other areas that would result in greater fidelity of laboratory exposure effects to the space environment are the use of a larger number of monoenergetic particle fluxes and improved optimization algorithms to determine the weighting values.
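
    The weighting-value optimization mentioned at the end of the abstract can be illustrated with non-negative least squares: exposure times for a few monoenergetic beams are chosen so that their summed dose-versus-depth profile approximates a target full-spectrum profile. The per-beam depth-dose shapes and the target profile below are synthetic stand-ins, not the report's calculated values.

```python
# Sketch: choose non-negative exposure times for a few monoenergetic electron
# beams so the summed dose-versus-depth profile approximates a target profile.
# Per-beam depth-dose curves and the target are synthetic stand-ins.
import numpy as np
from scipy.optimize import nnls

depths = np.linspace(0.0, 5.0, 50)              # mil into the Teflon layer

def depth_dose(depth, practical_range):
    # crude monoenergetic depth-dose shape: falling to zero at the practical range
    return np.clip(1.0 - (depth / practical_range) ** 2, 0.0, None)

# columns: dose-rate profiles for 50, 220, 350, 500 keV beams (ranges assumed)
beams = np.column_stack([depth_dose(depths, r) for r in (0.6, 1.8, 3.0, 4.5)])
target = 3.0 * np.exp(-depths / 1.5)            # assumed full-spectrum profile

times, residual = nnls(beams, target)           # non-negative exposure times
print("relative exposure times:", np.round(times, 3), "residual:", round(residual, 3))
```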

  7. Defense Portfolio Analysis

    DTIC Science & Technology

    2009-06-01

    [Abstract not available in this record; the retrieved text consists only of table-of-contents fragments referencing a Risk Simulator tool and Palisade @RISK (http://www.palisade.com) modeling, data, and analysis appendices.]

  8. Estimating risks of heat strain by age and sex: a population-level simulation model.

    PubMed

    Glass, Kathryn; Tait, Peter W; Hanna, Elizabeth G; Dear, Keith

    2015-05-18

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan's man model "MANMO") to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older those most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population level risks of heat strain, and a tool for evaluating public health and other government policy interventions.

  9. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  10. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer from high computation costs when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
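
    The Sequential Monte Carlo assimilation step can be sketched with a simple particle filter on a small graph: particles carry candidate occupant counts per node, are propagated with a random-movement model, weighted by noisy sensor counts, and resampled. The three-node graph, movement rate, and sensor noise model are illustrative assumptions rather than the paper's model.

```python
# Particle-filter (Sequential Monte Carlo) sketch for occupancy estimation on a
# tiny graph. Graph, movement rate, and sensor noise are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_particles = 3, 500
adj = {0: [1], 1: [0, 2], 2: [1]}                  # simple corridor graph

particles = rng.multinomial(30, [1 / 3] * n_nodes, size=n_particles).astype(float)

def propagate(state):
    new = state.copy()
    for node, count in enumerate(state):
        movers = rng.binomial(int(count), 0.2)     # 20% of occupants move each step
        for _ in range(movers):
            dest = rng.choice(adj[node])
            new[node] -= 1
            new[dest] += 1
    return new

for sensor_counts in ([12, 10, 8], [9, 13, 8], [7, 14, 9]):   # noisy observations
    particles = np.array([propagate(p) for p in particles])
    # Gaussian likelihood of sensor counts given each particle's state
    w = np.exp(-0.5 * np.sum((particles - sensor_counts) ** 2, axis=1) / 4.0)
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)       # resample
    particles = particles[idx]
    print("estimated occupancy per node:", np.round(particles.mean(axis=0), 1))
```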

  11. Simulation and Shoulder Dystocia.

    PubMed

    Shaddeau, Angela K; Deering, Shad

    2016-12-01

    Shoulder dystocia is an unpredictable obstetric emergency that requires prompt interventions to ensure optimal outcomes. Proper technique is important but difficult to train given the urgent and critical clinical situation. Simulation training for shoulder dystocia allows providers at all levels to practice technical and teamwork skills in a no-risk environment. Programs utilizing simulation training for this emergency have consistently demonstrated improved performance both during practice drills and in actual patients with significantly decreased risks of fetal injury. Given the evidence, simulation training for shoulder dystocia should be conducted at all institutions that provide delivery services.

  12. Fine-Scale Mapping by Spatial Risk Distribution Modeling for Regional Malaria Endemicity and Its Implications under the Low-to-Moderate Transmission Setting in Western Cambodia

    PubMed Central

    Okami, Suguru; Kohtake, Naohiko

    2016-01-01

    The disease burden of malaria has decreased as malaria elimination efforts progress. The mapping approach that uses spatial risk distribution modeling needs some adjustment and reinvestigation in accordance with situational changes. Here we applied a mathematical modeling approach for standardized morbidity ratio (SMR) calculated by annual parasite incidence using routinely aggregated surveillance reports, environmental data such as remote sensing data, and non-environmental anthropogenic data to create fine-scale spatial risk distribution maps of western Cambodia. Furthermore, we incorporated a combination of containment status indicators into the model to demonstrate spatial heterogeneities of the relationship between containment status and risks. The explanatory model was fitted to estimate the SMR of each area (adjusted Pearson correlation coefficient R2 = 0.774; Akaike information criterion AIC = 149.423). A Bayesian modeling framework was applied to estimate the uncertainty of the model and cross-scale predictions. Fine-scale maps were created by the spatial interpolation of estimated SMRs at each village. Compared with geocoded case data, corresponding predicted values showed conformity [Spearman’s rank correlation r = 0.662 in the inverse distance weighted interpolation and 0.645 in ordinary kriging (95% confidence intervals of 0.414–0.827 and 0.368–0.813, respectively), Welch’s t-test, not significant]. The proposed approach successfully explained regional malaria risks and fine-scale risk maps were created under low-to-moderate malaria transmission settings where reinvestigations of existing risk modeling approaches were needed. Moreover, different representations of simulated outcomes of containment status indicators for respective areas provided useful insights for tailored interventional planning, considering regional malaria endemicity. PMID:27415623
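
    Two of the building blocks named above, the standardized morbidity ratio and inverse distance weighted interpolation, can be sketched directly. The village coordinates, case counts, and IDW power below are illustrative assumptions.

```python
# Sketch of two building blocks: village-level SMR (observed / expected cases)
# and inverse distance weighted (IDW) interpolation onto a grid point.
# Coordinates, counts, and the IDW power are illustrative assumptions.
import numpy as np

villages = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # (x, y)
observed = np.array([12, 3, 7, 20])        # observed cases per village
expected = np.array([10.0, 6.0, 8.0, 9.0]) # expected cases from the reference rate
smr = observed / expected

def idw(point, coords, values, power=2.0):
    d = np.linalg.norm(coords - point, axis=1)
    if np.any(d < 1e-9):
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * values) / np.sum(w)

print("village SMRs:", np.round(smr, 2))
print("interpolated SMR at (0.5, 0.5):", round(idw(np.array([0.5, 0.5]), villages, smr), 2))
```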

  13. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
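
    A limiting conditional (quasi-stationary) distribution can be computed for a small example chain as the normalized left eigenvector of the transient-state transition sub-matrix associated with its largest eigenvalue. The three-state disease transition matrix below is an illustrative assumption, not a model from the article.

```python
# Sketch: limiting conditional (quasi-stationary) distribution of a Markov chain
# with an absorbing death state, via the dominant left eigenvector of the
# transient sub-matrix. The 3-state transition matrix is an assumption.
import numpy as np

# yearly transition probabilities among transient states mild, moderate, severe
# (rows need not sum to 1; the remainder is the probability of death)
Q = np.array([[0.90, 0.07, 0.01],
              [0.05, 0.80, 0.10],
              [0.00, 0.05, 0.80]])

eigvals, left_vecs = np.linalg.eig(Q.T)      # eigenvectors of Q.T = left eigenvectors of Q
k = np.argmax(eigvals.real)
pi = np.abs(left_vecs[:, k].real)
pi /= pi.sum()                               # equilibrium mix of surviving patients
print("limiting conditional distribution (mild, moderate, severe):", np.round(pi, 3))
```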

  14. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; because of the capability limitations of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (deployment) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for embattling optimization of a ground-based radar surveillance network is to run detection simulations of all possible stations against cataloged data, make a comprehensive comparative analysis of the various simulation results with a combinational method, and then select an optimal result as the station layout scheme. This method is time consuming for a single simulation and has high computational complexity for the combinational analysis; as the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled with the traditional method, and no better way to solve this problem has been available until now. In this paper, the target detection procedure was simplified. Firstly, the space coverage of ground-based radar was simplified and a space coverage projection model of radar facilities at different orbit altitudes was built; then a simplified model of objects crossing the radar coverage was established according to the characteristics of space object orbital motion. After these two simplification steps, the computational complexity of target detection was greatly reduced, and simulation results showed the correctness of the simplified approach. In addition, the detection areas of a ground-based radar network can be easily computed with the simplified model, and the embattling of the ground-based radar surveillance network can then be optimized with an artificial intelligence algorithm, which greatly reduces the computational complexity. Compared with the traditional method, the proposed method greatly improves computational efficiency.
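
    Once each candidate site's detection coverage has been pre-computed with the simplified model, one simple way to illustrate the layout optimization is a greedy heuristic that repeatedly adds the site covering the most not-yet-covered objects. The candidate sites, covered-object sets, and station budget below are assumptions, and the paper itself uses an artificial intelligence algorithm rather than this greedy rule.

```python
# Greedy station-selection sketch: repeatedly pick the candidate site that adds
# the most not-yet-covered objects. Sites, coverage sets, and the budget are
# illustrative assumptions; the paper uses an AI algorithm, not this heuristic.
def greedy_station_layout(candidate_coverage, n_stations):
    covered, chosen = set(), []
    for _ in range(n_stations):
        best = max(candidate_coverage,
                   key=lambda s: len(candidate_coverage[s] - covered))
        chosen.append(best)
        covered |= candidate_coverage[best]
    return chosen, covered

coverage = {                      # object IDs detectable from each candidate site
    "site_A": {1, 2, 3, 4},
    "site_B": {3, 4, 5},
    "site_C": {5, 6, 7, 8},
    "site_D": {1, 8},
}
print(greedy_station_layout(coverage, n_stations=2))
```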

  15. Simulating the influence of groundwater table fluctuation on vapor intrusion

    NASA Astrophysics Data System (ADS)

    Huo, J.

    2017-12-01

    The migration of volatile chemicals from groundwater to an overlying building is a common phenomenon worldwide. Because of the differences in hydrologic conditions among vapor intrusion sites, it is necessary to consider the effect of dominant hydrologic factors in order to obtain a precise site evaluation and health risk assessment during the screening process. This study mainly discusses the impact of groundwater table fluctuation and other hydrological factors, including porosity, permeability and soil moisture, on vapor intrusion transport. A two-dimensional model is configured to inject different typical volatile organic contaminants from EPA's Vapor Intrusion Database. By quantifying the contaminant vapor concentration attenuation factors under the effect of groundwater table fluctuation, this study provides suggestions for indoor air sampling and vapor intrusion assessment.

  16. Risk Assessment of Carbon Sequestration into A Naturally Fractured Reservoir at Kevin Dome, Montana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Minh; Onishi, Tsubasa; Carey, James William

    In this report, we describe risk assessment work done using the National Risk Assessment Partnership (NRAP) tools applied to CO2 storage at Kevin Dome, Montana. Geologic CO2 sequestration in saline aquifers poses certain risks, including CO2/brine leakage through wells or non-sealing faults into groundwater or to the land surface. These risks are difficult to quantify due to limited data availability and uncertainty. One solution is to explore the consequences of these limitations by running large numbers of numerical simulations on the primary CO2 injection reservoir, shallow reservoirs/aquifers, faults, and wells to assess leakage risks and uncertainties. However, a large number of full-physics simulations is usually too computationally expensive. The NRAP integrated assessment model (NRAP-IAM) uses reduced order models (ROMs) developed from full-physics simulations to address this issue. A powerful stochastic framework allows NRAP-IAM to explore complex interactions among many uncertain variables and evaluate the likely performance of potential sequestration sites.

  17. The Analysis of Rush Orders Risk in Supply Chain: A Simulation Approach

    NASA Technical Reports Server (NTRS)

    Mahfouz, Amr; Arisha, Amr

    2011-01-01

    Satisfying customers by delivering demands at the agreed time, at competitive prices, and at a satisfactory quality level are crucial requirements for supply chain survival. The incidence of risks in a supply chain often causes sudden disruptions in its processes and consequently leads to customers losing their trust in a company's competence. Rush orders are considered to be one of the main types of supply chain risk due to their negative impact on overall performance. Using integrated definition modeling approaches (i.e. IDEF0 & IDEF3) and a simulation modeling technique, a comprehensive integrated model has been developed to assess rush order risks and examine two risk mitigation strategies. Detailed function sequences and object flows were conceptually modeled to reflect the macro and micro levels of the studied supply chain. Discrete event simulation models were then developed to assess and investigate the mitigation strategies for rush order risks, with the objective of minimizing order cycle time and cost.
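
    The discrete event simulation idea can be sketched with SimPy: a single production resource serves orders, rush orders jump the queue via priority, and order cycle times are recorded. The arrival rate, processing time, and rush-order share are illustrative assumptions, not data from the studied supply chain.

```python
# Discrete-event sketch with SimPy: one machine, regular and rush orders, where
# rush orders jump the queue via priority (no preemption of in-service jobs).
# Arrival rate, processing time, and rush share are illustrative assumptions.
import random
import simpy

random.seed(42)
cycle_times = {"regular": [], "rush": []}

def order(env, kind, machine):
    arrive = env.now
    priority = 0 if kind == "rush" else 1               # lower value = served first
    with machine.request(priority=priority) as req:
        yield req
        yield env.timeout(random.expovariate(1 / 2.0))  # ~2 h processing
    cycle_times[kind].append(env.now - arrive)

def generator(env, machine):
    while True:
        yield env.timeout(random.expovariate(1 / 3.0))  # ~1 order every 3 h
        kind = "rush" if random.random() < 0.2 else "regular"
        env.process(order(env, kind, machine))

env = simpy.Environment()
machine = simpy.PriorityResource(env, capacity=1)
env.process(generator(env, machine))
env.run(until=500)

for kind, times in cycle_times.items():
    if times:
        print(kind, "orders:", len(times),
              "mean cycle time:", round(sum(times) / len(times), 1), "h")
```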

  18. Simulated Order Verification and Medication Reconciliation during an Introductory Pharmacy Practice Experience.

    PubMed

    Metzger, Nicole L; Chesson, Melissa M; Momary, Kathryn M

    2015-09-25

    Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient's medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist's role in order verification and medication reconciliation, as well as improve clinical decision-making.

  19. Assessing risk-adjustment approaches under non-random selection.

    PubMed

    Luft, Harold S; Dudley, R Adams

    2004-01-01

    Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.

  20. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    NASA Technical Reports Server (NTRS)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for real time simulation of rotary wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as exist in typical flight control systems and rotor blade hinges, is discussed.

  1. OSS: OSSOS Survey Simulator

    NASA Astrophysics Data System (ADS)

    Petit, J.-M.; Kavelaars, J. J.; Gladman, B.; Alexandersen, M.

    2018-05-01

    Comparing properties of discovered trans-Neptunian Objects (TNOs) with dynamical models is impossible due to the observational biases that exist in surveys. The OSSOS Survey Simulator takes an intrinsic orbital model (from, for example, the output of a dynamical Kuiper belt emplacement simulation) and applies the survey biases, so the biased simulated objects can be directly compared with real discoveries.

  2. Simulation Using Novel Equipment Designed to Explain Spirometric Abnormalities in Respiratory Disease Enhances Learning in Higher Cognitive Domains

    ERIC Educational Resources Information Center

    Jamison, J. P.; Stewart, M. T.

    2015-01-01

    Simulation of disorders of respiratory mechanics shown by spirometry provides insight into the pathophysiology of disease but some clinically important disorders have not been simulated and none have been formally evaluated for education. We have designed simple mechanical devices which, along with existing simulators, enable all the main…

  3. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    PubMed

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Transforming GIS data into functional road models for large-scale traffic simulation.

    PubMed

    Wilkie, David; Sewall, Jason; Lin, Ming C

    2012-06-01

    There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization, tools that will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodologies. We test the 3D models of road networks generated by our algorithm on real-time traffic simulation using both macroscopic and microscopic techniques.

  5. Spatio-Temporal Modelling of Dust Transport over Surface Mining Areas and Neighbouring Residential Zones.

    PubMed

    Matejicek, Lubos; Janour, Zbynek; Benes, Ludek; Bodnar, Tomas; Gulikova, Eva

    2008-06-06

    Projects focusing on spatio-temporal modelling of the living environment need to manage a wide range of terrain measurements, existing spatial data, time series, results of spatial analysis and inputs/outputs from numerical simulations. Thus, GISs are often used to manage data from remote sensors, to provide advanced spatial analysis and to integrate numerical models. In order to demonstrate the integration of spatial data, time series and methods in the framework of the GIS, we present a case study focused on the modelling of dust transport over a surface coal mining area, exploring spatial data from 3D laser scanners, GPS measurements, aerial images, time series of meteorological observations, inputs/outputs from numerical models and existing geographic resources. To achieve this, digital terrain models, layers including GPS thematic mapping, and scenes with simulation of wind flows are created to visualize and interpret coal dust transport over the mine area and a neighbouring residential zone. A temporary coal storage and sorting site, located near the residential zone, is one of the dominant sources of emissions. Using numerical simulations, the possible effects of wind flows are observed over the surface, modified by natural objects and man-made obstacles. The coal dust drifts with the wind in the direction of the residential zone and is partially deposited in this area. The simultaneous display of the digital map layers together with the location of the dominant emission source, wind flows and protected areas enables a risk assessment of the dust deposition in the area of interest to be performed. In order to obtain a more accurate simulation of wind flows over the temporary storage and sorting site, 3D laser scanning and GPS thematic mapping are used to create a more detailed digital terrain model. Thus, visualization of wind flows over the area of interest combined with 3D map layers enables the exploration of the processes of coal dust deposition at a local scale. In general, this project could be used as a template for dust-transport modelling which couples spatial data focused on the construction of digital terrain models and thematic mapping with data generated by numerical simulations based on Reynolds averaged Navier-Stokes equations.

  6. Spatio-Temporal Modelling of Dust Transport over Surface Mining Areas and Neighbouring Residential Zones

    PubMed Central

    Matejicek, Lubos; Janour, Zbynek; Benes, Ludek; Bodnar, Tomas; Gulikova, Eva

    2008-01-01

    Projects focusing on spatio-temporal modelling of the living environment need to manage a wide range of terrain measurements, existing spatial data, time series, results of spatial analysis and inputs/outputs from numerical simulations. Thus, GISs are often used to manage data from remote sensors, to provide advanced spatial analysis and to integrate numerical models. In order to demonstrate the integration of spatial data, time series and methods in the framework of the GIS, we present a case study focused on the modelling of dust transport over a surface coal mining area, exploring spatial data from 3D laser scanners, GPS measurements, aerial images, time series of meteorological observations, inputs/outputs from numerical models and existing geographic resources. To achieve this, digital terrain models, layers including GPS thematic mapping, and scenes with simulation of wind flows are created to visualize and interpret coal dust transport over the mine area and a neighbouring residential zone. A temporary coal storage and sorting site, located near the residential zone, is one of the dominant sources of emissions. Using numerical simulations, the possible effects of wind flows are observed over the surface, modified by natural objects and man-made obstacles. The coal dust drifts with the wind in the direction of the residential zone and is partially deposited in this area. The simultaneous display of the digital map layers together with the location of the dominant emission source, wind flows and protected areas enables a risk assessment of the dust deposition in the area of interest to be performed. In order to obtain a more accurate simulation of wind flows over the temporary storage and sorting site, 3D laser scanning and GPS thematic mapping are used to create a more detailed digital terrain model. Thus, visualization of wind flows over the area of interest combined with 3D map layers enables the exploration of the processes of coal dust deposition at a local scale. In general, this project could be used as a template for dust-transport modelling which couples spatial data focused on the construction of digital terrain models and thematic mapping with data generated by numerical simulations based on Reynolds averaged Navier-Stokes equations. PMID:27879911

  7. CFD Simulation of Spread Risks of Infectious Disease due to Interactive Wind and Ventilation Airflows via Window Openings in High-Rise Buildings

    NASA Astrophysics Data System (ADS)

    Niu, J. L.; Gao, N. P.

    2010-05-01

    One of the concerns is that there may exist multiple infectious disease transmission routes across households in high-rise residential buildings, one of which is the natural ventilative airflow through open windows between flats, caused by buoyancy effects. This study presents the modeling of this cascade effect using a computational fluid dynamics (CFD) technique. It is found that the concentration of pollutants generated on a lower floor is generally about two orders of magnitude lower on the floor immediately above, but the risk of infection calculated by the Wells-Riley equation is only around one order of magnitude lower. It is found that, with single-side open-window conditions, wind blowing perpendicularly to the building may either reinforce or suppress the upward transport, depending on the wind speed. High-speed winds can restrain the convective transfer of heat and mass between flats, functioning like an air curtain. Despite the complexities of the air flow involved, it is clear that this transmission route should be taken into account in infection control.
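
    The Wells-Riley equation used in the study converts an inhaled quantum dose into an infection probability, P = 1 - exp(-Iqpt/Q). The sketch below evaluates it for a source flat and for an upstairs flat receiving strongly diluted air; the occupancy, quanta generation rate, breathing rate, exposure time, and effective ventilation rates are assumed values, not the paper's CFD results. Because the source-flat risk saturates near 1, a large drop in concentration translates into a smaller relative drop in risk, which is the qualitative effect noted in the abstract.

```python
# Wells-Riley sketch: P = 1 - exp(-I*q*p*t/Q). All parameter values below are
# illustrative assumptions, not results from the paper's CFD simulations.
import math

def wells_riley(I, q, p, t, Q):
    """Infection probability for susceptibles sharing air with I infectors."""
    return 1.0 - math.exp(-I * q * p * t / Q)

I, q, p, t = 1, 50.0, 0.5, 8.0      # infectors, quanta/h, m3/h breathing rate, hours
for label, Q in [("source flat, Q = 50 m3/h", 50.0),
                 ("upstairs flat, effective Q = 5000 m3/h (100x dilution)", 5000.0)]:
    print(label, "-> infection risk =", round(wells_riley(I, q, p, t, Q), 4))
```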

  8. The design and progress of a multidomain lifestyle intervention to improve brain health in middle-aged persons to reduce later Alzheimer's disease risk: The Gray Matters randomized trial.

    PubMed

    Norton, Maria C; Clark, Christine J; Tschanz, JoAnn T; Hartin, Phillip; Fauth, Elizabeth B; Gast, Julie A; Dorsch, Travis E; Wengreen, Heidi; Nugent, Chris; Robinson, W David; Lefevre, Michael; McClean, Sally; Cleland, Ian; Schaefer, Sydney Y; Aguilar, Sheryl

    2015-06-01

    Most Alzheimer's disease (AD) prevention studies focus on older adults or persons with existing cognitive impairment. This study describes the design and progress of a novel pilot intervention, the Gray Matters study. This proof-of-concept randomized controlled trial tests an evidence-based multidomain lifestyle intervention in 146 persons aged 40 to 64 years, in northern Utah. Data collectors were blinded to participants' randomization to treatment (n = 104) or control (n = 42). The intervention targeted physical activity, food choices, social engagement, cognitive stimulation, sleep quality, and stress management, and used a custom smartphone application, activity monitor, and educational materials. Secondary outcomes include biomarkers, body mass index, cognitive testing, and psychological surveys. Midway through the study, achievements include a 98.7% retention rate, a 96% rate of compliance with app data entry, and positive trends in behavioral change. Participants were empowered, learning that lifestyle might impact AD risk, and have exhibited positive behavioral changes thus far.

  9. Comparing and combining biomarkers as principal surrogates for time-to-event clinical endpoints.

    PubMed

    Gabriel, Erin E; Sachs, Michael C; Gilbert, Peter B

    2015-02-10

    Principal surrogate endpoints are useful as targets for phase I and II trials. In many recent trials, multiple post-randomization biomarkers are measured. However, few statistical methods exist for comparison of or combination of biomarkers as principal surrogates, and none of these methods to our knowledge utilize time-to-event clinical endpoint information. We propose a Weibull model extension of the semi-parametric estimated maximum likelihood method that allows for the inclusion of multiple biomarkers in the same risk model as multivariate candidate principal surrogates. We propose several methods for comparing candidate principal surrogates and evaluating multivariate principal surrogates. These include the time-dependent and surrogate-dependent true and false positive fraction, the time-dependent and the integrated standardized total gain, and the cumulative distribution function of the risk difference. We illustrate the operating characteristics of our proposed methods in simulations and outline how these statistics can be used to evaluate and compare candidate principal surrogates. We use these methods to investigate candidate surrogates in the Diabetes Control and Complications Trial. Copyright © 2014 John Wiley & Sons, Ltd.

  10. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.

  11. Modeling the Population Health Impact of Introducing a Modified Risk Tobacco Product into the U.S. Market.

    PubMed

    Djurdjevic, Smilja; Lee, Peter N; Weitkunat, Rolf; Sponsiello-Wang, Zheng; Lüdicke, Frank; Baker, Gizelle

    2018-05-16

    Philip Morris International (PMI) has developed the Population Health Impact Model (PHIM) to quantify, in the absence of epidemiological data, the effects of marketing a candidate modified risk tobacco product (cMRTP) on the public health of a whole population. Simulations were performed to understand the harm reduction impact on the U.S. population over a 20-year period under various scenarios. The overall reduction in smoking-attributable deaths (SADs) over the 20-year period was estimated as 934,947 if smoking ceased entirely and between 516,944 and 780,433 if cMRTP use completely replaced smoking. The reduction in SADs was estimated as 172,458 for the World Health Organization (WHO) 2025 Target and between 70,274 and 90,155 for gradual cMRTP uptake. Combining the scenarios (WHO 2025 Target and cMRTP uptake), the reductions were between 256,453 and 268,796, depending on the cMRTP relative exposure. These results show how a cMRTP can reduce overall population harm in addition to existing tobacco control efforts.

  12. [Incentive for Regional Risk Selection in the German Risk Structure Compensation Scheme].

    PubMed

    Wende, Danny

    2017-10-01

    The introduction of the new law GKV-FQWG strengthens competition between statutory health insurers. If incentives for risk selection exist, they may trigger a battle for low-cost customers. This study aims to document and discuss incentives for regional risk selection in the German risk structure compensation scheme. Regional autocorrelation was identified with Moran's I on financial parameters of the risk structure compensation scheme. Incentives for regional risk selection do indeed exist. The risk structure compensation scheme reduces 91% of the effect and helps to reduce risk selection. Nevertheless, a connection between regional situation and competition could be shown (correlation: 69.5%). Only the integration of regional control variables into the risk compensation eliminates regional autocorrelation. The current risk structure compensation is leading to regional inequalities and, as a consequence, to risk selection and distortion of competition. © Georg Thieme Verlag KG Stuttgart · New York.
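    For reference, Moran's I measures spatial autocorrelation as I = (n / sum_ij w_ij) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2. A minimal sketch of that statistic is shown below; the weight matrix and regional values are toy illustrations, not the study's data.

      import numpy as np

      def morans_i(values, weights):
          """Moran's I for a vector of regional values and a spatial weight matrix."""
          x = np.asarray(values, dtype=float)
          w = np.asarray(weights, dtype=float)
          n = x.size
          dev = x - x.mean()
          numerator = (w * np.outer(dev, dev)).sum()
          denominator = (dev ** 2).sum()
          return (n / w.sum()) * numerator / denominator

      # Toy example: four regions on a line, each bordering the next.
      values = [1.0, 2.0, 8.0, 9.0]
      weights = [[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]]
      print(morans_i(values, weights))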

  13. In-Vivo Assessment of Femoral Bone Strength Using Finite Element Analysis (FEA) Based on Routine MDCT Imaging: A Preliminary Study on Patients with Vertebral Fractures

    PubMed Central

    Liebl, Hans; Garcia, Eduardo Grande; Holzner, Fabian; Noel, Peter B.; Burgkart, Rainer; Rummeny, Ernst J.; Baum, Thomas; Bauer, Jan S.

    2015-01-01

    Purpose To experimentally validate a non-linear finite element analysis (FEA) modeling approach assessing in-vitro fracture risk at the proximal femur, and to transfer the method to standard in-vivo multi-detector computed tomography (MDCT) data of the hip, aiming to predict additional hip fracture risk in subjects with and without osteoporosis-associated vertebral fractures, using bone mineral density (BMD) measurements as the gold standard. Methods One fresh-frozen human femur specimen was mechanically tested and fractured simulating stance and clinically relevant fall loading configurations to the hip. After experimental in-vitro validation, the FEA simulation protocol was transferred to standard contrast-enhanced in-vivo MDCT images to calculate individual hip fracture risk for 4 subjects each with and without a history of osteoporotic vertebral fractures, matched by age and gender. In addition, FEA-based risk factor calculations were compared to manual femoral BMD measurements of all subjects. Results In-vitro simulations showed good correlation with the experimentally measured strains in both stance (R² = 0.963) and fall configuration (R² = 0.976). The simulated maximum stress overestimated the experimental failure load (4743 N) by 14.7% (5440 N), while the simulated maximum strain overestimated it by 4.7% (4968 N). The simulated failed elements coincided precisely with the experimentally determined fracture locations. BMD measurements in subjects with a history of osteoporotic vertebral fractures did not differ significantly from subjects without fragility fractures (femoral head: p = 0.989; femoral neck: p = 0.366), but showed higher FEA-based risk factors for additional incident hip fractures (p = 0.028). Conclusion FEA simulations were successfully validated by elastic and destructive in-vitro experiments. In the subsequent in-vivo analyses, MDCT-based FEA risk factor differences for additional hip fractures were not mirrored by corresponding BMD measurements. Our data suggest that MDCT-derived FEA models may assess bone strength more accurately than BMD measurements alone, providing a valuable in-vivo fracture risk assessment tool. PMID:25723187

  14. Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin

    USGS Publications Warehouse

    Massada, Avi Bar; Radeloff, Volker C.; Stewart, Susan I.; Hawbaker, Todd J.

    2009-01-01

    The rapid growth of housing in and near the wildland–urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models: Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT) in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    Colorado School of Mines conducted research and training in the development and validation of an advanced CO2 GS (Geological Sequestration) probabilistic simulation and risk assessment model. CO2 GS simulation and risk assessment is used to develop advanced numerical simulation models of the subsurface to forecast CO2 behavior and transport; optimize site operational practices; ensure site safety; and refine site monitoring, verification, and accounting efforts. As simulation models are refined with new data, the uncertainty surrounding the identified risks decreases, thereby providing more accurate risk assessment. The models considered the full coupling of multiple physical processes (geomechanical and fluid flow) and describe the effects of stochastic hydro-mechanical (H-M) parameters on the modeling of CO2 flow and transport in fractured porous rocks. Graduate students were involved in the development and validation of the model, which can be used to predict the fate, movement, and storage of CO2 in subsurface formations, and to evaluate the risk of potential leakage to the atmosphere and underground aquifers. The main contributions from the project include the development of: 1) an improved procedure to rigorously couple the simulations of hydro-thermomechanical (H-M) processes involved in CO2 GS; 2) models for the hydro-mechanical behavior of fractured porous rocks with random fracture patterns; and 3) probabilistic methods to account for the effects of stochastic fluid flow and geomechanical properties on flow, transport, storage and leakage associated with CO2 GS. The research project provided the means to educate and train graduate students in the science and technology of CO2 GS, with a focus on geologic storage. Specifically, the training included the investigation of an advanced CO2 GS simulation and risk assessment model that can be used to predict the fate, movement, and storage of CO2 in underground formations, and the evaluation of the risk of potential CO2 leakage to the atmosphere and underground aquifers.

  16. Transfer of training and simulator qualification or myth and folklore in helicopter simulation

    NASA Technical Reports Server (NTRS)

    Dohme, Jack

    1992-01-01

    Transfer of training studies at Fort Rucker using the backward-transfer paradigm have shown that existing flight simulators are not entirely adequate for meeting training requirements. Using an ab initio training research simulator (a simulation of the UH-1), training effectiveness ratios were developed; the data demonstrate it to be a cost-effective primary trainer. A simulator qualification method was suggested in which a combination of these transfer-of-training paradigms is used to determine overall simulator fidelity and training effectiveness.

  17. A stochastic agent-based model of pathogen propagation in dynamic multi-relational social networks

    PubMed Central

    Khan, Bilal; Dombrowski, Kirk; Saad, Mohamed

    2015-01-01

    We describe a general framework for modeling and stochastic simulation of epidemics in realistic dynamic social networks, which incorporates heterogeneity in the types of individuals, types of interconnecting risk-bearing relationships, and types of pathogens transmitted across them. Dynamism is supported through arrival and departure processes, continuous restructuring of risk relationships, and changes to pathogen infectiousness, as mandated by natural history; dynamism is regulated through constraints on the local agency of individual nodes and their risk behaviors, while simulation trajectories are validated using system-wide metrics. To illustrate its utility, we present a case study that applies the proposed framework towards a simulation of HIV in artificial networks of intravenous drug users (IDUs) modeled using data collected in the Social Factors for HIV Risk survey. PMID:25859056
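    As a loose illustration of stochastic pathogen propagation over a dynamic contact network (a toy sketch, not the authors' multi-relational framework or its HIV/IDU parameterization), the snippet below runs a discrete-time susceptible-infected process on a random graph whose risk relationships partially dissolve and re-form each step; all parameters are arbitrary.

      import random

      def simulate_si(n_agents=200, edge_prob=0.02, beta=0.10,
                      rewire_frac=0.20, steps=50, seed=1):
          """Discrete-time susceptible-infected process on a dynamic random contact network."""
          rng = random.Random(seed)
          agents = range(n_agents)
          infected = {0}  # a single index case

          def random_edges():
              return {(i, j) for i in agents for j in agents
                      if i < j and rng.random() < edge_prob}

          edges = random_edges()
          for _ in range(steps):
              # transmission across current risk-bearing relationships
              newly_infected = set()
              for i, j in edges:
                  if (i in infected) != (j in infected) and rng.random() < beta:
                      newly_infected.add(j if i in infected else i)
              infected |= newly_infected
              # crude dynamism: a fraction of relationships dissolve and re-form each step
              kept = {e for e in edges if rng.random() > rewire_frac}
              edges = kept | {e for e in random_edges() if rng.random() < rewire_frac}
          return len(infected)

      print(simulate_si())  # final number infected in one stochastic trajectory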

  18. A Search for Cosmic String Loops Using GADGET-2 Cosmological N-Body Simulator

    NASA Astrophysics Data System (ADS)

    Braverman, William; Cousins, Bryce; Jia, Hewei

    2018-01-01

    Cosmic string loops are hypothetical entities that have eluded physicists and astronomers since their existence was postulated in the 1970s. Finding evidence of their existence could provide the first empirical evidence for string theory. Simulating their basic motion in a cold dark matter background using GADGET-2 allows us to predict where they may cluster during large-scale structure formation (if they cluster at all). Here, we present our progress in placing cosmic strings into GADGET-2 with their basic equations of motion, to lay the groundwork for more complex simulations of where these strings cluster. Ultimately, these simulations could indicate where future microlensing and gravitational wave observatories should look for cosmic strings.

  19. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is not simple when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) lack of knowledge about input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach is proposed for the simultaneous treatment of random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, was used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) over various time horizons. The LandSim-simulated groundwater concentrations of six heavy metals were then used in the health risk model. Water intake, exposure duration, exposure frequency, bioavailability, and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the sum of hazard quotients for the heavy metals Co, Cu, Mn, Ni, Zn and Fe, was quantified for male and female populations; HI was found to be high (HI > 1) for all considered time horizons, which indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
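    A stripped-down sketch of the Monte Carlo part of such a non-carcinogenic risk calculation is shown below, using the standard hazard quotient form HQ = (C × IR × EF × ED) / (BW × AT × RfD) and summing quotients into a hazard index. The distributions, reference doses, and the treatment of the fuzzy exposure factors as fixed point values here are illustrative simplifications, not the study's actual inputs.

      import random

      # Hypothetical oral reference doses (mg/kg-day) and concentration
      # parameters (mg/L) for two metals; placeholder values only.
      RFD = {"Ni": 0.02, "Zn": 0.3}
      CONC_MEAN_SD = {"Ni": (0.05, 0.02), "Zn": (0.8, 0.3)}

      # Exposure factors treated as point values here (the study treats them as fuzzy).
      IR, EF, ED, AT = 2.0, 350.0, 30.0, 30.0 * 365.0   # L/day, days/yr, yr, days

      def hazard_index(rng):
          bw = max(rng.gauss(60.0, 10.0), 30.0)          # body weight, kg
          hi = 0.0
          for metal, (mu, sd) in CONC_MEAN_SD.items():
              c = max(rng.gauss(mu, sd), 0.0)            # groundwater concentration
              cdi = c * IR * EF * ED / (bw * AT)         # chronic daily intake
              hi += cdi / RFD[metal]                     # hazard quotient
          return hi

      rng = random.Random(42)
      samples = [hazard_index(rng) for _ in range(10_000)]
      print("P(HI > 1) =", sum(h > 1.0 for h in samples) / len(samples))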

  20. Supporting Parents: How Six Decades of Parenting Research Can Inform Policy and Best Practice. Social Policy Report. Volume 30, Number 5

    ERIC Educational Resources Information Center

    Teti, Douglas M.; Cole, Pamela M.; Cabrera, Natasha; Goodman, Sherryl H.; McLoyd, Vonnie C.

    2017-01-01

    In this paper, we call attention to the need to expand existing efforts and to develop policies, programs, and best practices in the United States designed to support parents at risk and promote parenting competence. Despite the existence of some services offered to parents of children at risk due to developmental delay or at economic risk, the…

  1. Introduction of a new laboratory test: an econometric approach with the use of neural network analysis.

    PubMed

    Jabor, A; Vlk, T; Boril, P

    1996-04-15

    We designed a simulation model for assessing the financial risks involved when a new diagnostic test is introduced in the laboratory. The model is based on a neural network consisting of ten neurons and assumes that input entities can be assigned appropriate uncertainties. Simulations are run on a 1-day interval basis. Risk analysis completes the model, and the financial effects are evaluated for a selected time period. The basic output of the simulation consists of total expenses and income during the simulation time, the net present value of the project at the end of simulation, the total number of control samples during simulation, the total number of patients evaluated, and the total number of kits used.
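    The net present value reported as a simulation output is the usual discounted sum NPV = Σ_t CF_t / (1 + r)^t. A minimal sketch is given below; the daily cash flows and discount rate are invented purely for illustration.

      def net_present_value(cash_flows, annual_rate, periods_per_year=365):
          """Discounted sum of per-period cash flows (index 0 = first simulated day)."""
          rate = annual_rate / periods_per_year
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      # Illustrative: an up-front kit purchase followed by small daily net income.
      daily_cash_flows = [-5000.0] + [40.0] * 364
      print(round(net_present_value(daily_cash_flows, annual_rate=0.08), 2))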

  2. Increased Susceptibility to Chemical Toxicity with Pre-existing ...

    EPA Pesticide Factsheets

    Numerous host and environmental factors may modulate vulnerability and risk. An area of increasing interest to risk assessors is the potential for chemicals to interact with pre-existing diseases and aging, which may yield cumulative damage, altered chemical response, and increased disease susceptibility. We evaluated the relationships between chemicals and pre-existing disease and identified the type of information needed to evaluate the relationships of interest. Key among these is the existence of a clinically relevant and easy-to-measure biomarker of disease risk that is also modulated by a particular chemical of interest. This biomarker may be a physiological, biochemical, or genetic indicator that corresponds to a phase of the disease process and may indicate where an individual is on the continuum of disease or health status. The relationship between chemical exposure and a biomarker may then be used to predict how pre-existing conditions may modify the health risks of chemical exposures. Several case studies are explored to describe the toxic chemical, the clinical biomarker, the impacted disease, and the evidence that the chemical enhances disease risk: fine particulate matter/decreased heart rate variability/increased cardiopulmonary events; cadmium/decreased glomerular filtration rate/increased chronic kidney disease; methyl mercury/decreased paraoxonase-1/increased cardiovascular risk; trichloroethylene/increased anti-nuclear antibody/autoimmunity.

  3. Global and local scale flood discharge simulations in the Rhine River basin for flood risk reduction benchmarking in the Flagship Project

    NASA Astrophysics Data System (ADS)

    Gädeke, Anne; Gusyev, Maksym; Magome, Jun; Sugiura, Ai; Cullmann, Johannes; Takeuchi, Kuniyoshi

    2015-04-01

    Global flood risk assessment is a prerequisite for setting the global measurable targets of the post-Hyogo Framework for Action (HFA) that mobilize international cooperation and national coordination towards disaster risk reduction (DRR), and it requires the establishment of a uniform flood risk assessment methodology at various scales. To address these issues, the International Flood Initiative (IFI) launched a Flagship Project in 2013 to support flood risk reduction benchmarking at global, national and local levels. The Flagship Project road map plans to identify the original risk (1), to identify the reduced risk (2), and to facilitate risk reduction actions (3). In order to achieve this goal at global, regional and local scales, international research collaboration is absolutely necessary, involving domestic and international institutes, academia and research networks such as UNESCO International Centres. The joint collaboration between ICHARM and BfG was the first attempt, producing the first-step (1a) results on flood discharge estimates, with inundation maps under way. As a result of this collaboration, we demonstrate the outcomes of the first step of the IFI Flagship Project to identify flood hazard in the Rhine river basin at the global and local scale. In our assessment, we utilized a distributed hydrological Block-wise TOP (BTOP) model at 20-km and 0.5-km scales with local precipitation and temperature input data between 1980 and 2004. We utilized the existing 20-km BTOP model, which is applied globally, and constructed a local-scale 0.5-km BTOP model for the Rhine River basin. Both the calibrated 20-km and 0.5-km BTOP models had similar statistical performance and represented observed flood river discharges, especially for the 1993 and 1995 floods. From the 20-km and 0.5-km BTOP simulations, the flood discharges of the selected return periods were estimated using flood frequency analysis and were comparable to the river gauging station data for the German part of the Rhine river basin. This is an important finding in that both the 0.5-km and 20-km BTOP models produce similar flood peak discharges, although the 0.5-km BTOP model results indicate the importance of scale in local flood hazard assessment. In summary, we highlight that this study serves as a demonstrative example of institutional collaboration and is a stepping stone for the next step of implementation of the IFI Flagship Project.

  4. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  5. (U) Ristra Next Generation Code Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungerford, Aimee L.; Daniel, David John

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  6. Evaluating performance of risk identification methods through a large-scale simulation of observational data.

    PubMed

    Ryan, Patrick B; Schuemie, Martijn J

    2013-10-01

    There has been only limited evaluation of statistical methods for identifying safety risks of drug exposure in observational healthcare data. Simulations can support empirical evaluation, but have not been shown to adequately model the real-world phenomena that challenge observational analyses. To design and evaluate a probabilistic framework (OSIM2) for generating simulated observational healthcare data, and to use these data for evaluating the performance of methods in identifying associations between drug exposure and health outcomes of interest. Seven observational designs, including case-control, cohort, self-controlled case series, and self-controlled cohort designs, were applied to 399 drug-outcome scenarios in 6 simulated datasets: one with no effect and five with injected relative risks of 1.25, 1.5, 2, 4, and 10, respectively. Longitudinal data for 10 million simulated patients were generated using a model derived from an administrative claims database, with associated demographics, periods of drug exposure derived from pharmacy dispensings, and medical conditions derived from diagnoses on medical claims. Simulation validation was performed through descriptive comparison with real source data. Method performance was evaluated using the area under the ROC curve (AUC), bias, and mean squared error. OSIM2 replicates the prevalence and types of confounding observed in real claims data. When simulated data are injected with relative risks (RR) ≥ 2, all designs have good predictive accuracy (AUC > 0.90), but when RR < 2, no method achieves fully accurate predictions. Each method exhibits a different bias profile, which changes with the effect size. OSIM2 can support methodological research. Results from the simulation suggest method operating characteristics are far from nominal properties.
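    The three performance measures named above can be computed along the following lines; the labels and effect estimates below are fabricated purely to show the mechanics, and scikit-learn's roc_auc_score is used for the AUC.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      # Fabricated example: true positive/negative drug-outcome pairs and the
      # effect estimates (log relative risks) a method produced for them.
      true_label = np.array([1, 1, 1, 0, 0, 0, 0, 1])        # 1 = injected signal
      estimated_log_rr = np.array([0.9, 0.4, 1.1, 0.1, -0.2, 0.3, 0.0, 0.7])
      true_log_rr = np.array([0.69, 0.69, 1.39, 0.0, 0.0, 0.0, 0.0, 0.69])

      auc = roc_auc_score(true_label, estimated_log_rr)       # discrimination
      bias = np.mean(estimated_log_rr - true_log_rr)          # average error
      mse = np.mean((estimated_log_rr - true_log_rr) ** 2)    # mean squared error
      print(f"AUC={auc:.2f}  bias={bias:.3f}  MSE={mse:.3f}")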

  7. Simulating forest fuel and fire risk dynamics across landscapes--LANDIS fuel module design

    Treesearch

    Hong S. He; Bo Z. Shang; Thomas R. Crow; Eric J. Gustafson; Stephen R. Shifley

    2004-01-01

    Understanding fuel dynamics over large spatial (10^3-10^6 ha) and temporal (10^1-10^3 years) scales is important in comprehensive wildfire management. We present a modeling approach to simulate fuel and fire risk dynamics as well as impacts of alternative fuel treatments. The...

  8. Evaluation of pre-existing antibody presence as a risk factor for posttreatment anti-drug antibody induction: analysis of human clinical study data for multiple biotherapeutics.

    PubMed

    Xue, Li; Rup, Bonita

    2013-07-01

    Biotherapeutic-reactive antibodies in treatment-naïve subjects (i.e., pre-existing antibodies) have been commonly detected during clinical immunogenicity assessments; however information on pre-existing antibody prevalence, physiological effects, and impact on posttreatment anti-drug antibody (ADA) induction remains limited. In this analysis, pre-existing antibody prevalence and impact on posttreatment ADA induction were determined using ADA data from 12 biotherapeutics analyzed in 32 clinical studies. Approximately half (58%) of the biotherapeutics were associated with some level of pre-existing antibodies and 67% of those were associated with posttreatment ADA induction. Across all studies, 5.6% of study subjects demonstrated presence of pre-existing antibodies, among which, 17% of the individual subjects had posttreatment increases in their ADA titers while 16% had decreased titers and 67% had no change in titers. However, in studies conducted in the rheumatoid arthritis (RA) population, 14.8% of RA patients were associated with pre-existing antibodies and 30% of those had posttreatment titer increases. The results suggest that in most study subjects, pre-existing antibodies pose a low risk for posttreatment ADA induction. That said, the high risk of induction implicated for RA patients, primarily observed in treatments evaluating novel antibody-based constructs, indicates that further understanding of the contribution of product and disease-specific factors is needed. Cross-industry efforts to collect and analyze a larger data set would enhance understanding of the prevalence, nature, and physiological consequences of pre-existing antibodies, better inform the immunogenicity risk profiles of products associated with these antibodies and lead to better fit-for-purpose immunogenicity management and mitigation strategies.

  9. Comparing listeriosis risks in at-risk populations using a user-friendly quantitative microbial risk assessment tool and epidemiological data.

    PubMed

    Falk, L E; Fader, K A; Cui, D S; Totton, S C; Fazil, A M; Lammerding, A M; Smith, B A

    2016-10-01

    Although infection by the pathogenic bacterium Listeria monocytogenes is relatively rare, consequences can be severe, with a high case-fatality rate in vulnerable populations. A quantitative, probabilistic risk assessment tool was developed to compare estimates of the number of invasive listeriosis cases in vulnerable Canadian subpopulations given consumption of contaminated ready-to-eat delicatessen meats and hot dogs, under various user-defined scenarios. The model incorporates variability and uncertainty through Monte Carlo simulation. Processes considered within the model include cross-contamination, growth, risk factor prevalence, subpopulation susceptibilities, and thermal inactivation. Hypothetical contamination events were simulated. Results demonstrated varying risk depending on the consumer risk factors and implicated product (turkey delicatessen meat without growth inhibitors ranked highest for this scenario). The majority (80%) of listeriosis cases were predicted in at-risk subpopulations comprising only 20% of the total Canadian population, with the greatest number of predicted cases in the subpopulation with dialysis and/or liver disease. This tool can be used to simulate conditions and outcomes under different scenarios, such as a contamination event and/or outbreak, to inform public health interventions.

  10. Mammographic density, breast cancer risk and risk prediction

    PubMed Central

    Vachon, Celine M; van Gils, Carla H; Sellers, Thomas A; Ghosh, Karthik; Pruthi, Sandhya; Brandt, Kathleen R; Pankratz, V Shane

    2007-01-01

    In this review, we examine the evidence for mammographic density as an independent risk factor for breast cancer, describe the risk prediction models that have incorporated density, and discuss the current and future implications of using mammographic density in clinical practice. Mammographic density is a consistent and strong risk factor for breast cancer in several populations and across age at mammogram. Recently, this risk factor has been added to existing breast cancer risk prediction models, increasing the discriminatory accuracy with its inclusion, albeit slightly. With validation, these models may replace the existing Gail model for clinical risk assessment. However, absolute risk estimates resulting from these improved models are still limited in their ability to characterize an individual's probability of developing cancer. Promising new measures of mammographic density, including volumetric density, which can be standardized using full-field digital mammography, will likely result in a stronger risk factor and improve accuracy of risk prediction models. PMID:18190724

  11. Advancing renal education: hybrid simulation, using simulated patients to enhance realism in haemodialysis education.

    PubMed

    Dunbar-Reid, Kylie; Sinclair, Peter M; Hudson, Denis

    2015-06-01

    Simulation is a well-established and proven teaching method, yet its use in renal education is not widely reported. Criticisms of simulation-based teaching include limited realism and a lack of authentic patient interaction. This paper discusses the benefits and challenges of high-fidelity simulation and suggests hybrid simulation as a complementary model to existing simulation programmes. Through the use of a simulated patient, hybrid simulation can improve the authenticity of renal simulation-based education while simultaneously teaching and assessing technologically enframed caring. © 2015 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  12. A review of risk management process in construction projects of developing countries

    NASA Astrophysics Data System (ADS)

    Bahamid, R. A.; Doh, S. I.

    2017-11-01

    In the construction industry, the risk management concept remains a less widely applied technique. There are three main stages in the systematic approach to risk management in the construction industry: a) risk response; b) risk analysis and evaluation; and c) risk identification. The high risk associated with the construction business affects all of its participants, while the operational analysis and management of construction-related risks remain an enormous task for practitioners in the industry. This paper reviews the existing literature on construction project risk management in developing countries, with a specific focus on the risk management process. The literature lacks a comprehensive risk management process approach capable of capturing the impact of risk on diverse project objectives. This review aims to discover the techniques most frequently used in risk identification and analysis, to clarify the different classifications of risk sources in the existing literature on developing countries, and to identify future research directions on project risk in construction in developing countries.

  13. Using an Integrated Hydrologic-Economic Model to Develop Minimum Cost Water Supply Portfolios and Manage Supply Risk

    NASA Astrophysics Data System (ADS)

    Characklis, G. W.; Ramsey, J.

    2004-12-01

    Water scarcity has become a reality in many areas as a result of population growth, fewer available sources, and reduced tolerance for the environmental impacts of developing the new supplies that do exist. As a result, successfully managing future water supply risk will become more dependent on coordinating the use of existing resources. Toward that end, flexible supply strategies that can rapidly respond to hydrologic variability will provide communities with increasing economic advantages, particularly if the frequency of more extreme events (e.g., drought) increases due to global climate change. Markets for established commodities (e.g., oil, gas) often provide a framework for efficiently responding to changes in supply and demand. Water markets, however, have remained relatively crude, with most transactions involving permanent transfers and long regulatory processes. Recently, interest in the use of flexible short-term transfers (e.g., leases, options) has begun to motivate consideration of more sophisticated strategies for managing supply risk, strategies similar to those used in more mature markets. In this case, communities can benefit from some of the advantages that water enjoys over other commodities, in particular, the ability to accurately characterize the stochastic nature of supply and demand through hydrologic modeling. Hydrologic-economic models are developed for two water-scarce regions supporting active water markets: the Edwards Aquifer and the Lower Rio Grande Valley. These models are used to construct portfolios of water supply transfers (e.g., permanent transfers, options, and spot leases) that minimize the cost of meeting a probabilistic reliability constraint. Real and simulated spot price distributions allow each type of transfer to be priced in a manner consistent with financial theory (e.g., Black-Scholes). Market simulations are integrated with hydrologic models such that variability in supply and demand is linked with price behavior. Decisions on when and how much water to lease (or exercise, in the case of options) are made on the basis of anticipatory rules based on the ratio of expected supply to expected demand, and are used to evaluate the economic consequences of a utility's attitude toward risk. The marginal cost of supply reliability is also explored by varying the water supply reliability constraint, an important consideration as the rising expense of new source development may encourage some communities to accept a nominal number of supply shortfalls. Results demonstrate how changes in the distribution of various transfer types within a portfolio can affect its cost and reliability. Results also suggest that substantial savings can be obtained through the use of market-based risk management strategies, with optimal portfolio costs averaging as much as 35 percent less than the costs of meeting reliability targets through the maintenance of firm capacity. Both the conceptual and modeling approach described in this work are likely to have increasing application as water scarcity continues to drive the search for more efficient approaches to water resource management.
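    Pricing an option-style water transfer "in a manner consistent with financial theory (e.g., Black-Scholes)" can be sketched with the standard Black-Scholes call formula, as below. Treating a water lease option this way, and every number used, is an illustrative assumption rather than the study's calibrated market simulation.

      from math import log, sqrt, exp
      from statistics import NormalDist

      def black_scholes_call(spot, strike, rate, volatility, maturity):
          """Black-Scholes price of a European call (no dividend or storage yield)."""
          n = NormalDist()
          d1 = (log(spot / strike) + (rate + 0.5 * volatility ** 2) * maturity) / (volatility * sqrt(maturity))
          d2 = d1 - volatility * sqrt(maturity)
          return spot * n.cdf(d1) - strike * exp(-rate * maturity) * n.cdf(d2)

      # Illustrative: spot lease price $60/acre-foot, option to lease at $65 in 6 months,
      # 5% risk-free rate, 40% annualized volatility of simulated spot prices.
      print(round(black_scholes_call(60.0, 65.0, 0.05, 0.40, 0.5), 2))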

  14. A systematic review on the influence of pre-existing disability on sustaining injury.

    PubMed

    Yung, A; Haagsma, J A; Polinder, S

    2014-01-01

    To systematically review studies measuring the influence of pre-existing disability on the risk of sustaining an injury. Systematic review. Electronic databases searched included Medline (Pubmed), ProQuest, Ovid and EMBASE. Studies (1990-2010) in international peer-reviewed journals were identified, with the main inclusion criterion being that the study assessed injuries sustained by persons with and without pre-existing disability. Studies were collated by design and methods, and evaluation of results. Twenty-two studies met the inclusion criteria of our review. All studies found that persons with disabilities were at a significantly higher risk of sustaining injuries than those without. Persons with disability had 30-450% increased odds (odds ratio 1.3-5.5) of sustaining injury compared with persons without disability. Among persons with pre-existing disability, the groups at highest risk of sustaining an injury are children and the elderly. People with disabilities are at higher risk of sustaining an injury than the general population. There is a strong need for large epidemiological studies of injury among persons with disability to better address these unique risk profiles, in order to prevent additional disability or secondary conditions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Management of Cold Water-induced Hypothermia: A Simulation Scenario for Layperson Training Delivered via a Mobile Tele-simulation Unit

    PubMed Central

    Parsons, Michael

    2017-01-01

    Newfoundland and Labrador (NL) has one of the highest provincial drowning rates in Canada, largely due to the many rural communities located near bodies of water. Factoring in the province’s cold climate (NL’s average freshwater temperature is below 5.4°C) and the prevalence of winter recreational activities among the population, there exists an inherent risk of ice-related injuries and subsequent hypothermia. Oftentimes, these injuries occur in remote/rural settings where immediate support from Emergency Medical Services (EMS) may not be available. During this critical period, it frequently falls on individuals without formal healthcare training to provide lifesaving measures until help arrives. Training individuals in rural communities therefore plays an important role in ensuring public safety. In recent years, simulation-based education has become an essential tool in medical, marine and first aid training. It provides learners with a safe environment to hone their skills and has been shown to be superior to traditional clinical teaching methods. The following case aims to train laypeople from rural settings in the immediate management of an individual who becomes hypothermic following immersion in cold water. However, reaching these individuals to provide training can be a challenge in a province with such a vast geography. To assist with overcoming this, a simulation center that is portable between communities (a Mobile Tele-Simulation Unit) has been developed. By utilizing modern technology, this paper also proposes an innovative method of connecting with learners in difficult-to-reach regions. PMID:29503784

  16. Simulation of a Start-Up Manufacturing Facility for Nanopore Arrays

    ERIC Educational Resources Information Center

    Field, Dennis W.

    2009-01-01

    Simulation is a powerful tool in developing and troubleshooting manufacturing processes, particularly when considering process flows for manufacturing systems that do not yet exist. Simulation can bridge the gap in terms of setting up full-scale manufacturing for nanotechnology products if limited production experience is an issue. An effective…

  17. Enhancing Students' Employability through Business Simulation

    ERIC Educational Resources Information Center

    Avramenko, Alex

    2012-01-01

    Purpose: The purpose of this paper is to introduce an approach to business simulation with less dependence on business simulation software to provide innovative work experience within a programme of study, to boost students' confidence and employability. Design/methodology/approach: The paper is based on analysis of existing business simulation…

  18. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

    A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  19. Maximum wind radius estimated by the 50 kt radius: improvement of storm surge forecasting over the western North Pacific

    NASA Astrophysics Data System (ADS)

    Takagi, Hiroshi; Wu, Wenjie

    2016-03-01

    Even though the maximum wind radius (Rmax) is an important parameter in determining the intensity and size of tropical cyclones, it has been overlooked in previous storm surge studies. This study reviews the existing estimation methods for Rmax based on central pressure or maximum wind speed. These over- or underestimate Rmax because of substantial variations in the data, although an average radius can be estimated with moderate accuracy. As an alternative, we propose an Rmax estimation method based on the radius of the 50 kt wind (R50). Data obtained by a meteorological station network in the Japanese archipelago during the passage of strong typhoons, together with the JMA typhoon best track data for 1990-2013, enabled us to derive the following simple equation, Rmax = 0.23 R50. Application to a recent strong typhoon, the 2015 Typhoon Goni, confirms that the equation provides a good estimation of Rmax, particularly when the central pressure became considerably low. Although this new method substantially improves the estimation of Rmax compared to the existing models, estimation errors are unavoidable because of fundamental uncertainties regarding the typhoon's structure or insufficient number of available typhoon data. In fact, a numerical simulation for the 2013 Typhoon Haiyan as well as 2015 Typhoon Goni demonstrates a substantial difference in the storm surge height for different Rmax. Therefore, the variability of Rmax should be taken into account in storm surge simulations (e.g., Rmax = 0.15 R50-0.35 R50), independently of the model used, to minimize the risk of over- or underestimating storm surges. The proposed method is expected to increase the predictability of major storm surges and to contribute to disaster risk management, particularly in the western North Pacific, including countries such as Japan, China, Taiwan, the Philippines, and Vietnam.

  20. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  1. Current Chemical Risk Management Activities

    EPA Pesticide Factsheets

    EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemicals substances in commercial use.

  2. A Framework for Assessing Uncertainty Associated with Human Health Risks from MSW Landfill Leachate Contamination.

    PubMed

    Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Singh, Jitendra

    2017-07-01

    Landfilling is a cost-effective method, which makes it a widely used practice around the world, especially in developing countries. However, because of the improper management of landfills, high leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty based on estimations of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long-term risks by using a Monte Carlo simulation framework for selected heavy metals. The Turbhe sanitary landfill of Navi Mumbai, India, which was commissioned in the recent past, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. In this article, an integral approach in the form of a framework has been proposed to quantify the uncertainty that is intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses. © 2016 Society for Risk Analysis.

  3. Serial Position and Isolation Effects in a Classroom Lecture Simulation

    ERIC Educational Resources Information Center

    Holen, Michael C.; Oaster, Thomas R.

    1976-01-01

    Provides evidence of the existence of serial position and isolation effects in a classroom lecture simulation involving extended meaningful discourse. Isolating an item facilitated learning of that item. (Author/DEP)

  4. Simulations of Emerging Magnetic Flux. II. The Formation of Unstable Coronal Flux Ropes and the Initiation of Coronal Mass Ejections

    NASA Technical Reports Server (NTRS)

    Leake, James E.; Linton, Mark G.; Antiochos, Spiro K.

    2014-01-01

    We present results from three-dimensional magnetohydrodynamic simulations of the emergence of a twisted convection zone flux tube into a pre-existing coronal dipole field. As in previous simulations, following the partial emergence of the sub-surface flux into the corona, a combination of vortical motions and internal magnetic reconnection forms a coronal flux rope. Then, in the simulations presented here, external reconnection between the emerging field and the pre-existing dipole coronal field allows further expansion of the coronal flux rope into the corona. After sufficient expansion, internal reconnection occurs beneath the coronal flux rope axis, and the flux rope erupts up to the top boundary of the simulation domain (approximately 36 Mm above the surface). We find that the presence of a pre-existing field, orientated in a direction to facilitate reconnection with the emerging field, is vital to the fast rise of the coronal flux rope. The simulations shown in this paper are able to self-consistently create many of the surface and coronal signatures used by coronal mass ejection (CME) models. These signatures include surface shearing and rotational motions, quadrupolar geometry above the surface, central sheared arcades reconnecting with oppositely orientated overlying dipole fields, the formation of coronal flux ropes underlying potential coronal field, and internal reconnection which resembles the classical flare reconnection scenario. This suggests that proposed mechanisms for the initiation of a CME, such as "magnetic breakout," are operating during the emergence of new active regions.

  5. Climate simulation and flood risk analysis for 2008-40 for Devils Lake, North Dakota

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2008-01-01

    Devils Lake and Stump Lake in northeastern North Dakota receive surface runoff from a 3,810-square-mile drainage basin, and evaporation provides the only major water loss unless the lakes are above their natural spill elevation to the Sheyenne River. In September 2007, flow from Devils Lake to Stump Lake had filled Stump Lake and the two lakes consisted of essentially one water body with an elevation of 1,447.1 feet, about 3 feet below the existing base flood elevation (1,450 feet) and about 12 feet below the natural outlet elevation to the Sheyenne River (1,459 feet). Devils Lake could continue to rise, causing extensive additional flood damages in the basin and, in the event of an uncontrolled natural spill, downstream in the Red River of the North Basin. This report describes the results of a study conducted by the U.S. Geological Survey, in cooperation with the Federal Emergency Management Agency, to evaluate future flood risk for Devils Lake and provide information for developing updated flood-insurance rate maps and planning flood-mitigation activities such as raising levees or roads. In about 1980, a large, abrupt, and highly significant increase in precipitation occurred in the Devils Lake Basin and elsewhere in the Northern Great Plains, and wetter-than-normal conditions have persisted through the present (2007). Although future precipitation is impossible to predict, paleoclimatic evidence and recent research on climate dynamics indicate the current wet conditions are not likely to end anytime soon. For example, there is about a 72-percent chance wet conditions will last at least 10 more years and about a 37-percent chance wet conditions will last at least 30 more years. A stochastic simulation model for Devils Lake and Stump Lake developed in a previous study was updated and used to generate 10,000 potential future realizations, or traces, of precipitation, evaporation, inflow, and lake levels given existing conditions on September 30, 2007, and randomly generated future duration of the current wet period. On the basis of the simulations, and assuming ice-free conditions and calm wind, the Devils Lake flood elevation for an annualized flood risk of 1 percent (analogous to a “100-year” riverine flood) was estimated to be 1,454.6 feet for a 10-year time horizon (2008–17). Therefore, without adjusting for wind or ice, a residence near Devils Lake at elevation 1,454.6 feet has the same chance of being flooded sometime during the next 10 years as a residence at the edge of the 100-year flood plain along a river. Adjusting for the effects of wind or ice, which will increase the flood elevations for many locations near the lakes, was not within the scope of this study.

  6. [Using sequential indicator simulation method to define risk areas of soil heavy metals in farmland].

    PubMed

    Yang, Hao; Song, Ying Qiang; Hu, Yue Ming; Chen, Fei Xiang; Zhang, Rui

    2018-05-01

    Heavy metals in soil have serious impacts on safety, the ecological environment, and human health because of their toxicity and accumulation. It is necessary to efficiently identify risk areas of heavy metals in farmland soil, which is of great significance for environmental protection, pollution warning, and farmland risk control. We collected 204 samples and analyzed the contents of seven heavy metals (Cu, Zn, Pb, Cd, Cr, As, Hg) in Zengcheng District of Guangzhou, China. To overcome problems in the data, including outliers, skewed distributions, and the smoothing effect of traditional kriging methods, we used the sequential indicator simulation method (SISIM) to map the spatial distribution of heavy metals, combined with the Hakanson index method to identify potential ecological risk areas of heavy metals in farmland. The results showed that: (1) With similar accuracy of spatial prediction of soil heavy metals, SISIM reproduced local detail better than ordinary kriging in small-scale areas. Compared with indicator kriging, SISIM had a lower error rate (4.9%-17.1%) in the uncertainty evaluation of heavy-metal risk identification. SISIM showed less smoothing and was more applicable to simulating the spatial uncertainty of soil heavy metals and identifying risk areas. (2) There was no pollution in Zengcheng's farmland. Moderate potential ecological risk was found in the southern part of the study area due to industrial production, human activities, and river sediments. This study combined sequential indicator simulation with the Hakanson risk index method and effectively overcame the information loss from outliers and the smoothing effect of traditional kriging methods. It provides a new way to identify soil heavy metal risk areas in farmland under uneven sampling.
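    The Hakanson potential ecological risk index referenced above is commonly computed as E_r^i = T_r^i * (C_i / C_n^i) per metal and RI = Σ_i E_r^i overall. The sketch below shows that arithmetic; the toxic-response factors follow Hakanson's commonly cited values, while the measured and background concentrations are invented for illustration.

      # Hakanson toxic-response factors (commonly cited values).
      TOXIC_RESPONSE = {"Cu": 5, "Zn": 1, "Pb": 5, "Cd": 30, "Cr": 2, "As": 10, "Hg": 40}

      def potential_ecological_risk(measured, background):
          """Per-metal risk factors E_r = T_r * C/C_n and the overall index RI = sum(E_r)."""
          e_r = {m: TOXIC_RESPONSE[m] * measured[m] / background[m] for m in measured}
          return e_r, sum(e_r.values())

      # Invented concentrations (mg/kg) for one sampling location.
      measured   = {"Cu": 35.0, "Zn": 90.0, "Pb": 40.0, "Cd": 0.20, "Cr": 60.0, "As": 9.0, "Hg": 0.15}
      background = {"Cu": 17.0, "Zn": 47.3, "Pb": 36.0, "Cd": 0.06, "Cr": 50.5, "As": 8.9, "Hg": 0.08}

      e_r, ri = potential_ecological_risk(measured, background)
      print(e_r, "RI =", round(ri, 1))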

  7. An economic analysis of poliovirus risk management policy options for 2013-2052.

    PubMed

    Duintjer Tebbens, Radboud J; Pallansch, Mark A; Cochi, Stephen L; Wassilak, Steven G F; Thompson, Kimberly M

    2015-09-24

    The Global Polio Eradication Initiative plans for coordinated cessation of oral poliovirus vaccine (OPV) after interrupting all wild poliovirus (WPV) transmission, but many questions remain related to long-term poliovirus risk management policies. We used an integrated dynamic poliovirus transmission and stochastic risk model to simulate possible futures and estimate the health and economic outcomes of maintaining the 2013 status quo of continued OPV use in most developing countries compared with OPV cessation policies with various assumptions about global inactivated poliovirus vaccine (IPV) adoption. Continued OPV use after global WPV eradication leads to continued high costs and/or high cases. Global OPV cessation comes with a high probability of at least one outbreak, which aggressive outbreak response can successfully control in most instances. A low but non-zero probability exists of uncontrolled outbreaks following a poliovirus reintroduction long after OPV cessation in a population in which IPV-alone cannot prevent poliovirus transmission. We estimate global incremental net benefits during 2013-2052 of approximately $16 billion (US$2013) for OPV cessation with at least one IPV routine immunization dose in all countries until 2024 compared to continued OPV use, although significant uncertainty remains associated with the frequency of exportations between populations and the implementation of long term risk management policies. Global OPV cessation offers the possibility of large future health and economic benefits compared to continued OPV use. Long-term poliovirus risk management interventions matter (e.g., IPV use duration, outbreak response, containment, continued surveillance, stockpile size and contents, vaccine production site requirements, potential antiviral drugs, and potential safer vaccines) and require careful consideration. Risk management activities can help to ensure a low risk of uncontrolled outbreaks and preserve or further increase the positive net benefits of OPV cessation. Important uncertainties will require more research, including characterizing immunodeficient long-term poliovirus excretor risks, containment risks, and the kinetics of outbreaks and response in an unprecedented world without widespread live poliovirus exposure.

  8. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes

    PubMed Central

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    Background A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities’ preparedness and response capabilities and to mitigate future consequences. Methods An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model’s algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. Results The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. Conclusion The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to an occurrence of an earthquake could lead to a possible decrease in the expected number of casualties. PMID:26959647
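
    One reading of the Methods is that baseline HAZUS-style casualty rates are shifted on the logit scale by the pooled effect measures (odds ratios) of the human-related factors. The sketch below illustrates that mechanism under stated assumptions: the odds ratios, the baseline rate, and the helper name are hypothetical and are not the meta-analysis estimates used by the authors.

```python
import math

# Hypothetical odds ratios for human-related risk factors (illustrative only;
# not the meta-analysis estimates integrated by the authors).
ODDS_RATIOS = {"age_over_65": 1.8, "female": 1.2,
               "physical_disability": 2.0, "low_socioeconomic_status": 1.5}

def adjusted_casualty_probability(baseline_p: float, factors: dict) -> float:
    """Shift a baseline casualty rate on the logit scale by the ORs present."""
    log_odds = math.log(baseline_p / (1.0 - baseline_p))
    for name, present in factors.items():
        if present:
            log_odds += math.log(ODDS_RATIOS[name])
    return 1.0 / (1.0 + math.exp(-log_odds))

# Example: baseline rate of 2% for a given building damage state.
print(adjusted_casualty_probability(
    0.02, {"age_over_65": True, "female": False,
           "physical_disability": False, "low_socioeconomic_status": True}))
```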

  9. Computational Assessment of a 3-Stage Axial Compressor Which Provides Airflow to the NASA 11- by 11-Foot Transonic Wind Tunnel, Including Design Changes for Increased Performance

    NASA Technical Reports Server (NTRS)

    Kulkarni, Sameer; Beach, Timothy A.; Jorgenson, Philip C.; Veres, Joseph P.

    2017-01-01

    A 24-foot-diameter 3-stage axial compressor powered by variable-speed induction motors provides the airflow in the closed-return 11- by 11-Foot Transonic Wind Tunnel (11-Foot TWT) Facility at NASA Ames Research Center at Moffett Field, California. The facility is part of the Unitary Plan Wind Tunnel, which was completed in 1955. Since then, upgrades made to the 11-Foot TWT such as flow conditioning devices and instrumentation have increased blockage and pressure loss in the tunnel, somewhat reducing the peak Mach number capability of the test section. Due to erosion effects on the existing aluminum alloy rotor blades, fabrication of new steel rotor blades is planned. This presents an opportunity to increase the Mach number capability of the tunnel by redesigning the compressor for increased pressure ratio. Challenging design constraints exist for any proposed design, demanding the use of the existing driveline, rotor disks, stator vanes, and hub and casing flow paths, so as to minimize cost and installation time. The current effort was undertaken to characterize the performance of the existing compressor design using available design tools and computational fluid dynamics (CFD) codes and subsequently recommend a new compressor design to achieve higher pressure ratio, which directly correlates with increased test section Mach number. The constant cross-sectional area of the compressor leads to high diffusion factors, which presents a challenge in simulating the existing design. The CFD code APNASA was used to simulate the aerodynamic performance of the existing compressor. The simulations were compared to performance predictions from the HT0300 turbomachinery design and analysis code, and to compressor performance data taken during a 1997 facility test. It was found that the CFD simulations were sensitive to endwall leakages associated with stator buttons, and to a lesser degree, under-stator-platform flow recirculation at the hub. When stator button leakages were modeled, pumping capability increased by over 20 percent of pressure rise at the design point due to a large reduction in aerodynamic blockage at the hub. Incorporating the stator button leakages was crucial to matching test data. Under-stator-platform flow recirculation was thought to be large due to a lack of seals. The effect of this recirculation was assessed with APNASA simulations recirculating 0.5, 1, and 2 percent of inlet flow about stators 1 and 2, modeled as axisymmetric mass flux boundary conditions on the hub before and after the vanes. The injection of flow ahead of the stators tended to re-energize the boundary layer and reduce hub separations, resulting in about 3 percent increased stall margin per 1 percent of inlet flow recirculated. In order to assess the value of the flow recirculation, a mixing plane simulation of the compressor which gridded the under-stator cavities was generated using the ADPAC CFD code. This simulation indicated that about 0.65 percent of the inlet flow is recirculated around each shrouded stator. This collective information was applied during the redesign of the compressor. A potential design was identified using HT0300 which improved overall pressure ratio by removing pre-swirl into rotor 1, replacing existing NASA 65 series rotors with double circular arc sections, and re-staggering rotors and the existing stators. The performance of the new design predicted by APNASA and HT0300 is compared to the existing design.
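
    For context on the high diffusion factors noted above, a common blade-loading measure in axial-compressor design is the Lieblein diffusion factor, which combines the velocity ratio across a blade row with the turning and the row solidity. The sketch below is a generic illustration with made-up numbers; it is not data from, or the design method used for, the 11-Foot TWT compressor.

```python
def lieblein_diffusion_factor(v1: float, v2: float, dv_theta: float,
                              solidity: float) -> float:
    """D = 1 - V2/V1 + |dV_theta| / (2 * solidity * V1)."""
    return 1.0 - v2 / v1 + abs(dv_theta) / (2.0 * solidity * v1)

# Illustrative numbers only (velocities in m/s, solidity chosen arbitrarily).
print(lieblein_diffusion_factor(v1=200.0, v2=150.0, dv_theta=90.0, solidity=1.2))
# -> 0.4375; values approaching ~0.6 are conventionally taken as heavily loaded
```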

  10. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  11. Bandit: Technologies for Proximity Operations of Teams of Sub-10Kg Spacecraft

    DTIC Science & Technology

    2007-10-16

    and adding a dedicated overhead camera system. As will be explained below, the forced-air system did not work and the existing system has proven too...erratic to justify the expense of the camera system. 6DOF Software Simulator. The existing Java-based graphical 6DOF simulator was to be improved for...proposed camera system for a nonfunctional table. The C-9 final report is enclosed. Figure 1. Forced-air table schematic

  12. IFEQ, app developed based on the J-SHIS system

    NASA Astrophysics Data System (ADS)

    Azuma, H.; Hao, K. X.; Fujiwara, H.

    2015-12-01

    Raising awareness of earthquake disaster prevention is an important issue in Japan. To that end, we have developed a smartphone app, IFEQ, based on the APIs provided by J-SHIS, an integrated system for seismic hazard assessment. IFEQ can simulate a realistic earthquake-disaster situation that the user might experience at the current spot. The idea for IFEQ came from the question "What should I do IF a big EarthQuake hit now?" The earthquake risk situation is estimated from location information acquired by GPS and from the detailed, comprehensive 250 m mesh information obtained through the J-SHIS APIs, namely the geomorphological classification and the probability of JMA seismic intensity 6-lower within 30 years. With simply one touch, the user's photo is displayed together with the surrounding risk situation. IFEQ helps users bridge the gap between everyday scenes and a severe disaster by stimulating their imagination. The results show that people have more ideas for handling the risk situation after using the IFEQ app. IFEQ's features are summarized as below: 1. Visualizing an image photo with possible risks from a coming earthquake at the present spot. 2. Displaying the exceedance probability within 30 years and the maximum seismic intensity within 10,000 years for the present location. 3. Providing advice on how to prepare for possible risks. 4. Rating risk on five ranks, especially for Building Collapse, Liquefaction and Landslide. IFEQ can be downloaded freely from http://www.j-shis.bosai.go.jp/app-ifearthquake J-SHIS APIs can be obtained from http://www.j-shis.bosai.go.jp/en/category/opencat/api

  13. A probabilistic storm surge risk model for the German North Sea and Baltic Sea coast

    NASA Astrophysics Data System (ADS)

    Grabbert, Jan-Henrik; Reiner, Andreas; Deepen, Jan; Rodda, Harvey; Mai, Stephan; Pfeifer, Dietmar

    2010-05-01

    The German North Sea coast is highly exposed to storm surges. Due to its concave, bay-like shape oriented mainly to the north-west, cyclones from western, north-western and northern directions together with the astronomical tide cause storm surges that accumulate water in the German Bight. Because widespread low-lying areas (below 5 m above mean sea level) lie behind the defenses, large areas containing large economic values are exposed to coastal flooding, including cities such as Hamburg and Bremen. The occurrence of extreme storm surges in the past, e.g. in 1962, which took about 300 lives and caused widespread flooding, and in 1976, raised awareness and led to a redesign of the coastal defenses, which provide a good level of protection under today's conditions. Nevertheless, the risk of flooding exists. Moreover, an amplification of storm surge risk can be expected under the influence of climate change. The Baltic Sea coast is also exposed to storm surges, which are caused by other meteorological patterns. The influence of the astronomical tide is quite low; instead, high water levels are induced by strong winds alone. Since the exceptional extreme event in 1872, storm surge hazard has been more or less forgotten. Although such an event is very unlikely to happen, it is not impossible. Storm surge risk is currently (almost) non-insurable in Germany. The potential risk is difficult to quantify as there are almost no historical losses available. Premiums are also difficult to assess. Therefore a new storm surge risk model is being developed to provide a basis for a probabilistic quantification of potential losses from coastal inundation. The model is funded by the GDV (German Insurance Association) and is planned to be used within the German insurance sector. Results might be used for a discussion of insurance cover for storm surge. The model consists of a probabilistic, event-driven hazard module and a vulnerability module, together with an exposure interface and a financial module to account for specific (re-)insurance conditions. This contribution concentrates mainly on the hazard module. The hazard is covered by an event simulation engine enabling Monte Carlo simulations. The event generation is done on-the-fly. A classification of historical storm surges is used, based on observed sea water levels at gauging stations and extended literature research. To characterize the origin of storm events and the storm surges they cause, meteorological parameters such as wind speed and wind direction are also used. Where high water levels along the coast are mainly caused by strong wind from particular directions, as observed at the North Sea, there is a clear empirical relationship between wind and surge (where surge is defined as the wind-driven component of the sea water level) which can be described by the ATWS (Average Transformed Wind Speed). The parameters forming the load at the coastal defense elements are the water level and wave parameters such as significant wave height, wave period and wave direction. To assess the wave characteristics at the coast, the numerical model SWAN (Simulating Waves Near Shore) from TU Delft has been used. To account for different probabilities of failure and inundation, the coast is split into segments with similar defense characteristics such as type of defense, height, width, orientation and others. The chosen approach covers the most relevant failure mechanisms for coastal dikes induced by wave overtopping and overflow. Dune failure is also considered in the model.
Inundation of the hinterland after defense failure is modeled using a simple dynamic 2D approach, resulting in distributed water depths and flood outlines for each segment. Losses can be estimated, depending on the input exposure data, either coordinate-based for single buildings or aggregated at postal code level, using a set of depth-damage functions.
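
    A minimal sketch of the kind of Monte Carlo event loop described for the hazard module is given below: storm winds are sampled, a surge is derived from an empirical wind-surge relation, and exceedance of a defence crest is counted. The wind distribution, the surge regression, and the failure rule are all illustrative assumptions, not the calibrated ATWS relationship or the segment-specific failure mechanisms of the model.

```python
import random

def surge_from_wind(wind_speed_ms: float, a: float = 0.02, b: float = -0.5) -> float:
    """Hypothetical surge [m] from a transformed wind speed (illustrative)."""
    return max(0.0, a * wind_speed_ms**1.5 + b)

def exceedance_fraction(n_events: int = 100_000, tide_hw: float = 1.5,
                        crest_height: float = 7.5, wave_runup: float = 1.0) -> float:
    """Fraction of synthetic events whose load exceeds the defence crest."""
    random.seed(0)
    failures = 0
    for _ in range(n_events):
        wind = 2.5 * random.weibullvariate(9.0, 2.0)     # synthetic storm wind [m/s]
        water_level = tide_hw + surge_from_wind(wind)    # still-water level [m]
        if water_level + wave_runup > crest_height:      # overtopping/overflow proxy
            failures += 1
    return failures / n_events

print(f"fraction of events exceeding the crest: {exceedance_fraction():.4f}")
```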

  14. Tsunami risk mapping simulation for Malaysia

    USGS Publications Warehouse

    Teh, S.Y.; Koh, H. L.; Moh, Y.T.; De Angelis, D. L.; Jiang, J.

    2011-01-01

    The 26 December 2004 Andaman mega tsunami killed about a quarter of a million people worldwide. Since then several significant tsunamis have recurred in this region, including the most recent 25 October 2010 Mentawai tsunami. These tsunamis grimly remind us of the devastating destruction that a tsunami might inflict on the affected coastal communities. There is evidence that tsunamis of similar or higher magnitudes might occur again in the near future in this region. Of particular concern to Malaysia are tsunamigenic earthquakes occurring along the northern part of the Sunda Trench. Further, the Manila Trench in the South China Sea has been identified as another source of potential tsunamigenic earthquakes that might trigger large tsunamis. To protect coastal communities that might be affected by future tsunamis, an effective early warning system must be properly installed and maintained to provide adequate time for residents to be evacuated from risk zones. Affected communities must be prepared and educated in advance regarding tsunami risk zones, evacuation routes, and an effective evacuation procedure to be followed during a tsunami occurrence. For these purposes, tsunami risk zones must be identified and classified according to the levels of risk simulated. This paper presents an analysis of tsunami simulations for the South China Sea and the Andaman Sea for the purpose of developing a tsunami risk zone classification map for Malaysia based upon simulated maximum wave heights. © 2011 WIT Press.
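
    A risk-zone classification based on simulated maximum wave heights can be as simple as a threshold lookup, as sketched below. The thresholds and site values are hypothetical and are not those adopted for the Malaysian classification map.

```python
def risk_zone(max_wave_height_m: float) -> str:
    """Classify a coastal cell by simulated maximum wave height (assumed thresholds)."""
    if max_wave_height_m >= 3.0:
        return "high"
    if max_wave_height_m >= 1.0:
        return "moderate"
    if max_wave_height_m >= 0.3:
        return "low"
    return "negligible"

# Purely illustrative simulated heights for three coastal locations.
for site, h in {"Site A": 2.4, "Site B": 3.6, "Site C": 0.8}.items():
    print(site, risk_zone(h))
```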

  15. Wildfire risk assessment in a typical Mediterranean wildland-urban interface of Greece.

    PubMed

    Mitsopoulos, Ioannis; Mallinis, Giorgos; Arianoutsou, Margarita

    2015-04-01

    The purpose of this study was to assess spatial wildfire risk in a typical Mediterranean wildland-urban interface (WUI) in Greece and the potential effect of three different burning condition scenarios on the following four major wildfire risk components: burn probability, conditional flame length, fire size, and source-sink ratio. We applied the Minimum Travel Time fire simulation algorithm using the FlamMap and ArcFuels tools to characterize the potential response of the wildfire risk to a range of different burning scenarios. We created site-specific fuel models of the study area by measuring the field fuel parameters in representative natural fuel complexes, and we determined the spatial extent of the different fuel types and residential structures in the study area using photointerpretation procedures of large-scale natural color orthophotographs. The results included simulated spatially explicit fire risk components along with wildfire risk exposure analysis and the expected net value change. Statistically significant differences in simulation outputs between the scenarios were obtained using Tukey's significance test. The results of this study provide valuable information for decision support systems for short-term predictions of wildfire risk potential and inform wildland fire management of typical WUI areas in Greece.
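
    The expected net value change mentioned above is commonly computed by weighting response functions (fractional loss or benefit per flame-length class) with the corresponding burn probabilities. The sketch below illustrates that calculation with made-up numbers; it is not output from the FlamMap/ArcFuels analysis described in the study.

```python
# Burn probabilities by flame-length class (per pixel) and response functions
# (fractional change in value) — illustrative numbers, not study output.
burn_probability  = [0.004, 0.003, 0.002, 0.001]   # classes 0-1 m, 1-2 m, 2-3 m, >3 m
response_function = [-0.05, -0.20, -0.50, -0.90]

def expected_nvc(value_at_risk: float) -> float:
    """Expected net value change = sum over classes of BP * RF * value."""
    return value_at_risk * sum(bp * rf for bp, rf in
                               zip(burn_probability, response_function))

print(f"eNVC for a structure valued at 100,000: {expected_nvc(100_000):.0f}")
```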

  16. Wildfire Risk Assessment in a Typical Mediterranean Wildland-Urban Interface of Greece

    NASA Astrophysics Data System (ADS)

    Mitsopoulos, Ioannis; Mallinis, Giorgos; Arianoutsou, Margarita

    2015-04-01

    The purpose of this study was to assess spatial wildfire risk in a typical Mediterranean wildland-urban interface (WUI) in Greece and the potential effect of three different burning condition scenarios on the following four major wildfire risk components: burn probability, conditional flame length, fire size, and source-sink ratio. We applied the Minimum Travel Time fire simulation algorithm using the FlamMap and ArcFuels tools to characterize the potential response of the wildfire risk to a range of different burning scenarios. We created site-specific fuel models of the study area by measuring the field fuel parameters in representative natural fuel complexes, and we determined the spatial extent of the different fuel types and residential structures in the study area using photointerpretation procedures of large-scale natural color orthophotographs. The results included simulated spatially explicit fire risk components along with wildfire risk exposure analysis and the expected net value change. Statistically significant differences in simulation outputs between the scenarios were obtained using Tukey's significance test. The results of this study provide valuable information for decision support systems for short-term predictions of wildfire risk potential and inform wildland fire management of typical WUI areas in Greece.

  17. The use of psychiatry-focused simulation in undergraduate nursing education: A systematic search and review.

    PubMed

    Vandyk, Amanda D; Lalonde, Michelle; Merali, Sabrina; Wright, Erica; Bajnok, Irmajean; Davies, Barbara

    2018-04-01

    Evidence on the use of simulation to teach psychiatry and mental health (including addiction) content is emerging, yet no summary of the implementation processes or associated outcomes exists. The aim of this study was to systematically search and review empirical literature on the use of psychiatry-focused simulation in undergraduate nursing education. Objectives were to (i) assess the methodological quality of existing evidence on the use of simulation to teach mental health content to undergraduate nursing students, (ii) describe the operationalization of the simulations, and (iii) summarize the associated quantitative and qualitative outcomes. We conducted online database (MEDLINE, Embase, ERIC, CINAHL, PsycINFO from January 2004 to October 2015) and grey literature searches. Thirty-two simulation studies were identified describing and evaluating six types of simulations (standardized patients, audio simulations, high-fidelity simulators, virtual world, multimodal, and tabletop). Overall, 2724 participants were included in the studies. Studies reflected a limited number of intervention designs, and outcomes were evaluated with qualitative and quantitative methods incorporating a variety of tools. Results indicated that simulation was effective in reducing student anxiety and improving their knowledge, empathy, communication, and confidence. The summarized qualitative findings all supported the benefit of simulation; however, more research is needed to assess the comparative effectiveness of the types of simulations. Recommendations from the findings include the development of guidelines for educators to deliver each simulation component (briefing, active simulation, debriefing). Finally, consensus around appropriate training of facilitators is needed, as is consistent and agreed upon simulation terminology. © 2017 Australian College of Mental Health Nurses Inc.

  18. 75 FR 55619 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-13

    ... Numerical Simulations Risk Management Methodology September 7, 2010. Pursuant to Section 19(b)(1) of the... for incorporation in the System for Theoretical Analysis and Numerical Simulations (``STANS'') risk... ETFs in the STANS margin calculation process. When OCC began including common stock and ETFs in...

  19. A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models.

    EPA Science Inventory

    Background Exposure measurement error in copollutant epidemiologic models has the potential to introduce bias in relative risk (RR) estimates. A simulation study was conducted using empirical data to quantify the impact of correlated measurement errors in time-series analyses of a...

  20. Predictive genetic testing for the identification of high-risk groups: a simulation study on the impact of predictive ability

    PubMed Central

    2011-01-01

    Background Genetic risk models could potentially be useful in identifying high-risk groups for the prevention of complex diseases. We investigated the performance of this risk stratification strategy by examining epidemiological parameters that impact the predictive ability of risk models. Methods We assessed sensitivity, specificity, and positive and negative predictive value for all possible risk thresholds that can define high-risk groups and investigated how these measures depend on the frequency of disease in the population, the frequency of the high-risk group, and the discriminative accuracy of the risk model, as assessed by the area under the receiver-operating characteristic curve (AUC). In a simulation study, we modeled genetic risk scores of 50 genes with equal odds ratios and genotype frequencies, and varied the odds ratios and the disease frequency across scenarios. We also performed a simulation of age-related macular degeneration risk prediction based on published odds ratios and frequencies for six genetic risk variants. Results We show that when the frequency of the high-risk group was lower than the disease frequency, positive predictive value increased with the AUC but sensitivity remained low. When the frequency of the high-risk group was higher than the disease frequency, sensitivity was high but positive predictive value remained low. When both frequencies were equal, both positive predictive value and sensitivity increased with increasing AUC, but higher AUC was needed to maximize both measures. Conclusions The performance of risk stratification is strongly determined by the frequency of the high-risk group relative to the frequency of disease in the population. The identification of high-risk groups with appreciable combinations of sensitivity and positive predictive value requires higher AUC. PMID:21797996
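
    A minimal re-creation of the simulation design can make the Results concrete: simulate allele counts for 50 variants, draw disease status from a logistic model, and evaluate sensitivity and positive predictive value for a threshold defining the high-risk group. All parameters below are illustrative assumptions, not the published scenarios.

```python
import math
import random

# Hedged sketch: 50 variants with equal per-allele effects; disease drawn from
# a logistic model; sensitivity and PPV for the top 10% defined as "high risk".
random.seed(1)
N, N_GENES = 50_000, 50
OR_PER_ALLELE, RISK_ALLELE_FREQ = 1.3, 0.3
DISEASE_FREQ, HIGH_RISK_FRACTION = 0.10, 0.10

log_or = math.log(OR_PER_ALLELE)
scores = [sum(random.random() < RISK_ALLELE_FREQ for _ in range(2 * N_GENES))
          for _ in range(N)]
mean_score = sum(scores) / N
intercept = math.log(DISEASE_FREQ / (1 - DISEASE_FREQ))  # approximate centring

disease = []
for s in scores:
    p = 1.0 / (1.0 + math.exp(-(intercept + log_or * (s - mean_score))))
    disease.append(random.random() < p)

threshold = sorted(scores, reverse=True)[int(HIGH_RISK_FRACTION * N)]
high_risk = [s >= threshold for s in scores]
true_pos = sum(h and d for h, d in zip(high_risk, disease))
print(f"sensitivity = {true_pos / sum(disease):.2f}")
print(f"PPV         = {true_pos / sum(high_risk):.2f}")
```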

  1. Innovative neuro-fuzzy system of smart transport infrastructure for road traffic safety

    NASA Astrophysics Data System (ADS)

    Beinarovica, Anna; Gorobetz, Mikhail; Levchenkov, Anatoly

    2017-09-01

    The proposed study describes the application of neural networks and fuzzy logic in transport control to improve safety by evaluating accident risk with intelligent infrastructure devices. Risk evaluation is made against multiple criteria: danger, changeability, and the influence of changes on risk increase. Neuro-fuzzy algorithms are described and proposed for the task. The novelty of the proposed system is demonstrated by a deep analysis of known studies in the field. The structure of the neuro-fuzzy system for risk evaluation and its mathematical model are described in the paper. A simulation model of the intelligent devices for transport infrastructure is proposed to simulate different situations, assess the risks and propose possible actions for infrastructure or vehicles to minimize the risk of possible accidents.

  2. Estimating the risks for adverse effects of total phosphorus in receiving streams with the Stochastic Empirical Loading and Dilution Model (SELDM)

    USGS Publications Warehouse

    Granato, Gregory E.; Jones, Susan C.

    2015-01-01

    Results of this study indicate the potential benefits of the multi-decade simulations that SELDM provides, because these simulations quantify risks and uncertainties that affect decisions made with available data and statistics. Results of the SELDM simulations indicate that the WQABI criteria concentrations may be too stringent for evaluating the stormwater quality in receiving streams, highway runoff, and BMP discharges, especially given the substantial uncertainties inherent in selecting representative data.

  3. Adding the Human Element to Ship Manoeuvring Simulations

    NASA Astrophysics Data System (ADS)

    Aarsæther, Karl Gunnar; Moan, Torgeir

    Time-domain simulation of ship manoeuvring has been utilized in risk analysis to assess the effect of changes to the ship-lane, developments in traffic volume, and the associated risk. The process of ship manoeuvring in a wider socio-technical context consists of the technical systems, operational procedures, the human operators and support functions. Automated manoeuvring simulations without human operators in the simulation loop have often been preferred in simulation studies due to the low time required for simulations. Automatic control has represented the human element, with little effort devoted to explaining the relationship between the guidance and control algorithms and the human operators they replace. This paper describes the development and application of a model of the human element for autonomous time-domain manoeuvring simulations. The method is applicable in the time domain, is modular, and is found capable of reproducing observed manoeuvre patterns, but is limited to representing intended behaviour.

  4. Defining Sexuality among Female Black Inner-City Young Adolescents.

    ERIC Educational Resources Information Center

    Gershenson, Harold P.; Handler, Arden

    Adolescents are able to respond correctly to questions about pregnancy risk and contraceptive use, yet still engage in risk-taking behavior. One explanation for this phenomenon may be the existence of a personal fable. To explore the existence of the personal fable in inner-city female adolescents, 22 eighth grade black females in Chicago…

  5. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    Increasing ship traffic and the maritime transport of dangerous substances make it harder to significantly reduce the environmental, economic and social risks posed by potential spills, even though security rules are becoming more restrictive (double-hull ships, etc.) and surveillance systems more developed (VTS, AIS). In fact, the problems associated with spills are and will always be a main topic: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, while some (in a much smaller number) become genuine media phenomena in this information era, due to their large dimensions and environmental and socio-economic impacts on ecosystems and local communities, and also due to the spectacular or shocking pictures generated. Hence, the adverse consequences posed by these types of accidents increase the pressure to avoid them in the future, or to minimize their impacts, not only by using surveillance and monitoring tools, but also by increasing the capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following the accident - numerical models can now have a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis of oil, inert (ship debris or floating containers), and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency or planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies or risk maps, using historical data, reference situations, and typical scenarios. After a spill, fast and simple modelling applications allow the fate and behaviour of the spilt substances to be understood, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used to detect pollution sources. However, the resources involved, and the scientific and technological levels needed to manipulate numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can together provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of which give the end user control over the model simulations.
Parameters such as the date and time of the event, the location and the oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorology, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be available to the simulators. The simulation tools developed can also import initial data from and export results to remote servers, using OGC WFS services. Simulations are delivered to the end user in a matter of seconds and can thus be very useful in emergency situations. The backtracking modelling feature and the possibility of importing spill locations from remote servers with observed data (for example, from flight surveillance or remote sensing) allow the potential application to the evaluation of possible contamination sources. The third tool developed is an innovative system that dynamically produces quantified risk levels in real time, integrating the best available information from numerical forecasts and existing monitoring tools. This system provides coastal pollution risk levels associated with potential (or real) oil spill incidents, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes (determined in the EROCIPS project), real-time vessel information (position, cargo type, speed and vessel type) obtained from AIS, the best available metocean numerical forecasts (hydrodynamics, meteorology - including visibility and wave conditions) and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System (www.mohid.com). Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on varying vessel information and metocean conditions, and results from these simulations are used in quantifying the consequences of potential spills. The Dynamic Risk Tool was not designed to replace conventional mapping tools, but to complement that type of information with an innovative approach to risk mapping. Taking advantage of the interoperability between forecasting models, oil spill simulations, AIS monitoring systems, statistical data and coastal vulnerability, this software can provide end users with real-time risk levels, allowing an innovative approach to risk mapping, providing decision-makers with an improved decision-support model and also intelligent risk-based traffic monitoring. For instance, this tool allows the prioritisation of individual ships and geographical areas, and facilitates strategic and dynamic tug positioning. As mentioned, the risk levels are generated in real time, and the historical results are kept in a database, allowing later risk analysis or compilations for specific seasons or regions, in order to obtain typical risk maps, etc. The integration of metocean modelling results (instead of using typical static scenarios), as well as continuous background oil spill modelling, provides a more realistic approach to the estimation of risk levels - the metocean conditions and oil spill behaviour are always different and specific, and it is virtually impossible to define those conditions in advance even if several thousand static scenarios were considered beforehand. This system was initially implemented in Portugal (ARCOPOL project) for oil spills.
The implementation in different regions of the Atlantic and the adaptation to chemical spills will be carried out within the scope of the ARCOPOL+ project. The numerical model used for computing the fate and behaviour of spilled substances in all the tools developed (the MOHID lagrangian & oil spill model from the MOHID Water Modelling System) was also the subject of several adaptations and updates to increase its suitability for the developed tools - horizontal velocity due to Stokes drift, vertical movement of oil substances, modelling of floating containers, backtracking modelling and a multi-solution approach (generating the computational grid on-the-fly and using the information from the multiple metocean forecasting solutions available) are some of the main features recently implemented. The main purpose of these software tools is to reduce the gap between decision-makers and scientific modellers - although the correct analysis of model results usually requires a specialist, an operational model user should not lose most of their time converting and interpolating metocean results, preparing input data files, running models and post-processing results, rather than analysing results and producing different scenarios. Harmonization and standardization in the dissemination of numerical model outputs is a strategic effort for the modelling scientific community, because it facilitates the application of model results in decision-support tools like the ones presented here.
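
    A minimal sketch of how such a dynamic, per-vessel risk level might be composed is given below: an accident probability modulated by vessel type and metocean conditions is multiplied by a consequence term combining simulated shoreline oiling with coastal sensitivity. All rates, multipliers and units are hypothetical and are not the EROCIPS/ARCOPOL parameterization.

```python
# Base accident rates per transit and condition multipliers — all hypothetical.
BASE_ACCIDENT_RATE = {"tanker": 1e-5, "cargo": 2e-5, "passenger": 1e-5}

def accident_probability(vessel_type: str, visibility_km: float,
                         wave_height_m: float) -> float:
    p = BASE_ACCIDENT_RATE[vessel_type]
    if visibility_km < 2.0:
        p *= 3.0                      # assumed poor-visibility multiplier
    if wave_height_m > 4.0:
        p *= 2.0                      # assumed heavy-seas multiplier
    return p

def consequence(oiled_shoreline_km: float, mean_sensitivity_index: float) -> float:
    """Relative consequence from a background spill simulation and coastal sensitivity."""
    return oiled_shoreline_km * mean_sensitivity_index

def risk_level(vessel_type: str, visibility_km: float, wave_height_m: float,
               oiled_shoreline_km: float, mean_sensitivity_index: float) -> float:
    return (accident_probability(vessel_type, visibility_km, wave_height_m)
            * consequence(oiled_shoreline_km, mean_sensitivity_index))

print(risk_level("tanker", visibility_km=1.0, wave_height_m=5.0,
                 oiled_shoreline_km=12.0, mean_sensitivity_index=0.7))
```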

  6. Increased Susceptibility to Chemical Toxicity with (Pre-existing ...

    EPA Pesticide Factsheets

    Numerous host and environmental factors may modulate vulnerability and risk. An area of increasing interest to risk assessors is the potential for chemicals to interact with pre-existing diseases and aging that may yield cumulative damage, altered chemical response, and increased disease susceptibility. We evaluated the relationships between chemicals and pre-existing disease and identified the type of information needed to evaluate the relationships of interest. This is for presentation at the 54th Society of Toxicology Annual Meeting and ToxExpo 2015.

  7. Modelling and observing the role of wind in Anopheles population dynamics around a reservoir.

    PubMed

    Endo, Noriko; Eltahir, Elfatih A B

    2018-01-25

    Wind conditions, as well as other environmental conditions, are likely to influence malaria transmission through the behaviours of Anopheles mosquitoes, especially around water-resource reservoirs. Wind-induced waves in a reservoir impose mortality on aquatic-stage mosquitoes. Mosquitoes' host-seeking activity is also influenced by wind through the dispersion of CO2. However, no malaria transmission model exists to date that simulates those impacts of wind mechanistically. A modelling framework for simulating the three important effects of wind on mosquito behaviour is developed: attraction of adult mosquitoes through dispersion of CO2 (CO2 attraction), advection of adult mosquitoes (advection), and aquatic-stage mortality due to wind-induced surface waves (waves). The framework was incorporated in a mechanistic malaria transmission simulator, HYDREMATS. The performance of the extended simulator was compared with the observed population dynamics of the Anopheles mosquitoes at a village adjacent to the Koka Reservoir in Ethiopia. The observed population dynamics of the Anopheles mosquitoes were reproduced with reasonable accuracy by HYDREMATS when the representation of the wind effects was included. HYDREMATS without the wind model failed to do so. Offshore wind explained an increase in the Anopheles population that could not be expected from other environmental conditions alone. Around large water bodies such as reservoirs, the role of wind in the dynamics of the Anopheles population, and hence in malaria transmission, can be significant. Modelling the impacts of wind on the behaviours of Anopheles mosquitoes aids in reproducing the seasonality of malaria transmission and in estimating the risk of malaria around reservoirs.

  8. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    NASA Astrophysics Data System (ADS)

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.

    2012-09-01

    Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
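
    The final integration step can be sketched as follows: for each realization of the flux and concentration fields on the control-plane grid, mass discharge is the cell-wise sum of flux times concentration times cell area, and the ensemble of sums gives the discharge distribution. The lognormal sampling below is a placeholder for the geostatistically conditioned realizations of the paper, so the numbers are purely illustrative.

```python
import math
import random

random.seed(0)
N_CELLS, CELL_AREA_M2, N_REALIZATIONS = 200, 0.25, 1000

discharges = []
for _ in range(N_REALIZATIONS):
    md = 0.0
    for _ in range(N_CELLS):
        q = random.lognormvariate(math.log(0.05), 0.8)   # Darcy flux [m/d], placeholder
        c = random.lognormvariate(math.log(10.0), 1.2)   # concentration [g/m^3], placeholder
        md += q * c * CELL_AREA_M2                       # cell contribution [g/d]
    discharges.append(md)

discharges.sort()
print("median mass discharge [g/d]:", round(discharges[N_REALIZATIONS // 2], 1))
print("90th percentile [g/d]:", round(discharges[int(0.9 * N_REALIZATIONS)], 1))
```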

  9. Effect of Variable Manning Coefficients on Tsunami Inundation

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Rees, D.

    2017-12-01

    Numerical simulations are commonly used to help estimate tsunami hazard, improve evacuation plans, issue or cancel tsunami warnings, and inform forecasting and hazard assessments, and they have therefore become an integral part of hazard mitigation in the tsunami community. Many numerical codes exist for simulating tsunamis, most of which have undergone extensive benchmarking and testing. Tsunami hazard or risk assessments employ these codes following a deterministic or probabilistic approach. Depending on the scope, these studies may or may not consider uncertainty in the numerical simulations, the effects of tides, or variable friction, or estimate financial losses, none of which is necessarily trivial. Distributed Manning coefficients, the roughness coefficients used in hydraulic modeling, are commonly used in simulating both riverine and pluvial flood events; however, their use in tsunami hazard assessments is primarily confined to limited-scope studies and is, for the most part, not standard practice. For this work, we investigate variations in Manning coefficients and their effects on tsunami inundation extent, pattern and financial loss. To assign Manning coefficients we use land-use maps from the New Zealand Land Cover Database (LCDB) and more recent data from the Ministry of the Environment. More than 40 classes covering different types of land use are combined into major classes such as cropland, grassland and wetland, representing common types of land use in New Zealand, each of which is assigned a unique Manning coefficient. By utilizing different data sources for variable Manning coefficients, we examine the impact of data sources and classification methodology on the accuracy of model outputs.
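
    Assigning distributed Manning coefficients from a land-cover map reduces, in the simplest case, to a class-to-roughness lookup, as sketched below. The class names echo broad LCDB-style groupings and the roughness values are typical literature ranges, not the values adopted in this study.

```python
# Broad land-cover groupings mapped to typical Manning's n values
# (illustrative literature ranges, not the values used in the study).
MANNING_N = {
    "built-up":  0.10,
    "cropland":  0.035,
    "grassland": 0.030,
    "scrub":     0.060,
    "forest":    0.12,
    "wetland":   0.05,
    "water":     0.025,
}

def manning_grid(landcover_grid):
    """Replace each land-cover label in a raster-like grid with its roughness."""
    return [[MANNING_N[cell] for cell in row] for row in landcover_grid]

grid = [["grassland", "cropland", "built-up"],
        ["forest", "wetland", "water"]]
for row in manning_grid(grid):
    print(row)
```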

  10. Global risk factor rankings: the importance of age-based health loss inequities caused by alcohol and other risk factors.

    PubMed

    Shield, Kevin D; Rehm, Jürgen

    2015-06-09

    Achieving health equity is a priority of the World Health Organization; however, there is a scant amount of literature on this topic. As the underlying influences that determine health loss caused by risk factors are age-dependent, the aim of this paper is to examine how the risk factor rankings for health loss differ by age. Rankings were based on data obtained from the 2010 Global Burden of Disease study. Health loss (as measured by Disability Adjusted Life Years lost) by risk factor was estimated using Population-Attributable Fractions, years of life lost due to premature mortality, and years lived with disability, which were calculated for 187 countries, 20 age groups and both sexes. Uncertainties of the risk factor rankings were estimated using 1,000 simulations taken from posterior distributions. The top risk factors by age were: household air pollution for neonates 0-6 days of age [95% uncertainty interval (UI): 1 to 1]; suboptimal breast feeding for children 7-27 days of age (95% UI: 1-1); childhood underweight for children 28 days to less than 1 year of age and 1-4 years of age (95% UI: 1-2 and 1-1, respectively); iron deficiency for children and youth 5-14 years of age (95% UI: 1-1); alcohol use for people 15-49 years of age (95% UI: 1-2); and dietary risks for people 50 years of age and older (95% UI: 1-1). Rankings of risk factors varied by sex among the older age groups. Alcohol and smoking were the most important risk factors among men 15 years of age and older, and high body mass and intimate partner violence were some of the most important risk factors among women 15 years of age and older. Our analyses confirm that the relative importance of risk factors is age-dependent. Therefore, preventing harms caused by various modifiable risk factors using interventions that target people of different ages should be a priority, especially since easily implemented and cost-effective public health interventions exist.

  11. A Comparative Study on Real Lab and Simulation Lab in Communication Engineering from Students' Perspectives

    ERIC Educational Resources Information Center

    Balakrishnan, B.; Woods, P. C.

    2013-01-01

    Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation lab, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised…

  12. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  13. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  14. Flood Impacts on People: from Hazard to Risk Maps

    NASA Astrophysics Data System (ADS)

    Arrighi, C.; Castelli, F.

    2017-12-01

    Mitigating the adverse consequences of floods on people is crucial for civil protection and public authorities. According to several studies, in developed countries the majority of flood-related fatalities occur due to inappropriate high-risk behaviours such as driving and walking in floodwaters. In this work, the loss of stability of both vehicles and pedestrians in floodwaters is analysed. Flood hazard is evaluated based on (i) a 2D inundation model of an urban area, (ii) 3D hydrodynamic simulations of water flows around vehicles and the human body and (iii) a dimensional analysis of experimental activity. The exposure and vulnerability of vehicles and population are assessed by exploiting several sources of open GIS data in order to produce risk maps for a test case study. The results show that a significant hazard to vehicles and pedestrians exists in the study area. The hazard to vehicles is particularly high: vehicles are likely to be swept away by the flood flow, possibly aggravating damage to structures and infrastructure and locally altering the flood propagation. The exposure and vulnerability analysis identifies some structures, such as schools and public facilities, which may attract many people. Moreover, some shopping facilities in the area, which attract both vehicular and pedestrian circulation, are located in the highest flood hazard zone. The application of the method demonstrates that, at the municipal level, such risk maps can support civil defence strategies and education for active citizenship, thus contributing to reducing flood impacts on the population.

  15. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    PubMed

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and different water demands, brings new challenges to water transfer projects. Uncertainties exist for transferring water and local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the shortage degree under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation and the chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferring water and local surface water and to sample from the multivariate probability distribution; these samples are used as inputs for the optimization model. The approach reveals the distribution of water shortage and is able to emphasize the importance of improving and updating transferring-water and local surface water management, and to examine their combined influence on water shortage risk assessment. The possible available water and shortages, together with the corresponding allocation measures under different water-availability levels and violation probabilities, can be calculated by applying the UWSRAM. The UWSRAM is valuable for understanding the overall multi-source water supply and the degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers and achieving sustainable development.
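
    A minimal sketch of the joint sampling idea is given below: correlated standard normals are mapped through lognormal marginals (a Gaussian copula with lognormal margins) to produce paired draws of transferred and local water, from which a shortage probability is estimated. The dependence parameter, marginals and demand are illustrative assumptions, not the Lunan calibration.

```python
import math
import random

random.seed(42)
RHO = -0.3          # assumed dependence between transferred and local water
DEMAND = 120.0      # hypothetical annual demand, 10^6 m^3

def sample_supplies():
    """Gaussian-copula draw of (transferred, local) water with lognormal margins."""
    z1 = random.gauss(0.0, 1.0)
    z2 = RHO * z1 + math.sqrt(1.0 - RHO**2) * random.gauss(0.0, 1.0)
    transferred = math.exp(math.log(60.0) + 0.25 * z1)   # median 60, illustrative
    local       = math.exp(math.log(70.0) + 0.45 * z2)   # median 70, illustrative
    return transferred, local

shortages = []
for _ in range(20_000):
    t, l = sample_supplies()
    shortages.append(max(0.0, DEMAND - (t + l)))

print(f"probability of any shortage: {sum(s > 0 for s in shortages) / len(shortages):.3f}")
```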

  16. The Distributed Diagonal Force Decomposition Method for Parallelizing Molecular Dynamics Simulations

    PubMed Central

    Boršnik, Urban; Miller, Benjamin T.; Brooks, Bernard R.; Janežič, Dušanka

    2011-01-01

    Parallelization is an effective way to reduce the computational time needed for molecular dynamics simulations. We describe a new parallelization method, the distributed-diagonal force decomposition method, with which we extend and improve the existing force decomposition methods. Our new method requires less data communication during molecular dynamics simulations than replicated data and current force decomposition methods, increasing the parallel efficiency. It also dynamically load-balances the processors' computational load throughout the simulation. The method is readily implemented in existing molecular dynamics codes and it has been incorporated into the CHARMM program, allowing its immediate use in conjunction with the many molecular dynamics simulation techniques that are already present in the program. We also present the design of the Force Decomposition Machine, a cluster of personal computers and networks that is tailored to running molecular dynamics simulations using the distributed diagonal force decomposition method. The design is expandable and provides various degrees of fault resilience. This approach is easily adaptable to computers with Graphics Processing Units because it is independent of the processor type being used. PMID:21793007

  17. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, thus programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Human Factors Guidelines for UAS in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Shively, R. Jay

    2013-01-01

    The ground control stations (GCS) of some UAS have been characterized by less-than-adequate human-system interfaces. In some cases this may reflect a failure to apply an existing regulation or human factors standard. In other cases, the problem may indicate a lack of suitable guidance material. NASA is leading a community effort to develop recommendations for human factors guidelines for GCS to support routine beyond-line-of-sight UAS operations in the national airspace system (NAS). In contrast to regulations, guidelines are not mandatory requirements. However, by encapsulating solutions to identified problems or areas of risk, guidelines can provide assistance to system developers, users and regulatory agencies. To be effective, guidelines must be relevant to a wide range of systems, must not be overly prescriptive, and must not impose premature standardization on evolving technologies. By assuming that a pilot will be responsible for each UAS operating in the NAS, and that the aircraft will be required to operate in a manner comparable to conventionally piloted aircraft, it is possible to identify a generic set of pilot tasks and the information, control and communication requirements needed to support these tasks. Areas where guidelines will be useful can then be identified, utilizing information from simulations, operational experience and the human factors literature. In developing guidelines, we recognize that existing regulatory and guidance material will, at times, provide adequate coverage of an area. In other cases suitable guidelines may be found in existing military or industry human factors standards. In cases where appropriate existing standards cannot be identified, original guidelines will be proposed.

  19. Two-Dimensional Hydrodynamic Modeling and Analysis of the Proposed Channel Modifications and Grade Control Structure on the Blue River near Byram's Ford Industrial Park, Kansas City, Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.

    2007-01-01

    The Blue River Channel Modification project being implemented by the U.S. Army Corps of Engineers (USACE) is intended to provide flood protection within the Blue River valley in the Kansas City, Mo., metropolitan area. In the latest phase of the project, concerns have arisen about preserving the Civil War historic area of Byram's Ford and the associated Big Blue Battlefield while providing flood protection for the Byram's Ford Industrial Park. In 1996, the USACE used a physical model built at the Waterways Experiment Station (WES) in Vicksburg, Miss., to examine the feasibility of a proposed grade control structure (GCS) that would be placed downstream from the historic river crossing of Byram's Ford to provide a subtle transition of flow from the natural channel to the modified channel. The U.S. Geological Survey (USGS), in cooperation with the USACE, modified an existing two-dimensional finite element surface-water model of the river between 63d Street and Blue Parkway (the 'original model'), used the modified model to simulate the existing (as of 2006) unimproved channel and the proposed channel modifications and GCS, and analyzed the results from the simulations and those from the WES physical model. Modifications were made to the original model to create a model that represents existing (2006) conditions between the north end of Swope Park immediately upstream from 63d Street and the upstream limit of channel improvement on the Blue River (the 'model of existing conditions'). The model of existing conditions was calibrated to two measured floods. The model of existing conditions also was modified to create a model that represents conditions along the same reach of the Blue River with proposed channel modifications and the proposed GCS (the 'model of proposed conditions'). The models of existing conditions and proposed conditions were used to simulate the 30-, 50-, and 100-year recurrence floods. The discharge from the calibration flood of May 15, 1990, also was simulated in the models of existing and proposed conditions to provide results for that flood with the current downstream channel modifications and with the proposed channel modifications and GCS. Results from the model of existing conditions show that the downstream channel modifications as they exist (2006) may already be affecting flows in the unmodified upstream channel. The 30-year flood does not inundate most of the Byram's Ford Industrial Park near the upstream end of the study area. Analysis of the 1990 flood (with the historical 1990 channel conditions) and the 1990 flood simulated with the existing (2006) conditions indicates a substantial increase in velocity throughout the study area and a substantial decrease in inundated area from 1990 to 2006. Results from the model of proposed conditions show that the proposed channel modifications will contain the 30-year flood and that the spoil berm designed to provide additional flood protection for the Byram's Ford Industrial Park for the 30-year flood prevents inundation of the industrial park. In the vicinity of Byram's Ford for the 30-year flood, the maximum depth increased from 39.7 feet (ft) in the model of existing conditions to 43.5 ft in the model of proposed conditions, with a resulting decrease in velocity from 6.61 to 4.55 feet per second (ft/s). For the 50-year flood, the maximum depth increased from 42.3 to 45.8 ft, with a decrease in velocity from 6.12 to 4.16 ft/s from existing to proposed conditions. 
For the 100-year flood, the maximum depth increased from 44.0 to 46.6 ft, with a decrease in velocity from 5.64 to 4.12 ft/s from existing to proposed conditions. When the May 15, 1990, discharge is simulated in the model of existing conditions (with the existing (2006) modified channel downstream of the study area), the maximum depth increases from 38.4 to 42.0 ft, with a decrease in velocity from 6.54 to 4.84 ft/s from existing (2006) to proposed conditions. Analysis of the results fro

  20. Constructing a framework for risk analyses of climate change effects on the water budget of differently sloped vineyards with a numeric simulation using the Monte Carlo method coupled to a water balance model

    PubMed Central

    Hofmann, Marco; Lux, Robert; Schultz, Hans R.

    2014-01-01

    Grapes for wine production are a highly climate-sensitive crop, and the vineyard water budget is a decisive factor in quality formation. In order to conduct risk assessments for climate change effects in viticulture, models are needed that can be applied to complete growing regions. We first modified an existing simplified geometric vineyard model of radiation interception and resulting water use to incorporate numerical Monte Carlo simulations and the physical aspects of radiation interactions between canopy and vineyard slope and azimuth. We then used four regional climate models to assess possible effects on the water budget of selected vineyard sites up to 2100. The model was developed to describe the partitioning of short-wave radiation between the grapevine canopy and the soil surface or green cover, which is necessary to calculate vineyard evapotranspiration. Soil water storage was allocated to two sub-reservoirs. The model was adapted to steep-slope vineyards based on coordinate transformation and validated against measurements of grapevine sap flow and soil water content determined down to 1.6 m depth at three different sites over 2 years. The results showed good agreement of modeled and observed soil water dynamics of vineyards with large variations in site-specific soil water holding capacity (SWC) and viticultural management. Simulated sap flow was in overall good agreement with measured sap flow, but site-specific responses of sap flow to potential evapotranspiration were observed. The analyses of climate change impacts on the vineyard water budget demonstrated the importance of site-specific assessment due to natural variations in SWC. The improved model was capable of describing seasonal and site-specific dynamics in soil water content and could be used in an amended version to estimate changes in the water budget of entire grape growing areas due to evolving climatic changes. PMID:25540646
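    As an illustration of the kind of calculation the abstract describes, the following sketch combines a Monte Carlo estimate of radiation interception by a vine row with a simple two-bucket soil water balance. It is a minimal sketch under assumed geometry and synthetic forcing; the function names, parameter values and the two-bucket scheme are illustrative and are not taken from the published model.

```python
import numpy as np

rng = np.random.default_rng(42)

def canopy_interception_fraction(row_width, row_spacing, canopy_height,
                                 sun_elevation_deg, n_rays=100_000):
    """Monte Carlo estimate of the fraction of direct short-wave radiation
    intercepted by a hedgerow canopy (illustrative 2-D geometry only)."""
    sun_elev = np.radians(sun_elevation_deg)
    shadow = canopy_height / np.tan(sun_elev)      # shadow band cast by the canopy wall
    x = rng.uniform(0.0, row_spacing, n_rays)      # random ground positions over one row spacing
    # A ray is intercepted if it falls on the row or inside its shadow band;
    # the remainder reaches the soil surface / green cover.
    intercepted = x < np.minimum(row_width + shadow, row_spacing)
    return intercepted.mean()

def two_bucket_water_balance(rain, et_canopy, et_soil, swc_top=60.0, swc_sub=120.0):
    """Daily water balance with two soil sub-reservoirs (top and subsoil, in mm)."""
    top, sub = swc_top, swc_sub                    # start both buckets at capacity
    stored = []
    for p, etc, ets in zip(rain, et_canopy, et_soil):
        top = top + p - ets                        # soil evaporation drawn from the top bucket
        overflow = max(top - swc_top, 0.0)         # excess drains into the subsoil bucket
        top = float(np.clip(top, 0.0, swc_top))
        sub = float(np.clip(sub + overflow - etc, 0.0, swc_sub))  # transpiration from subsoil
        stored.append(top + sub)
    return np.array(stored)

if __name__ == "__main__":
    f = canopy_interception_fraction(row_width=0.4, row_spacing=2.0,
                                     canopy_height=1.8, sun_elevation_deg=35.0)
    print(f"canopy share of direct radiation: {f:.2f}")
    days = 30
    rain = rng.gamma(0.5, 4.0, days)               # synthetic daily rainfall (mm)
    et_pot = np.full(days, 4.0)                    # synthetic potential ET (mm/day)
    storage = two_bucket_water_balance(rain, et_canopy=f * et_pot,
                                       et_soil=(1 - f) * 0.4 * et_pot)
    print(f"soil water after {days} days: {storage[-1]:.1f} mm")
```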

  1. Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2012-01-09

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market.

  2. Feasibility and concept study to convert the NASA/AMES vertical motion simulator to a helicopter simulator

    NASA Technical Reports Server (NTRS)

    Belsterling, C. A.; Chou, R. C.; Davies, E. G.; Tsui, K. C.

    1978-01-01

    The conceptual design for converting the vertical motion simulator (VMS) to a multi-purpose aircraft and helicopter simulator is presented. A unique, high performance four degrees of freedom (DOF) motion system was developed to permanently replace the present six DOF synergistic system. The new four DOF system has the following outstanding features: (1) will integrate with the two large VMS translational modes and their associated subsystems; (2) can be converted from helicopter to fixed-wing aircraft simulation through software changes only; (3) interfaces with an advanced cab/visual display system of large dimensions; (4) makes maximum use of proven techniques, convenient materials and off-the-shelf components; (5) will operate within the existing building envelope without modifications; (6) can be built within the specified weight limit and avoid compromising VMS performance; (7) provides maximum performance with a minimum of power consumption; (8) simple design minimizes coupling between motions and maximizes reliability; and (9) can be built within existing budgetary figures.

  3. Duality quantum algorithm efficiently simulates open quantum systems

    PubMed Central

    Wei, Shi-Jie; Ruan, Dong; Long, Gui-Lu

    2016-01-01

    Because of inevitable coupling with the environment, nearly all practical quantum systems are open systems, where the evolution is not necessarily unitary. In this paper, we propose a duality quantum algorithm for simulating Hamiltonian evolution of an open quantum system. In contrast to unitary evolution in a usual quantum computer, the evolution operator in a duality quantum computer is a linear combination of unitary operators. In this duality quantum algorithm, the time evolution of the open quantum system is realized by using Kraus operators, which are naturally implemented in a duality quantum computer. This duality quantum algorithm has two distinct advantages compared to existing quantum simulation algorithms with unitary evolution operations. Firstly, the query complexity of the algorithm is O(d³), in contrast to O(d⁴) in the existing unitary simulation algorithm, where d is the dimension of the open quantum system. Secondly, by using a truncated Taylor series of the evolution operators, this duality quantum algorithm provides an exponential improvement in precision compared with the previous unitary simulation algorithm. PMID:27464855
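    The Kraus-operator picture mentioned in the abstract can be illustrated classically with a small density-matrix calculation. The sketch below applies a standard amplitude-damping channel repeatedly and checks that the trace is preserved; it illustrates open-system (non-unitary) evolution only and does not implement the duality quantum algorithm itself.

```python
import numpy as np

def apply_channel(rho, kraus_ops):
    """Evolve a density matrix under a CPTP map: rho -> sum_k K_k rho K_k^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

# Amplitude-damping channel with decay probability gamma (standard Kraus form).
gamma = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

# Start in the excited state |1><1| and apply the channel repeatedly.
rho = np.array([[0, 0], [0, 1]], dtype=complex)
for step in range(5):
    rho = apply_channel(rho, [K0, K1])
    print(f"step {step + 1}: excited-state population = {rho[1, 1].real:.4f}, "
          f"trace = {np.trace(rho).real:.4f}")
```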

  4. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and LULC change simulation models to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four scenarios are developed, exhibiting contrasting trajectories of socio-economic development. The prospective scenarios are based on national and international socio-economic contexts drawn from existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify areas accessible for forestry in scenarios projecting logging activities. The method results in LULC maps providing insights into a range of alternative futures under a scope of socio-economic and environmental conditions. A landslide assessment model, the ALICE model, is then used as a final tool to analyze the potential impacts of simulated LUCC on landslide risks and the consequences in terms of vulnerability, e.g. changes in disaster risk allocation or characterization, or degree of perturbation. This assessment intends to provide insights into the potential future development of the valley, to help identify areas at stake, and to guide decision makers in risk management. Preliminary results show strong differences among future land use and land cover maps, which have a significant influence on landslide hazards.

  5. Quality assurance study of caries risk assessment performance by clinical faculty members in a school of dentistry.

    PubMed

    Rechmann, Peter; Featherstone, John D B

    2014-09-01

    The goal of this quality assurance study was to explore the decision making of clinical faculty members at the University of California, San Francisco School of Dentistry predoctoral dental clinic in terms of caries risk level assignment using the caries risk assessment (CRA) as part of the Caries Management by Risk Assessment (CAMBRA) concept. This research was done in part to determine if additional training and calibration were needed for these faculty members. The study tested the reliability and reproducibility of the caries risk levels assigned by different clinical teachers who completed CRA forms for simulated patients. In the first step, five clinical teachers assigned caries risk levels for thirteen simulated patients. Six months later, the same five plus an additional nine faculty members assigned caries risk levels to the same thirteen simulated cases and nine additional cases. While the intra-examiner reliability, measured by weighted kappa strength of agreement, was very high, the inter-examiner agreements with a gold standard were on average only moderate. In total, 20 percent of the presented high caries risk cases were underestimated and assigned risk levels that were too low, even when obvious caries disease indicators were present. This study suggests that more consistent training and calibration of clinical faculty members as well as students are needed.
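    Inter- and intra-examiner agreement of the kind reported here is commonly summarized with a weighted kappa. A minimal sketch, assuming hypothetical ratings on a four-level risk scale (the data below are made up and are not the study's):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical risk levels (0 = low, 1 = moderate, 2 = high, 3 = extreme) assigned to
# the same simulated patients by a "gold standard" rater and one faculty member.
gold    = [3, 2, 0, 3, 1, 2, 3, 0, 1, 2, 3, 1, 0]
faculty = [2, 2, 0, 3, 1, 1, 3, 0, 2, 2, 2, 1, 0]

# Linearly weighted kappa penalizes disagreements by how far apart the levels are.
kappa = cohen_kappa_score(gold, faculty, weights="linear")
print(f"weighted kappa vs. gold standard: {kappa:.2f}")
```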

  6. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    NASA Astrophysics Data System (ADS)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plane) and parameters and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated to the observed flood extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and the monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
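    The combination of a bathtub-style depth calculation with Monte Carlo perturbation of the DEM can be sketched in a few lines. FLOODMAP itself is not reproduced here; the grid, water level and error magnitude below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def flood_depth(dem, water_level):
    """Flood depth where the water level plane lies above the terrain."""
    return np.clip(water_level - dem, 0.0, None)

def flooding_probability(dem, water_level, dem_sigma=0.5, n_runs=500):
    """Monte Carlo flooding probability per cell under DEM elevation uncertainty."""
    flooded = np.zeros_like(dem, dtype=float)
    for _ in range(n_runs):
        noisy_dem = dem + rng.normal(0.0, dem_sigma, dem.shape)
        flooded += flood_depth(noisy_dem, water_level) > 0
    return flooded / n_runs

if __name__ == "__main__":
    # Synthetic DEM: a gentle valley cross-section (elevations in metres).
    x = np.linspace(-100, 100, 201)
    dem = np.tile(0.02 * x**2 / 10 + 50.0, (50, 1))
    prob = flooding_probability(dem, water_level=52.0)
    print(f"cells flooded with probability > 0.5: {(prob > 0.5).sum()}")
```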

  7. The International year of soils: thoughts on future directions for experiments in soil erosion research

    NASA Astrophysics Data System (ADS)

    Kuhn, Nikolaus J.

    2015-04-01

    The 2015 UN International Year of Soils (IYS), implemented by the FAO, aims to increase awareness and understanding of the importance of soil for food security and essential ecosystem functions. The IYS has six specific objectives, ranging from raising awareness among civil society and decision makers about the profound importance of soils to the development of policies supporting the sustainable use of the non-renewable soil resource. For scientists and academic teachers using experiments to study soil erosion processes, two objectives appear of particular relevance. The first is the need for rapid capacity enhancement for soil information collection and monitoring at all levels (global, regional and national). While at first glance this objective appears to relate mostly to traditional mapping, sampling and monitoring, the threat of large-scale loss of soils, at least with regard to their ecosystem services, illustrates the need for approaches to studying soils that avoid such irreversible destruction. Relying on often limited data and their extrapolation does not meet this need for soil information, because rapid change in the drivers of change itself carries the risk of unprecedented soil reactions not covered by existing data sets. Experiments, on the other hand, offer the possibility to simulate and analyze future soil change in great detail. Furthermore, carefully designed experiments may also limit the actual effort involved in collecting the specific required information, e.g. by applying tests designed to study soil system behavior under controlled conditions, compared to field monitoring. For rainfall simulation, experiments should therefore involve the detailed study of erosion processes and include detailed recording and reporting of soil and rainfall properties. The development of a set of standardised rainfall simulations would widen the use of data collected by such experiments. A second major area for rainfall simulation lies in the education of the public about the crucial role soil plays in food security, climate change adaptation and mitigation, essential ecosystem services, poverty alleviation and sustainable development. While erosion monitoring and modeling, as well as erosion risk assessment maps, provide a solid foundation for decision makers, the attention of the public for "dirt" is often much easier to achieve by setting up a rainfall simulation experiment that illustrates the connection between a process observed in daily life, such as rainfall and runoff, and its causes and consequences. Exploring the potential of rainfall simulation experiments as an outreach tool should therefore be a task for the soil science, geomorphology and hydrology communities during the IYS 2015 and beyond.

  8. The development of a simulation model of primary prevention strategies for coronary heart disease.

    PubMed

    Babad, Hannah; Sanderson, Colin; Naidoo, Bhash; White, Ian; Wang, Duolao

    2002-11-01

    This paper describes the present state of development of a discrete-event micro-simulation model for coronary heart disease prevention. The model is intended to support health policy makers in assessing the impacts on health care resources of different primary prevention strategies. For each person, a set of times to disease events, conditional on the individual's risk factor profile, is sampled from a set of probability distributions that are derived from a new analysis of the Framingham cohort study on coronary heart disease. Methods used to model changes in behavioural and physiological risk factors are discussed and a description of the simulation logic is given. The model incorporates POST (Patient Oriented Simulation Technique) simulation routines.
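    The core sampling step of such a micro-simulation, drawing an individual time to a disease event conditional on a risk factor profile, might look like the following sketch. The risk score coefficients, the Weibull form and the cohort are entirely illustrative assumptions and are not the Framingham-derived distributions used in the model.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_time_to_event(age, systolic_bp, smoker, horizon_years=30):
    """Sample a time to first coronary event from a Weibull distribution whose
    scale depends on a toy risk score (coefficients are illustrative only)."""
    risk_score = 0.04 * (age - 50) + 0.02 * (systolic_bp - 120) + 0.6 * smoker
    scale = 60.0 * np.exp(-risk_score)         # higher risk -> shorter expected time
    t = scale * rng.weibull(1.5)
    return t if t <= horizon_years else None   # None: no event within the horizon

# Micro-simulation over a synthetic cohort: count events under the baseline profile.
cohort = [(rng.integers(40, 75), rng.normal(135, 15), rng.random() < 0.25)
          for _ in range(10_000)]
events = [sample_time_to_event(a, bp, s) for a, bp, s in cohort]
n_events = sum(t is not None for t in events)
print(f"events within 30 years: {n_events} / {len(cohort)}")
```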

  9. Scattering of sound by atmospheric turbulence predictions in a refractive shadow zone

    NASA Technical Reports Server (NTRS)

    Mcbride, Walton E.; Bass, Henry E.; Raspet, Richard; Gilbert, Kenneth E.

    1990-01-01

    According to ray theory, regions exist in an upward refracting atmosphere where no sound should be present. Experiments show, however, that appreciable sound levels penetrate these so-called shadow zones. Two mechanisms contribute to sound in the shadow zone: diffraction and turbulent scattering of sound. Diffractive effects can be pronounced at lower frequencies but are small at high frequencies. In the short wavelength limit, then, scattering due to turbulence should be the predominant mechanism involved in producing the sound levels measured in shadow zones. No existing analytical method includes turbulence effects in the prediction of sound pressure levels in upward refractive shadow zones. In order to obtain quantitative average sound pressure level predictions, a numerical simulation of the effect of atmospheric turbulence on sound propagation is performed. The simulation is based on scattering from randomly distributed scattering centers ('turbules'). Sound pressure levels are computed for many realizations of a turbulent atmosphere. Predictions from the numerical simulation are compared with existing theories and experimental data.
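    A highly simplified version of such a simulation sums single-scattered contributions from randomly placed scattering centres and averages the intensity over many realizations. The geometry, scattering strengths and turbule counts below are assumptions for illustration; this sketch is not the authors' numerical method.

```python
import numpy as np

rng = np.random.default_rng(3)

def shadow_zone_level(n_turbules=200, n_realizations=500, freq=1000.0, c=340.0):
    """Average scattered sound level at a shadow-zone receiver, computed by summing
    single-scattered fields from randomly placed 'turbules' over many realizations."""
    k = 2 * np.pi * freq / c
    src = np.array([0.0, 2.0])            # source 2 m above ground
    rcv = np.array([200.0, 2.0])          # receiver 200 m downrange, in the shadow
    intensities = []
    for _ in range(n_realizations):
        # Turbules scattered through a layer between the source and receiver.
        pos = np.column_stack([rng.uniform(0, 200, n_turbules),
                               rng.uniform(5, 60, n_turbules)])
        r1 = np.linalg.norm(pos - src, axis=1)
        r2 = np.linalg.norm(pos - rcv, axis=1)
        amp = rng.normal(0.0, 1e-3, n_turbules)    # random scattering strengths
        field = np.sum(amp * np.exp(1j * k * (r1 + r2)) / (r1 * r2))
        intensities.append(abs(field) ** 2)
    mean_intensity = np.mean(intensities)
    free_field = (1.0 / np.linalg.norm(rcv - src)) ** 2
    return 10 * np.log10(mean_intensity / free_field)

print(f"scattered level relative to free field: {shadow_zone_level():.1f} dB")
```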

  10. Simulation based planning of surgical interventions in pediatric cardiology

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  11. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
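    For orientation, the sketch below assembles the skeleton of a SED-ML-like document (a model, a time-course simulation and a task) with Python's standard XML tooling. Element and attribute names follow the specification only loosely, the namespace is assumed, and the output is not guaranteed to validate against the SED-ML schema.

```python
import xml.etree.ElementTree as ET

# Skeleton of the main SED-ML building blocks; illustrative only.
NS = "http://sed-ml.org/sed-ml/level1/version2"
sed = ET.Element("sedML", {"xmlns": NS, "level": "1", "version": "2"})

models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", {"id": "model1",
                                "language": "urn:sedml:language:sbml",
                                "source": "oscillator.xml"})

sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse",
              {"id": "sim1", "initialTime": "0", "outputStartTime": "0",
               "outputEndTime": "100", "numberOfPoints": "1000"})

tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", {"id": "task1",
                              "modelReference": "model1",
                              "simulationReference": "sim1"})

print(ET.tostring(sed, encoding="unicode"))
```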

  13. Novel harmonic regularization approach for variable selection in Cox's proportional hazards model.

    PubMed

    Chu, Ge-Jin; Liang, Yong; Wang, Jia-Xuan

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularizations, to select key risk factors in the Cox proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.
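    The flavour of penalized variable selection in the Cox model can be sketched with a smooth surrogate for an Lq penalty and plain gradient descent on the partial likelihood. This is not the authors' harmonic regularization or direct path seeking algorithm; the surrogate, the optimizer and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def neg_log_partial_likelihood(beta, X, time, event):
    """Breslow-type negative log partial likelihood for the Cox model (no tie handling)."""
    eta = X @ beta
    order = np.argsort(-time)                      # decreasing time -> growing risk sets
    eta_o, ev_o = eta[order], event[order]
    log_risk = np.logaddexp.accumulate(eta_o)      # log sum_{j in risk set} exp(eta_j)
    return -np.sum(ev_o * (eta_o - log_risk))

def lq_surrogate(beta, q=0.7, eps=1e-2):
    """Smooth stand-in for the nonconvex L_q penalty, 1/2 < q < 1 (illustrative only)."""
    return np.sum((beta**2 + eps) ** (q / 2))

def fit(X, time, event, lam=1.0, q=0.7, lr=5e-3, n_iter=2000):
    """Plain gradient descent on the penalized objective, with numerical gradients for brevity."""
    beta = np.zeros(X.shape[1])

    def obj(b):
        return neg_log_partial_likelihood(b, X, time, event) + lam * lq_surrogate(b, q)

    def num_grad(b, h=1e-5):
        g = np.zeros_like(b)
        for j in range(len(b)):
            e = np.zeros_like(b)
            e[j] = h
            g[j] = (obj(b + e) - obj(b - e)) / (2 * h)
        return g

    for _ in range(n_iter):
        beta -= lr * num_grad(beta)
    return beta

# Synthetic "expression data": only the first two of ten covariates carry signal.
n, p = 200, 10
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:2] = [1.0, -1.0]
time = rng.exponential(1.0 / np.exp(X @ true_beta))
event = np.ones(n)                                  # no censoring, for simplicity
print(np.round(fit(X, time, event), 2))
```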

  14. Control and manipulation of antiferromagnetic skyrmions in racetrack

    NASA Astrophysics Data System (ADS)

    Xia, Haiyan; Jin, Chendong; Song, Chengkun; Wang, Jinshuai; Wang, Jianbo; Liu, Qingfang

    2017-12-01

    Controllable manipulation of magnetic skyrmions is essential for next-generation spintronic devices. Here, the duplication and merging of skyrmions, as well as logical AND and OR functions, are designed in antiferromagnetic (AFM) materials with cusp or smooth Y-junction structures. The operational times are on the order of tens of picoseconds, enabling ultrafast information processing. A key factor for successful operation is the relatively complex Y-junction structure, through which domain walls propagate in a controlled manner, without significant risks of pinning, vanishing or unwanted depinning of existing domain walls, or the nucleation of new domain walls. The motion of a multi-bit unit, namely an AFM skyrmion chain in a racetrack, is also investigated. These micromagnetic simulations may contribute to future AFM skyrmion-based spintronic devices, such as nanotrack memory, logic gates and other information-processing elements.

  15. Tin Whisker Electrical Short Circuit Characteristics Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Bayliss, Jon A.; Ludwig, Lawrence L.; Zapata, Maria C.

    2007-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance, which leads to a short circuit. From these data we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross-sectioned and studied using a focused ion beam (FIB).

  16. Coexistence and specialization of pathogen strains on contact networks.

    PubMed

    Eames, Ken T D; Keeling, Matt J

    2006-08-01

    The coexistence of different pathogen strains has implications for pathogen variability and disease control and has been explained in a number of different ways. We use contact networks, which represent interactions between individuals through which infection could be transmitted, to investigate strain coexistence. For sexually transmitted diseases the structure of contact networks has received detailed study and has been shown to be a vital determinant of the epidemiological dynamics. By using analytical pairwise models and stochastic simulations, we demonstrate that network structure also has a profound influence on the interaction between pathogen strains. In particular, when the population is serially monogamous, fully cross-reactive strains can coexist, with different strains dominating in network regions with different characteristics. Furthermore, we observe specialization of different strains in different risk groups within the network, suggesting the existence of diverging evolutionary pressures.
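    A minimal numerical setup for exploring such questions is a discrete-time two-strain SIS process on a static contact network. The paper's pairwise model and serially monogamous partnership dynamics are not reproduced here; the network type, rates and seeding below are assumptions for illustration.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(11)

def two_strain_sis(G, beta=(0.06, 0.05), recovery=0.02, steps=500):
    """Discrete-time two-strain SIS on a contact network with full cross-reactivity:
    a node carries at most one strain at a time (0 = susceptible, 1/2 = strains)."""
    state = dict.fromkeys(G, 0)
    seeds = rng.choice(list(G), size=20, replace=False)
    for i, node in enumerate(seeds):
        state[node] = 1 if i < 10 else 2        # seed ten nodes with each strain
    for _ in range(steps):
        new = dict(state)
        for node, s in state.items():
            if s == 0:                           # susceptible: each infected neighbour may transmit
                for nbr in G[node]:
                    k = state[nbr]
                    if k and rng.random() < beta[k - 1]:
                        new[node] = k
                        break
            elif rng.random() < recovery:        # infected: recover back to susceptible
                new[node] = 0
        state = new
    counts = np.bincount(list(state.values()), minlength=3)
    return counts[1], counts[2]

# Heterogeneous contact network: a few high-degree 'core' nodes, many low-degree ones.
G = nx.barabasi_albert_graph(1000, 2, seed=1)
i1, i2 = two_strain_sis(G)
print(f"strain 1: {i1} infected, strain 2: {i2} infected")
```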

  17. Cryptanalysis and security enhancement of optical cryptography based on computational ghost imaging

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Yao, Jianbin; Liu, Xuemei; Zhou, Xin; Li, Zhongyang

    2016-04-01

    Optical cryptography based on computational ghost imaging (CGI) has attracted much attention from researchers because it encrypts the plaintext into a random intensity vector rather than a complex-valued function. This promising feature of CGI-based cryptography reduces the amount of data to be transmitted and stored and therefore brings convenience in practice. However, we find that this cryptography is vulnerable to chosen-plaintext attack because of the linear relationship between the input and output of the encryption system, and three feasible strategies are proposed to break it in this paper. Even though a large number of plaintexts need to be chosen in these attack methods, this cryptography nevertheless carries security risks. To avoid these attacks, a security enhancement method utilizing an invertible matrix modulation is further discussed and its feasibility is verified by numerical simulations.
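    The vulnerability follows from linearity: if the ciphertext is a linear function of the plaintext, encrypting the standard basis vectors reveals the secret measurement matrix column by column. A minimal noiseless sketch of that idea (not the paper's three attack strategies, whose details are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 64                                    # plaintext length (pixels)
m = 256                                   # number of ghost-imaging measurements
A_secret = rng.normal(size=(m, n))        # stands in for the CGI measurement matrix (the key)

def encrypt(plaintext):
    """In CGI-based encryption the ciphertext is linear in the plaintext."""
    return A_secret @ plaintext

# Chosen-plaintext attack: encrypting the n basis vectors reveals the columns of A.
A_recovered = np.column_stack([encrypt(np.eye(n)[:, j]) for j in range(n)])

# With A known, any intercepted ciphertext can be inverted by least squares.
secret_image = rng.random(n)
ciphertext = encrypt(secret_image)
estimate, *_ = np.linalg.lstsq(A_recovered, ciphertext, rcond=None)
print("max reconstruction error:", np.max(np.abs(estimate - secret_image)))
```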

  18. A reaction-diffusion within-host HIV model with cell-to-cell transmission.

    PubMed

    Ren, Xinzhi; Tian, Yanni; Liu, Lili; Liu, Xianning

    2018-06-01

    In this paper, a reaction-diffusion within-host HIV model is proposed. It incorporates cell mobility, spatial heterogeneity and cell-to-cell transmission, which depends on the diffusion ability of the infected cells. In the case of a bounded domain, the basic reproduction number [Formula: see text] is established and shown to be a threshold: the virus-free steady state is globally asymptotically stable if [Formula: see text] and the virus is uniformly persistent if [Formula: see text]. The explicit formula for [Formula: see text] and the global asymptotic stability of the constant positive steady state are obtained for the case of homogeneous space. In the case of an unbounded domain and [Formula: see text], the existence of traveling wave solutions is proved and the minimum wave speed [Formula: see text] is obtained, provided the mobility of infected cells does not exceed that of the virus. These results are obtained by using the Schauder fixed point theorem, a limiting argument, LaSalle's invariance principle and the one-sided Laplace transform. Numerical simulations indicate that the asymptotic spreading speed may be larger than the minimum wave speed. However, our simulations show that it is possible either to underestimate or overestimate the spread risk [Formula: see text] if the spatially averaged system is used rather than one that is spatially explicit. The spread risk may also be overestimated if we ignore the mobility of the cells. It turns out that the minimum wave speed could be either underestimated or overestimated as long as the mobility of infected cells is ignored.
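    A caricature of such a reaction-diffusion within-host model can be integrated with explicit finite differences to visualize an advancing infection front. The equations below keep only target cells, infected cells and virus with cell-free and cell-to-cell infection terms; all parameter values and the front diagnostic are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Explicit finite differences for a caricature of the within-host model on [0, L]:
#   dT/dt = lam - dT*T - (b1*V + b2*I)*T            (target cells, no diffusion)
#   dI/dt = DI * I_xx + (b1*V + b2*I)*T - delta*I   (b2*I*T is the cell-to-cell term)
#   dV/dt = DV * V_xx + p*I - c*V
# with zero-flux boundaries; parameter values are illustrative only.
L, nx, t_end = 10.0, 201, 40.0
dx = L / (nx - 1)
DI, DV = 0.01, 0.1
lam, dT, b1, b2, delta, p, c = 10.0, 0.1, 1e-3, 5e-4, 0.2, 10.0, 3.0
dt = 0.2 * dx**2 / max(DI, DV)

T = np.full(nx, lam / dT)          # target cells at their infection-free equilibrium
I = np.zeros(nx); I[:10] = 1.0     # infection seeded at the left end of the domain
V = np.zeros(nx)

def laplacian(u):
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = 2 * (u[1] - u[0]) / dx**2           # zero-flux (Neumann) ends
    lap[-1] = 2 * (u[-2] - u[-1]) / dx**2
    return lap

for _ in range(int(t_end / dt)):
    infection = (b1 * V + b2 * I) * T
    T = T + dt * (lam - dT * T - infection)
    I = I + dt * (DI * laplacian(I) + infection - delta * I)
    V = V + dt * (DV * laplacian(V) + p * I - c * V)

front = np.argmax(I < 0.5 * I.max())             # first cell where I drops below half-max
print(f"infection front near x = {front * dx:.2f} at t = {t_end}")
```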

  19. Transformation and Stability of Dimethylmonothiolated Arsinic acid (DMMTAV) and Dimethyldithiolated Arsinic Acid (DMDTAV) in a Simulated Landfill Leachate

    NASA Astrophysics Data System (ADS)

    Yoon, H. O.; Lee, H.; Jeong, S.

    2016-12-01

    Arsenic (As) species are a major environmental pollution concern because of their toxicity. The occurrence of thioarsenates, thiolated analogs of inorganic As species, has recently been reported in groundwater, geothermal water, and landfill leachate. Dimethylmonothiolated arsinic acid (DMMTAV) and dimethyldithiolated arsinic acid (DMDTAV) have been receiving increasing attention. Because no commercial standards are available, preparing and verifying DMMTAV and DMDTAV standards prior to sample analysis is difficult, and the accurate assessment of these As species has remained unresolved. Moreover, there are few studies on the transformation and stability of thiolated As species under high-sulfur conditions, such as landfill leachate, that would allow their fate and toxicity in the environment to be assessed accurately. In this study, DMMTAV and DMDTAV were synthesized and identified using ESI-MS. Column tests were conducted using simulated landfill leachates (SLLs) to investigate their transformation under high-sulfur conditions. The transformation mechanisms of DMMTAV and DMDTAV were also investigated to quantify which As species exist and are transformed in landfill leachate, in order to determine their potential risk. The transformed As species were analyzed using high-performance liquid chromatography (HPLC) coupled with inductively coupled plasma-mass spectrometry (ICP-MS). This study provides the transformation mechanism and stability of DMMTAV and DMDTAV in landfill leachate to determine their potential environmental risk. Acknowledgement: This research was supported by the research project "Development of Response Technology for the Environment Disaster by Chemical Accident" (project No. C36707) of the Korea Basic Science Institute.

  20. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    NASA Technical Reports Server (NTRS)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross-section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.
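    One simple way to turn bridging trials into a probability-of-short-versus-voltage curve is a logistic regression, sketched below on made-up data; the paper's refined empirical model and actual measurements are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical bridging trials: applied voltage (V) and whether a hard short occurred
# once a whisker bridged two conductors (1 = short, 0 = contact resistance held).
voltage = np.array([0.5, 0.5, 1, 1, 2, 2, 3, 3, 5, 5, 8, 8, 12, 12, 20, 20]).reshape(-1, 1)
shorted = np.array([0,   0,   0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1,  1,  1,  1])

# Logistic model for P(short | voltage): one simple choice of empirical model.
model = LogisticRegression().fit(voltage, shorted)
for v in (1.0, 3.3, 5.0, 12.0, 28.0):
    p = model.predict_proba([[v]])[0, 1]
    print(f"P(short | {v:>4} V) = {p:.2f}")
```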
